Toms "Learning Houdini" Gallery


Thomas Helzle


Over the last couple of weeks I have started to learn Houdini, partly from tutorials by Entagma and others, partly by porting things I had built in Grasshopper for Rhino (which is single-threaded and rather slow for larger assets) over to Houdini.

This one started out from the Entagma tutorial on differential growth; I first extended it to avoid the text areas and today implemented a first version of a threading solution to make it look more like yarn:

LineGrowthThreads2.jpg
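For anyone curious about what's going on under the hood, here is a rough standalone Python/numpy sketch of a single differential-growth step with an avoidance mask bolted on. It is only an illustration of the idea, not my actual Houdini network (which follows the Entagma setup), and the `inside_mask` helper is a placeholder you would swap for your own text-area test:

```python
import numpy as np

def inside_mask(p):
    """Placeholder: return True if point p lies inside a forbidden region
    (e.g. the text areas). Replace with a real test (SDF, polygon test, ...)."""
    return False

def growth_step(pts, repel_radius=0.5, repel=0.02, attract=0.05, max_len=0.3):
    """One differential-growth step on a closed polyline given as an (N, 2) array."""
    n = len(pts)
    new = pts.copy()

    # Repulsion: push each point away from all points closer than repel_radius.
    for i in range(n):
        d = pts - pts[i]
        dist = np.linalg.norm(d, axis=1)
        near = (dist > 1e-9) & (dist < repel_radius)
        if near.any():
            push = -(d[near] / dist[near, None]).sum(axis=0)
            new[i] += repel * push / near.sum()

    # Attraction: pull each point toward the midpoint of its curve neighbours.
    mid = 0.5 * (np.roll(pts, 1, axis=0) + np.roll(pts, -1, axis=0))
    new += attract * (mid - pts)

    # Avoidance: points that moved into the mask keep their old position.
    for i in range(n):
        if inside_mask(new[i]):
            new[i] = pts[i]

    # Resample: split edges that grew longer than max_len (this is what the
    # Resample SOP takes care of in the Houdini version).
    out = []
    for i in range(n):
        a, b = new[i], new[(i + 1) % n]
        out.append(a)
        if np.linalg.norm(b - a) > max_len:
            out.append(0.5 * (a + b))
    return np.array(out)

# Start from a small circle and let it grow for a while.
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
curve = np.stack([np.cos(t), np.sin(t)], axis=1)
for _ in range(100):
    curve = growth_step(curve)
```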

This one is similar to an older test I once did in Softimage XSI and Arnold, but this time built in Houdini, exported as FBX and rendered with my favourite renderer, Thea Render:

Shells_Copper.jpg

And the last one for today is a study of how one can visualise noise in Houdini. It uses a Solver SOP and the Trail node. I found it hard to make it look really subtle in Mantra, so in the end I created poly wires and also exported it to Thea Render via FBX:

NoiseFlow.jpg
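The solver-plus-Trail idea really just boils down to "move every point a little along a noise field each frame and keep the positions it has already visited". Here is a hedged standalone Python sketch of that loop, with a cheap hand-rolled stand-in for Houdini's noise functions, so don't read too much into the exact field:

```python
import numpy as np

def noise_vec(p, seed=0.0):
    """Cheap stand-in for a smooth noise field: a sum of a few sine waves.
    In the Houdini version this would be a noise/curlnoise call in a wrangle."""
    x, y, z = p
    return np.array([
        np.sin(1.7 * y + 0.3 * z + seed),
        np.sin(2.1 * z + 0.7 * x + seed),
        np.sin(1.3 * x + 1.1 * y + seed),
    ])

# Scatter some start points, then advect them and record every position:
# roughly what the Solver SOP + Trail SOP combination does.
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, size=(500, 3))
trails = [pts.copy()]

step = 0.01
for frame in range(240):
    pts = pts + step * np.array([noise_vec(p) for p in pts])
    trails.append(pts.copy())

# trails[k][i] is the position of point i at frame k; connecting them per
# point gives the poly wires that get exported for rendering.
```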

Thanks to everybody on Odforce for the knowledge and inspiration!

Cheers,

Tom


On 10/12/2016 at 6:18 AM, kiko said:

Good stuff. Take a look at this guy https://vimeo.com/simonholmedal

Thanks, awesome stuff!

 

I did some more explorations of the pyro solver with lines; this time the lines start out horizontal, get duplicated each frame and are sucked up by the smoke:

SplineSmokeLines.jpg
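In case it helps anyone trying something similar: the core of it is just looking up the smoke's velocity field at each line point every frame, moving the point along it, and merging in a fresh copy of the source line. Below is a small numpy sketch of that lookup-and-advect step, with a toy velocity grid standing in for the pyro vel field - an illustration of the idea only, not my actual setup:

```python
import numpy as np

# A toy velocity grid standing in for the pyro sim's vel field:
# a constant updraft with a slight swirl around the Y axis.
res, size = 32, 2.0
grid = np.zeros((res, res, res, 3))
xs = np.linspace(-size / 2, size / 2, res)
X, Y, Z = np.meshgrid(xs, xs, xs, indexing="ij")
grid[..., 0] = -0.3 * Z
grid[..., 1] = 0.8                 # updraft
grid[..., 2] = 0.3 * X

def sample_vel(p):
    """Trilinear lookup of the velocity grid at world position p."""
    g = (np.asarray(p) + size / 2) / size * (res - 1)
    g = np.clip(g, 0, res - 1.001)
    i0 = g.astype(int)
    f = g - i0
    v = np.zeros(3)
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((1 - f[0] if dx == 0 else f[0]) *
                     (1 - f[1] if dy == 0 else f[1]) *
                     (1 - f[2] if dz == 0 else f[2]))
                v += w * grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return v

# One horizontal source line; every frame all existing lines are advected
# and a fresh copy is merged back in, which builds up the stacked look.
source = np.stack([np.linspace(-0.8, 0.8, 40),
                   np.full(40, -0.8),
                   np.zeros(40)], axis=1)
lines, dt = [], 1.0 / 24
for frame in range(60):
    lines = [pts + dt * np.array([sample_vel(p) for p in pts]) for pts in lines]
    lines.append(source.copy())
```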

 

This one was a test to see whether I could get Mantra's DOF to look smooth (same frame as above). With 64x64 pixel samples and four hours of render time it finally looked good... (original size 2560x2560 px)
I'm not sure what's wrong with the Ray Variance Antialiasing, but no matter what crazy numbers I enter, it never gets smooth. Either I'm missing something or it's just not doing its job...?
For my line of work, the docs' argument about "how noisy analog film is" has no relevance...

SplineSmokeLines2_DOF.jpg

 

And one from a different approach and simulation where I don't replicate the lines at all but just let them be sucked up, so instead of a single line I start from a "carpet" of lines. This also needed 64x64 samples to resolve the thin lines in the back, 6 hours of rendering - and it marked the limit of my 32 GB of RAM since it gets so dense on top... ;-)

SplineSmokeLinesCarpet.jpg

Cheers,

Tom


I already posted this one in the Differential Line Growth thread, but I'm putting it here as well:

CoralGrowth.jpg

It's basically based on the Entagma tutorial, but instead of a Resample node it uses a Remesh. To get it to break out of the 2D plane, I gave the initial ellipse a tiny bit of point jitter on the outside. From there it's lots of parameter tweaking...

Adding a PolyExtrude and a Subdivide after the solver gives it a bit of thickness. I exported it to Thea Render as .obj for rendering. It really needs some Nemo-like fish in there... :-)

Cheers,

Tom


I took the above one step further: in Rhino's Grasshopper I experimented a lot with "Shortest Walk" to get natural-looking structures. This is the result of shortest walks from a point in the centre to all outer points, rendered in Thea again:

CoralStructure.jpg
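"Shortest Walk" in Grasshopper - and, as far as I can tell, Houdini's Find Shortest Path SOP - boils down to Dijkstra's algorithm over the mesh edges. Here is a minimal Python sketch of computing the shortest-path tree from a centre point and keeping only the edges that actually get walked (the tiny graph at the bottom is just for illustration; in practice the edge costs are the mesh edge lengths):

```python
import heapq

def shortest_path_tree(edges, start):
    """Dijkstra from `start` over a weighted edge list [(a, b, cost), ...].
    Returns {point: parent} so the walked edges can be reconstructed."""
    adj = {}
    for a, b, w in edges:
        adj.setdefault(a, []).append((b, w))
        adj.setdefault(b, []).append((a, w))

    dist, parent = {start: 0.0}, {start: None}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return parent

# Toy example with centre point 0; only the walked edges are kept,
# which is what produces the vein-like structure.
edges = [(0, 1, 1.0), (0, 2, 1.2), (1, 3, 0.8), (2, 3, 2.0), (3, 4, 1.1)]
tree = shortest_path_tree(edges, start=0)
walked = [(parent, child) for child, parent in tree.items() if parent is not None]
print(walked)
```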

And a second rendering with a different look:

CoralStructure2.jpg

Cheers,

Tom


And another version with the "leaf" left non-extruded but combined with the veins: :D

VeinedLeave.jpg

I tried for hours to get something fast and good-looking in Mantra, but the estimated render time with PBR and 1 diffuse bounce was always somewhere in the multiple-hour region (6-core i7 @ 4.1 GHz). And single-scatter SSS just doesn't look very good to me.
I find these very long render times especially hindering in the development stage, where it's hard to get a feel for how the whole thing will look.
I have learned a lot about Mantra in the meantime and it's really flexible and probably great for larger companies, but for my setup it's not really usable.

So in the end I once again exported to Thea Render (~700 MB OBJ), where I get a preview I can judge within a couple of seconds, and the above unbiased render with DOF, 5 bounces and translucency in 25 minutes.
After only 10 minutes the render already looked much better than the 3+ hour Mantra result; I let it cook another 15 to get rid of some fine grain in the DOF and the more occluded areas.

I'm not so much bothered by the final render times (except for animations) as by the time it takes to get a first complete image that is good enough to actually see what you are working on.
I used to work with Mental Ray in Softimage XSI a lot, and back then long render times and endless parameter tweaking were the norm, but nowadays I find them harder and harder to stand.

This is Thea Render 10 seconds after hitting render, using the unbiased "Presto MC" engine with 5 bounces, translucency and DOF, one HDRI environment as light:

Thea.jpg

I can already tell perfectly well where things are going and see enough detail to fluidly optimise my shaders and settings.

This is Mantra 15 seconds after hitting render with 1 diffuse bounce, no DOF and single scattering with the same HDRI as environment light:

Mantra.jpg

This is Mantra after 4 minutes:

Mantra4.jpg

I can at least start to imagine that there are veins and how the lighting looks...

So GPU (GTX 980 Ti & GTX 660 Ti) and CPU rendering together really make a major difference in turnaround times, especially for a freelancer with no render farm... ;-)

Cheers,

Tom



And some more work.

Here I experimented some more with shortest paths, directing the growth via custom cost attributes so that the branches do not go straight to the endpoints but prefer the centre:

DirectedGrowth.jpg
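The "directed" part doesn't need anything more than changing the edge costs that go into the same shortest-path search: edges near the centre (or whatever attractor you like) get discounted, so the walked paths detour through it before fanning out to the endpoints. Reusing the `shortest_path_tree` sketch from the coral post above, the cost attribute might be built roughly like this - the exact falloff is pure taste, and the function is only a hypothetical illustration:

```python
import numpy as np

def biased_costs(points, mesh_edges, attractor, bias=4.0):
    """Turn plain edge lengths into costs that prefer edges near `attractor`.

    points:     (N, 3) array of point positions
    mesh_edges: list of (a, b) point-index pairs
    attractor:  position the branches should be drawn towards
    bias:       how strongly nearness to the attractor discounts an edge
    """
    costs = []
    for a, b in mesh_edges:
        length = np.linalg.norm(points[a] - points[b])
        mid = 0.5 * (points[a] + points[b])
        nearness = 1.0 / (1.0 + np.linalg.norm(mid - attractor))
        # Edges close to the attractor become cheap, far edges keep ~length.
        costs.append((a, b, length / (1.0 + bias * nearness)))
    return costs
```

Feeding the result of `biased_costs(...)` into `shortest_path_tree(...)` then gives the centre-hugging branches.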

 

A similar test on a sphere with happy-accident leaves that resulted from converting the triangulated mesh to NURBS curves. I also added sliders to control random deletion of polygons on the underlying mesh to get more interesting paths:

SphericalLeaves.jpg

 

Something completely different - a reaction-diffusion structure created in "Ready" (https://t.co/t6Br2BuxD4) and rendered as displacement in Mantra:

ReactionDiffusionDisplaced.jpg
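Ready supports Gray-Scott style reaction-diffusion (among other rules), and the update itself is only a few lines, so here is a small numpy version for anyone who would rather bake such a pattern themselves before using it as a displacement map. The parameter values are just one common spot-forming preset, not what I used in Ready:

```python
import numpy as np

def gray_scott(n=256, steps=5000, Du=0.16, Dv=0.08, F=0.035, k=0.060):
    """Gray-Scott reaction-diffusion on an n x n grid with wrap-around borders.
    Returns the V concentration, which can be written out as a height map."""
    U = np.ones((n, n))
    V = np.zeros((n, n))
    # Seed a small square in the middle to kick the reaction off.
    c = n // 2
    V[c - 5:c + 5, c - 5:c + 5] = 0.5
    U[c - 5:c + 5, c - 5:c + 5] = 0.5

    def lap(A):
        return (np.roll(A, 1, 0) + np.roll(A, -1, 0) +
                np.roll(A, 1, 1) + np.roll(A, -1, 1) - 4 * A)

    for _ in range(steps):
        uvv = U * V * V
        U += Du * lap(U) - uvv + F * (1 - U)
        V += Dv * lap(V) + uvv - (F + k) * V
    return V

height = gray_scott()
```

The resulting `height` array can be saved as a greyscale image and plugged in as the displacement texture.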

 

Experimenting with point-advected lines again, this time trying a very stylised look in black and white. The image reacts badly to scaling in the browser, so you may want to look at the original.

"The big Divide":

TheBigDivide.jpg

 

And finally, Reda Borchardt's chain-link tutorial applied to some differential line growth... ;-)

Chains.jpg

These are all rendered in Mantra.

Cheers,

Tom


Thanks, Jason!

In theory I'm okay with Mantra. By now I can see how powerful it is and how much ground it covers - no arguments there.
I still prefer the unbiased way of working and the physically correct way Thea's materials look right out of the box.
Mantra is more like Mental Ray in that regard and needs a lot of tweaking and work to get to that point. I did that for many, many years and can happily live without it... ;-)

My main problem is that with an Indie license I'm tied to one Mantra instance, and it's simply too slow for me to do animations. The SmokeLines animation rendered for more than four days, occupying my main machine, and I cheated on quality and length. The differential line growth chain above took 3.5 hours to render. Such an image would render in about 2-10 minutes in Thea and would look better. Sure, I could do DOF in post, but I prefer doing it all in one go if I can, especially with stills.
So for my experiments and learning Mantra is okay and renders can run overnight, but I would be rather hard pressed in an actual job with a tight deadline and the equipment I have.
In the end it comes down to funds - either get a full commercial license and a couple of render slaves, or one extremely beefed-up 500-core machine... ;-)
But so far I don't make any money with Houdini.

It's somewhat funny that neither the Octane nor the Redshift demo really convinced me either...
Takua Render may become interesting in the future - the author hints at the possibility of a Houdini integration: http://www.yiningkarlli.com/projects/takuarender.html
AMD's ProRender is currently being integrated into Rhino and C4D; I guess they have their hands full with that for the time being.

Anyway, sorry for my ramblings and thanks again for the encouragement! :-)

Cheers,

Tom


I only have one Mantra token - isn't that thread talking about simulation? Are you sure that's also a solution for rendering?
But I don't have a render farm anyway ;-)

After using Thea for pretty much all my work for about 5 years, I find biased rendering (which I used for the 15+ years before that) a step backwards.
While Redshift makes more sense for Houdini than Octane, it's not my favourite solution.

Cheers,

Tom


It seems you are right. I just can't find any coherent information on that mysterious "Houdini Engine" that is supposed to be free with Indie.
And the setup of network rendering looks like a major nightmare from the '90s... ;-)

I'll see if I can find out more.

Cheers and thanks for the hint.

Tom


Thank you for your comment about Redshift. I know that biased is bad :( but it is faster, has a better UI and reads Houdini attributes. I only do motion design (nothing photoreal). Well, it's hard to find the perfect solution on a good budget.

As for Engine, you may have overlooked the dropdown when buying Indie, because the benefits of Engine are maybe not obvious. Distributed rendering is quite mysterious to me too :) See the attachment. I believe SideFX will add those Engine licences to your account for free.

" One Mantra token is included with each of the products listed above. "

engine.PNG

