
Simulation performance on a low-budget system


henrik.g

Recommended Posts

Hey everyone!

 

I have some issues simulating on my computer and also some ideas on how to resolve them, but I wanted to check with you guys whether those would be useful.

 

 

First off, I am on an

i7 @ 3.7 GHz

12 GB RAM

and a graphics card not worth mentioning

 

So not the best starting point at all, I guess, but anyway...

 

So when I do very small simulations (FLIP or pyro or smoke, it does not matter) things start out just fine. Then I slowly decrease the division size, and at some point there seems to be a threshold at which things suddenly get super slow.

 

It can be a change from 0.018 to 0.0175, just as an example.

 

So I checked my Task Manager, and while beforehand the CPU was working at 90-100% most of the time, suddenly a lot of the time it is down to 10% or even less.

Also, my hard disk is working like crazy.

 

So I redirected my cache to a non-system drive with 1.5 TB of free space, and it is perfectly defragmented (at least Defraggler says so).

 

But of course Houdini now writes its cache files and its other files to one and the same disk, so it is still terribly slow.

 

 

Now I am thinking the bottleneck seems to be a hard drive issue...

So would it be useful to get something like a 60 GB SSD for 50 Euros and just direct the cache there?

Or should I invest in more RAM? And if so, how much RAM is appropriate?

 

As the title says, I am on a budget and can't invest in an all-new system at this point.

 

Any help would be very much appreciated!!

 

 


The problem you are having is not related to your hard drive speed. It is a problem with running out of RAM. As your sim size increases it needs more RAM to calculate, and once it maxes out the RAM it will start swapping (using the hard drive as RAM), which gets ridiculously slow. Looking at your setup, I would guess you are limited to 32 GB of RAM.
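To see why a tiny division-size change can suddenly push you into swapping, note that voxel count grows with the cube of (container size / division size). Here is a rough back-of-the-envelope sketch; the 10 m container, the 4 fields, and the 4 bytes per voxel are illustrative assumptions, not numbers from the original post:

```python
# Rough estimate of voxel-grid memory for a sim container.
# Assumes a cubic container and 32-bit floats; real sims store
# several fields (density, vel.x/y/z, ...) and extra solver buffers.

def grid_memory_gb(container_size, division_size, fields=4, bytes_per_voxel=4):
    voxels_per_axis = round(container_size / division_size)
    total_voxels = voxels_per_axis ** 3          # cubic growth
    return total_voxels * fields * bytes_per_voxel / 1024 ** 3

# The jump from 0.018 to 0.0175 adds roughly 8-9% more voxels, which
# can be exactly what pushes the sim from fitting in RAM to swapping.
before = grid_memory_gb(10.0, 0.018)    # ~2.56 GB
after = grid_memory_gb(10.0, 0.0175)    # ~2.77 GB
print(f"{before:.2f} GB -> {after:.2f} GB")
```

The absolute numbers depend entirely on the assumed container size and field count; the point is the cubic scaling, which makes the "sudden threshold" behaviour you saw inevitable once you cross your physical RAM.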

 

I am also guessing that you are running Windows. If you plan on doing a lot of simulations, you may want to consider switching to Linux: when running large simulations on Windows, Houdini tends to have problems offloading RAM.

 

So I would invest in as much ram as your machine can hold. 


I am also guessing that you are running Windows. If you plan on doing a lot of simulations, you may want to consider switching to Linux: when running large simulations on Windows, Houdini tends to have problems offloading RAM.

 

If you are talking about the new Houdini 13 memory allocator being broken on Windows, I believe it's fixed in later versions...

http://www.sidefx.com/index.php?option=com_forum&Itemid=172&page=viewtopic&p=141990#141990

If not then ignore :)


I agree with the other posts; Linux is a good way to go if you want to make the most of your available RAM. Windows will eat up 2 GB just to run, although I've seen versions of Windows such as Windows Server 2012 use far less RAM than consumer-level versions. I'd recommend giving CentOS a try, or Ubuntu. These distros can be run off a live CD or live USB, so you can try them before you install. You could also try streamlining your current Windows install if Linux is out of the question, such as limiting unnecessary background services, etc. There are lots of docs online.

Edited by rhussain

I am also curious how to handle a large simulation scene, so let's imagine a scenario:

 

- we have 32 GB of RAM

- we are on a platform that eats less RAM, let's say Linux

- we have a simulation scene 300 frames long

- after 50 frames of running the simulation, all 32 GB of RAM is filled up :o

 

So what is the real workflow and solution? 192 GB of RAM? :blink:

- Is it possible to run our simulation in 6 steps of 50 frames each? So in each step Houdini knows it should start from the last frame of the previous step (step 2 starting from frame 50, for example), and in each step Houdini knows not to calculate the earlier frames again.

I guess making an initial state from the last frame of each step is a story in itself.
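Houdini's DOP networks do support this pattern (checkpoint files, and an initial-state parameter to resume from a saved frame). The control flow can be sketched in plain Python; `simulate_step` below is a hypothetical stand-in for one solver step, not Houdini API:

```python
# Sketch of running a 300-frame sim as 6 chunks of 50 frames, where each
# chunk resumes from the previous chunk's saved state, so no frame is
# ever recalculated.

def simulate_step(state):
    # Hypothetical stand-in for one solver step; a real sim would advance
    # the fluid fields, here we just count frames.
    return state + 1

def run_chunk(start_state, n_frames):
    state = start_state
    for _ in range(n_frames):
        state = simulate_step(state)
    return state  # this becomes the "initial state" checkpoint for the next chunk

checkpoint = 0                  # state before frame 1
for chunk in range(6):          # 6 chunks x 50 frames = 300 frames total
    checkpoint = run_chunk(checkpoint, 50)
    # In Houdini you would write this checkpoint to disk here, quit the
    # session to free all RAM, and resume the next chunk from the file.
print(checkpoint)
```

Note that chunking only helps the RAM problem if memory grows over the course of the sim (e.g. an expanding cache); a single frame that doesn't fit in RAM still won't fit no matter how you split the frame range.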

 

I am not an expert in simulation, but I don't think spending more money on hardware will fix every issue; there should be a workflow for it. I hope you discuss it and open it up for me too.

 

Thanks

Edited by avak

When running sims, it helps to really know what you are going to need. For example, if you are running a smoke simulation without fire, you can get rid of the fields you don't need. Since the smoke shader is based on density, you don't need heat, temperature, and fuel. And if you aren't using vel and rest, you can get rid of them as well. This saves hugely on disk space, and it helps with RAM too. I did a big smoke plume and the difference was; with all fields: 2.68 GB/frame, and with just density: 138 MB/frame.
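As a rough illustration of why pruning fields pays off so much: per-frame cache size scales linearly with the number of stored volume components, and vector fields like vel and rest cost three scalar volumes each. The field list, the 400^3 resolution, and the uncompressed 32-bit-float assumption below are all illustrative (real caches also compress):

```python
# Per-frame cache size for a smoke sim, with and without optional fields.
# vel and rest are vector fields, i.e. 3 scalar volumes each.

FIELD_COMPONENTS = {"density": 1, "heat": 1, "temperature": 1,
                    "fuel": 1, "vel": 3, "rest": 3}

def frame_size_mb(resolution, fields, bytes_per_voxel=4):
    voxels = resolution ** 3
    components = sum(FIELD_COMPONENTS[f] for f in fields)
    return voxels * components * bytes_per_voxel / 1024 ** 2

res = 400  # a 400^3 voxel container, purely for illustration
all_fields = frame_size_mb(res, list(FIELD_COMPONENTS))   # 10 components
density_only = frame_size_mb(res, ["density"])            # 1 component
print(f"all fields: {all_fields:.0f} MB, density only: {density_only:.0f} MB")
```

With this field list the full cache is 10x the density-only cache, which is in the same ballpark as the ~20x difference reported above once compression and extra fields are factored in.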

 

Other things that help: write out the fuel/sources for your sim first, and also write out any collision objects first.


Who's using the Primitive/Volumes/Taper option? It's meant to be used to match the camera frustum, keeping important volume detail closer to the camera.


Hey,

 

While we are at it:

I have read many times that your container should never contain more voxels than the pixels rendered later as the final output, because that would be inefficient.

Am I correct here?

 

Would the correct way to "calculate" this be:

 

Take a container of 1 m that perfectly fills the "screen size".

Assuming I would render at a resolution of 500 x 500 pixels:

1 / 500 = 0.002 as the smallest division size?

Would it be "that easy"?
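The arithmetic in the question is just the geometry of the rule: if the container exactly spans the frame, one voxel per pixel means dividing the container size by the pixel count. A tiny helper, stating only that geometry and not endorsing the rule itself (see the replies below):

```python
# Division size at which one voxel maps to one pixel, assuming the
# container exactly fills the frame along that axis.

def one_voxel_per_pixel(container_size_m, pixels_across):
    return container_size_m / pixels_across

# 1 m container rendered across 500 pixels -> division size 0.002
print(one_voxel_per_pixel(1.0, 500))
```

If the container only fills part of the frame, you would scale by that fraction; and as the replies point out, oversampling and undersampling relative to this number are both legitimate, case-by-case choices.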

 

Greetings! And many thanks in advance!


Nothing in VFX is ever "easy" :)

It really depends on the sim. Denser pyroclastic smoke benefits from more voxels, but factoring in motion blur, film grain, lighting conditions and other things means you can get away with fewer. On the other hand, try rendering at 1x1 pixel samples in Mantra compared to 4x4 (essentially rendering at 4x resolution) and it will look a lot better. So I'd forget that rule, test out your sim, and make a case-by-case decision.


?

 

Rendering at 1x1 would look better than rendering at 3x3? Or 4x4?

Why would that be?

 

I mean, I will of course give it a try!

 

 

And many thanks for the fast answer! I will keep that in mind!

Sorry, I meant 4x4 would look better than 1x1, as you are oversampling the image. The same logic could apply to voxels in theory, so an absolute limit of 1 voxel per pixel doesn't hold. You could go higher and you would get more quality, if needed. Not that I'm saying to do that! Just go as low as you can get away with :) The more tricks you use to reduce your voxel count, the better ;)

