Thanks, yeah, I assumed it should've been mantra only, but I got worried after seeing the numbers. Also, I didn't render from IFD, as I really don't know how to use it yet! The problem seems to have resolved itself after the render completed, though; image below.

Very interesting article, I'll be going over it soon, thanks! I'll also check out the memory hscript command you mentioned. Is the Alfred-style progress supposed to show the memory available to mantra as well?

====================================================================

So after the render finished, and this is the day after, it seems everything has gone back to normal! (image attached)
Running Windows 7 Professional x64, Houdini 12.5.533 (latest version). I've been rendering this simulation for some time now.

What I noticed is that Windows has allocated around 14GB to cached memory, and it stays around there. From what I've been told, this is standby memory and should be handed over to a program if it needs it. In my render settings I set the max cache ratio to 0.9, so shouldn't it all be allocated to the Houdini render? My CPU is maxed out the whole time, but the RAM keeps floating around 14GB.

My actual simulation (rop_geo) was around 6GB, so even taking that into account I should still have 8GB left. I even closed almost everything down and still didn't see much change.

I tried stopping my render and clearing things out via the Houdini Cache Manager, but I still had at least 13GB or so of cached physical memory. I'm not sure if it's Maya, but its cache wouldn't come close to 2GB when it isn't even open, right? I cleared the After Effects cache files as well. Other than that, I only really run Nuke, and I know it doesn't take up that much.

http://i296.photobucket.com/albums/mm182/A_A_M_I_R/houdini_cache.jpg

Any ideas?
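For what it's worth, here's a rough back-of-the-envelope sketch of what a max cache ratio of 0.9 could imply, *assuming* the ratio is interpreted as a fraction of physical RAM the renderer's cache is allowed to occupy (that interpretation, and the 16GB machine size, are my assumptions, not anything confirmed by the thread):

```python
# Hypothetical illustration only, NOT mantra's documented internals:
# if "max cache ratio" caps the render cache at that fraction of
# physical RAM, the ceiling the cache could grow to would be:

def cache_ceiling_gb(physical_ram_gb: float, max_cache_ratio: float) -> float:
    """Upper bound (in GB) the cache may reach under the given ratio."""
    return physical_ram_gb * max_cache_ratio

# Assumed example: a 16GB machine with the ratio from the post (0.9)
# gives a ~14.4GB ceiling, which is in the ballpark of the ~14GB of
# cached memory described above.
print(cache_ceiling_gb(16, 0.9))
```

If that reading is right, seeing cached memory hover around 14GB wouldn't necessarily mean anything is leaking; it would just be the cache sitting near its configured ceiling until something else demands the RAM.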