drossxyu Posted March 12, 2023

Houdini 19.5.435
Rocky Linux release 8.6
HtoA 6.1.4.1
Arnold 7.1.4.1
Deadline 10.2

Our studio is working on a project that has two animal heads with full grooms. We've started to run into potential memory issues on our farm and are trying to find the most efficient way to render out our shots. The scene renders fine locally, both with and without progressive rendering, and was rendering fine on our farm up until recently.

A little bit about the scene and the workflow:

Each animal head is composed of 5-6 groom objects. For each shot, the grooms are run through a guide deform, and select groom objects are run through a guide sim, which is cached. From there, we cache out each hair generate node as a bgeo sequence for lighting.

The primary hairgen caches for each animal are huge. A 100-frame sequence is 190.22 GB (almost 2 GB a frame), 76 million points. Attributes are minimal (Cd, curveu, P, v and width on points; clumpid, id and uv on prims). The rest of the hairgen caches are much smaller and render just fine on the farm -- so the primary ones must be what pushes it over the edge.
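For context, a single frame of one of these caches can be inspected in hython to confirm the point count and attribute layout. A minimal sketch, assuming the caches are plain bgeo/bgeo.sc files on disk; the cache path below is a placeholder and it only loads one frame:

```python
# Minimal cache audit, meant to be run in hython so the hou module is available.
# The cache path is a hypothetical placeholder -- substitute one frame of the
# primary hairgen sequence. A single frame is usually representative of the
# whole sequence's attribute layout.
import hou

cache_frame = "/proj/cache/hairgen_primary/hairgen_primary.0001.bgeo.sc"  # placeholder

geo = hou.Geometry()
geo.loadFromFile(cache_frame)

print("points:", geo.intrinsicValue("pointcount"))
print("prims: ", geo.intrinsicValue("primitivecount"))

for attrib in geo.pointAttribs():
    print("point attrib:", attrib.name(), attrib.dataType(), "size", attrib.size())

for attrib in geo.primAttribs():
    print("prim attrib: ", attrib.name(), attrib.dataType(), "size", attrib.size())
```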
Our lighting scenes contain File SOPs that reference each of these bgeo sequences for convenience. We've tried migrating the original hairgens into the lighting file to avoid hairgen caching, but it became difficult to manage since each hairgen has a SOP network that needs to be kept up to date. The groom itself is not very portable since it requires juggling many object-level nodes, and we've yet to determine a great way to pipeline it all.

The other odd thing is that the specs on the farm nodes that are failing seem more than adequate, and the log indicates that they have plenty to spare:

Worker Name: xxxxx
Version: v10.2.0.10 Release (3b87216c7)
Operating System: Linux
Machine User: xxxxxx
IP Address: xxxxx
MAC Address: xxxxxx
CPU Architecture: x86_64
CPUs: 96
CPU Usage: 0%
Memory Usage: 1.6 GB / 184.4 GB (0%)
Free Disk Space: 26.002 GB
Video Card:

This is the memory error that we're getting in the same log for the Arnold task:

======================================================= Error =======================================================
Error: OutOfMemoryException : Exception of type 'System.OutOfMemoryException' was thrown.
   at System.Text.StringBuilder.ToString()
   at Deadline.IO.PathMappingUtils.c(String bxl, String bxm, String bxn, String[] bxo, String[] bxp, IList`1 bxq, Boolean bxr, DataController bxs, GenericDelegate1`1 bxt, String bxu, OS bxv)
   at Deadline.IO.PathMappingUtils.CheckPathMappingInFileAndReplace(String inFileName, String outFileName, String forceSeperator, String[] stringsToReplace, String[] newStrings, DataController dataController, GenericDelegate1`1 logFunction, String regionID, Boolean readFileAsBytes)
   at Deadline.IO.PathMappingUtils.CheckPathMappingInFileAndReplace(String inFileName, String outFileName, String forceSeperator, String[] stringsToReplace, String[] newStrings, DataController dataController, GenericDelegate1`1 logFunction, String regionID)
   at Deadline.IO.PathMappingUtils.CheckPathMappingInFileAndReplace(String inFileName, String outFileName, String[] stringsToReplace, String[] newStrings, DataController dataController, GenericDelegate1`1 logFunction, String regionID)
   at Deadline.IO.PathMappingUtils.CheckPathMappingInFileAndReplaceSeparator(String inFileName, String outFileName, String separatorToReplace, String newSeparator, DataController dataController, GenericDelegate1`1 logFunction, String regionName)
   at Deadline.Scripting.RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator(String inFileName, String outFileName, String separatorToReplace, String newSeparator)
   (Python.Runtime.PythonException) File "/var/lib/Thinkbox/Deadline10/workers/render20166/plugins/640ddfe4df22acd4b3cdd006/Arnold.py", line 88, in RenderArgument
       RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator( filename, localFilename, "\\", "/" )
   at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
   at __FranticX_GenericDelegate0`1[[System_String, System_Private_CoreLib, Version=6_0_0_0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e]]Dispatcher.Invoke()
   at FranticX.Processes.ManagedProcess.RenderArgument()
   at Deadline.Plugins.DeadlinePlugin.RenderArgument()
   at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

Does anyone have any insight into how to troubleshoot hair/fur/groom scenes like this? Is there a better way to optimize our groom caches for rendering with Arnold? And given that the shots render locally on lower-spec machines than what we have on our farm -- does that make any sense?

Cheers!
Atom Posted March 12, 2023

You might want to double check the reported error:

    CheckPathMappingInFileAndReplaceSeparator( filename, localFilename, "\\", "/" )

Looks like an OS pathing error to me. Although only 26 GB of free disk space might also be the culprit..?
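Judging from the trace, the exception is thrown inside Deadline's path-mapping step (StringBuilder.ToString() under PathMappingUtils.CheckPathMappingInFileAndReplace, called from Arnold.py's RenderArgument), which suggests the whole render file is being rewritten in memory before Arnold even starts. Purely as an illustration of the difference -- this is not the plugin's actual code, and the file names are made up -- the same separator replacement could be done line by line so memory use stays flat no matter how large the file is:

```python
# Illustrative sketch only -- not the Deadline plugin's implementation.
# Rewrites path separators in a (potentially huge) text file line by line,
# so memory use stays roughly constant regardless of file size.
# Assumes the input is readable as text; file names are placeholders.

def map_separators_streaming(in_path, out_path, old_sep="\\", new_sep="/"):
    with open(in_path, "r", errors="replace") as src, open(out_path, "w") as dst:
        for line in src:
            dst.write(line.replace(old_sep, new_sep))

if __name__ == "__main__":
    map_separators_streaming("/tmp/shot010.ass", "/tmp/shot010_mapped.ass")
```

If that in-memory rewrite is indeed the failure point, it would also fit with the smaller hairgen caches rendering fine while only the heavy ones fail.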
drossxyu Posted March 12, 2023 (Author)

I was able to run an older set of caches that aren't nearly as large per frame (less hair, fewer points per hair) and it's working fine. The paths are in the same format, so despite that weird path mapping line, it shouldn't be an issue.

It seems to me that the cache is just too huge to fit into RAM, even though these are nodes with 184 GB. That seems absurd given that it is just one piece of one character, and the log indicates that it's hardly using any RAM at all.

The disk space thing is indeed strange, but how much should it need for a single frame?
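On the "how much should a single frame need" question, here is a back-of-envelope estimate from the numbers in the first post (190.22 GB over 100 frames), assuming the 76 million points are per frame and that the listed point attributes are standard 32-bit floats:

```python
# Back-of-envelope estimate from the figures quoted above.
# Assumes 32-bit floats for all point attributes and 76M points per frame;
# prim attributes (clumpid, id, uv) are ignored, so the real footprint is a bit higher.

frames           = 100
sequence_gb      = 190.22
points_per_frame = 76_000_000

# Per-point payload: P (3), v (3), Cd (3), width (1), curveu (1) = 11 floats
floats_per_point = 3 + 3 + 3 + 1 + 1
bytes_per_point  = floats_per_point * 4

on_disk_per_frame_gb    = sequence_gb / frames
raw_points_per_frame_gb = points_per_frame * bytes_per_point / 1e9

print(f"on disk per frame:      {on_disk_per_frame_gb:.2f} GB")    # ~1.90 GB
print(f"raw point data / frame: {raw_points_per_frame_gb:.2f} GB")  # ~3.34 GB
```

If those assumptions hold, a single frame of raw point data is only a few gigabytes, so whatever is running out of memory on a 184 GB node is most likely holding far more than one frame at once -- or, per the trace above, choking on the exported scene file rather than on the frame's geometry itself.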