andyhowell Posted March 21, 2022

I'm working on the destruction of a very large poly-count model (around 1-2 million points at least). I've found that the usual File Cache bgeo.sc method is quick and easy to load back, with decent responsiveness, but of course that's no good for Maya. I've looked at FBX export, since UVs need to be maintained, and also at Alembic caching, which ends up with huge file sizes and usually brings Maya to a halt. I've also tried caching everything with the bgeo.sc method, then splitting smaller groups of geometry out of the simulation and caching those separately as Alembics. Just wondering if anyone can point me in the right direction on what best practice is for this type of thing. Thanks in advance.
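The "split into smaller groups and cache separately" idea above could be sketched like this. This is a hypothetical, tool-agnostic Python sketch (names like `piece0` and `batch_size` are illustrative): in Houdini you would read the fractured pieces' `name` attribute and drive one ROP Alembic Output per batch with a group expression, rather than generating names in a loop.

```python
def batch_pieces(piece_names, batch_size):
    """Split a list of piece names into fixed-size batches, one batch per cache file."""
    return [piece_names[i:i + batch_size]
            for i in range(0, len(piece_names), batch_size)]

# Illustrative example: 10 fractured pieces, at most 4 pieces per Alembic cache.
pieces = [f"piece{i}" for i in range(10)]
batches = batch_pieces(pieces, 4)

for idx, batch in enumerate(batches):
    # Each batch would become the group parameter of its own export node,
    # e.g. a name-attribute expression like "@name=piece0 @name=piece1 ...".
    group_expr = " ".join(f"@name={n}" for n in batch)
    print(f"batch_{idx:02d}.abc -> {group_expr}")
```

Smaller per-batch files load incrementally in Maya instead of one monolithic cache stalling the scene, at the cost of managing more files.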
sebkaine Posted March 21, 2022 (edited)

I would try to feed the render engine directly, without any intermediate step inside Maya. If you use Arnold, I recommend a per-frame Alembic sequence loaded directly with the Arnold stand-in for geo, points and curves: https://docs.arnoldrenderer.com/display/A5AFMUG/An+Introduction+to+Stand-ins and the Arnold volume node for VDB loading: https://docs.arnoldrenderer.com/display/A5AFMUG/Volume

Edited March 21, 2022 by sebkaine
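A per-frame sequence like the one suggested above is just a set of frame-padded files that the stand-in resolves at render time. A minimal sketch of generating those paths (the base name, frame range, and 4-digit padding are assumptions, not a fixed convention):

```python
def per_frame_paths(basename, start, end, pad=4):
    """Build frame-padded Alembic paths, e.g. destruction.0001.abc ... destruction.0024.abc."""
    return [f"{basename}.{frame:0{pad}d}.abc" for frame in range(start, end + 1)]

# A stand-in would typically point at a single tokenized path such as
# "destruction.####.abc" and substitute the current frame per render.
for path in per_frame_paths("destruction", 1, 3):
    print(path)
```

Because each frame is its own file, Maya never has to hold the whole animated cache in memory; the renderer streams one frame's geometry at a time.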
Sepu Posted March 22, 2022

You can also try loading all this data through a Bifrost graph. 1-2 million points will be easily readable by Bifrost, plus you get all the attributes for shading or whatever else you need to do with them.