Raymond Chua Posted August 29, 2009 Hi, I wonder if anyone has seen this yet, but it is pretty awesome. It was done with Krakatoa particle rendering. But Houdini can do this too, right? Come on! Enjoy, Raymond
anamous Posted August 29, 2009 Sweet, nice color palettes. But yeah, that's very much possible with Houdini/Mantra. Rendering millions of points is very straightforward and efficient. The trick is in getting the points to light each other up, achieving false scattering and self-shadowing. Creating non-flickering shadow maps on millions of particles is a challenge, as they either consume insane amounts of memory or end up too blurry. But a little trickery can take you far:
- For shadows: convert your particles to a volume using the Particle Fluid Surface SOP followed by an IsoOffset, and perform the shadow map generation on the volume, which is fairly efficient in Mantra. Then render the particles using the previously generated shadow map. There's actually an even more efficient way of rendering the mesh as a volume without using the IsoOffset, and I'll show this sometime in the next few weeks.
- For self-lighting: delete 1 out of N particles (leaving you with a few thousand instead of millions), and then either instance lights that inherit their color and intensity from attributes on the particles, or write out the thousands of particles as a point cloud file, look up their color at render time, and use it to add fake lighting to the currently shaded point. Or convert the thousands of particles to a volume, transfer their color/intensity onto the voxel fields, and then look up those variables at render time using the Volume Sample VOP.
- For fake scattering: once you're doing either point cloud or volume lookups as mentioned above, one quick way is to measure the distance between the currently shaded point and the last neighbour point along the vector to a light source (either an actual light source or the next light point in your point cloud). Use this distance to modulate the lighting - the longer the distance, the less light comes through.
If you're using the method above to generate shadow maps, then depending on the volume shader you use for the shadow map pass, you might not even need to perform this step. It's all not as push-button as in, say, Krakatoa, but it is beautifully flexible and offers amazing possibilities where Krakatoa currently has its limits. cheers, Abdelkareem
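The decimation, point-cloud self-lighting, and distance-based fake scattering ideas above can be sketched in plain Python outside Houdini. This is a toy illustration, not Houdini/VEX code; the function names (`decimate`, `fake_lighting`, `distance_attenuation`) and the inverse-square falloff are my assumptions:

```python
import math

def decimate(points, keep_one_in_n):
    """Keep 1 out of every N particles to build a sparse 'light point' cloud."""
    return points[::keep_one_in_n]

def fake_lighting(shaded_p, light_points):
    """Sum contributions from the sparse light points with a soft
    inverse-square falloff. Each light point is (position, intensity)."""
    total = 0.0
    for pos, intensity in light_points:
        d2 = sum((a - b) ** 2 for a, b in zip(shaded_p, pos))
        total += intensity / (1.0 + d2)  # 1 + d^2 avoids blowup at d = 0
    return total

def distance_attenuation(dist, density=0.5):
    """Fake scattering: the longer the gap along the light vector,
    the less light comes through (Beer-Lambert-style extinction)."""
    return math.exp(-density * dist)
```

With a million-point cloud decimated to a few thousand light points, each shaded point only has to loop over the sparse set, which is the whole point of the trick; a real implementation would also use a spatial lookup (like Mantra's point cloud functions) instead of a linear scan.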
pclaes Posted August 29, 2009 Thanks for those tips, some interesting suggestions, especially regarding shadow map creation and scattering!
MENOZ Posted August 30, 2009 From what I've read in the Krakatoa docs http://software.primefocusworld.com/software/support/krakatoa/high_particle_counts_tutorial.php , it uses partitioning to generate millions of particles. It basically randomizes the seeds of the particles' position, velocity etc. and writes them to disk. I didn't understand whether it actually recalculates the simulation or just randomizes the one you wrote to disk, but it says: "By analyzing the design of your Particle Flow events, you can change specific random number seeds and calculate additional particles to add to your rendering." So I think it recalculates the simulation for each pass. I think that for an optimal result the particles would need to be resimulated each time (each pass, or partition), especially in the collision zones. I tried to think of a setup where you decide the number of passes you want and it automatically recalculates them, but I cannot find a way to calculate each pass separately, one at a time. Is there a way? I'm also very interested in the volume shadows AND the part where you say "convert the thousands of particles to a volume, transfer their color/intensity onto the voxel fields, and then look up those variables at render time using the volume sample VOP". Could you be more clear about the procedure? Thank you!
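The partitioning idea - rerun the same sim with a different random seed per pass and cache each pass to disk - reduces to a loop like the one below. This is a toy simulation sketch, not Houdini's POP solver or Krakatoa's Particle Flow; `simulate_pass` and `run_partitions` are illustrative names:

```python
import random

def simulate_pass(seed, num_particles, num_steps):
    """One full, independent re-simulation with its own seed. In Houdini,
    this would be one wedge iteration re-cooking the popnet."""
    rng = random.Random(seed)
    pts = [[rng.uniform(-1, 1), 0.0, rng.uniform(-1, 1)]
           for _ in range(num_particles)]
    for _ in range(num_steps):
        for p in pts:
            p[1] += rng.uniform(0.0, 0.1)  # crude upward advection
    return pts

def run_partitions(num_partitions, num_particles, num_steps):
    """Each partition is a fully recalculated sim; together they behave
    like one sim with num_partitions * num_particles points."""
    return [simulate_pass(seed, num_particles, num_steps)
            for seed in range(num_partitions)]
```

Because each pass depends only on its seed, the passes can be computed one at a time (or farmed out in parallel) and the results concatenated at render time, which is exactly what makes the scheme attractive.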
MENOZ Posted August 30, 2009 I tried to write a depth map for a volume and read it back using scattered particles. It seems to work very efficiently! Is it possible that I'm not missing anything?
Raymond Chua Posted August 30, 2009 Hey Menoz, How are you? I too believe that it is done by simulating a few passes. Each pass might contain millions of particles. I am currently trying to figure out the self-lighting and self-shadowing part which anamous mentioned earlier. If you don't mind, share your renders and workflows. I believe a lot of people will be interested in this. =) regards, Raymond
brianburke Posted August 30, 2009 "There's actually an even more efficient way of rendering the mesh as a volume without using the IsoOffset, and I'll show this sometime in the next few weeks." Yes please!! Thanks for the workflow tips.
Solitude Posted August 30, 2009 Hey guys... I'm a Krakatoa/Max user, and you guys are on the money with the partitioning thing. It has an automated (scripted) way of caching out those particles. It just changes the seeds, saves out the particles, then repeats. While you can easily cache out, say, a million particles per partition (which might be needed), you can quickly get more particles by sending off 10 batches of 100,000 particles to the render farm instead. I think I saw a thread somewhere about automating this in Houdini (though it may have been Maya). I haven't played with particles enough yet in Houdini, but in theory writing a simple Python script to cache out particles locally wouldn't be too hard; then you'd just have to script bringing all the sequences back in together to render at once. I know one person who is writing a PRT exporter for Houdini (he already did a CSV exporter), so you could render in Krakatoa if you wanted... There's some neat stuff you can do on PRT loaders in Krakatoa to bump up the particle count without resimming too... like you can add a noise modifier to a reference copy of the PRT loader, and tada, you have twice as many particles rendering, slightly offset from each other. ...I'm sure there's a way to do that in Houdini too... I just don't know enough about it yet.
Andz Posted August 30, 2009 The little I've seen and used of Krakatoa, it was more closely related to a renderer than a particle engine (I could be wrong, it was also an old version, probably a beta). What makes it possible to render so many particles is that it doesn't render geometry, only points. Because of that, one limitation was that you could not fly through a particle cloud or smoke, because the closer you'd get, the fewer points you'd see. Recently a new version came out, or is about to come out, that is also able to render volumetrics and material shading. There are other great images and animations in their gallery.
Solitude Posted August 30, 2009 Yes, it was originally just a point renderer, but they recently added voxel rendering, which effectively eliminates that limitation... at the cost of speed, of course. You can still choose between particle or voxel rendering, though it'd be cool if you could mix the two. EDIT: Oh, saw you mentioned the new version. =)
Andz Posted August 30, 2009 Well, since it only renders the particles, you'd still have to comp it in the end. So, comp the points and voxels if you wish :-)
Solitude Posted August 31, 2009 Yeah, that's exactly what I'm doing right now (literally right now), but the only issue is with overlapping particles, especially when we're talking about volumes' worth of particles... You have to get a little tricky sometimes to get it right without the particles just looking slapped on top of the voxels.
Andz Posted August 31, 2009 Please post the image/animation if you're allowed. I'd love to see it.
Solitude Posted August 31, 2009 I can't show it... but I'll let you know as soon as it's available on the web... The trailer should be out soon, I think. It's not anything to get too excited over... it's nothing too crazy... though it is a lot of particles.
MENOZ Posted September 3, 2009 Hello! I want to say that I found a way to write different simulations to disk. I really don't know why I didn't think of it before: simply using a Wedge ROP, you can render multiple simulations to disk, varying any parameter. Then you can render the different "partitions" using, say, Mantra delayed load. I know it can be done; now I will try and see which problems I run into handling a high number of sequences (or partitions, as Krakatoa calls them). Just wanted to share.
deecue Posted September 3, 2009 yes, this would be the easy part.. to get out your random sequences, just write out the same particle sim with a wedge, varying the seed of your popnet alone... the tricky part comes from the topics anamous commented on: shadows, self-shadowing, and scattering.. on top of that, you're going to have to manage a lot of data and memory.. both in disk output (removal of some attrs may be necessary to keep size down) as well as memory management in the render itself (even with delayed load).. a six second sequence with ten 100,000-particle sims (1 mil total) might easily run you 10 gigs just for the geometry and standard attrs.. then the renders themselves could skyrocket in ram usage depending on everything else going on. and that's only for a million particles. so you really want to break down the efficiency of it all to make it a useful tool.. even if you go the route of rendering each pass separately and comping them together in post, you're only faking the depth of each render placed on top of one another and won't get the proper interaction from sim to sim regarding shadowing and scattering.. interesting topic indeed..
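The 10-gig figure is easy to sanity-check with back-of-envelope arithmetic. The per-particle attribute count below is an assumption about an uncompressed bgeo cache (P, v, Cd plus a few scalars), not a measured value:

```python
def cache_size_bytes(num_particles, num_frames, floats_per_particle):
    """Uncompressed cache estimate: 4-byte floats per attribute component."""
    return num_particles * num_frames * floats_per_particle * 4

# 1M particles, 6 s at 24 fps = 144 frames,
# P(3) + v(3) + Cd(3) + a handful of scalars ~ 16 floats/particle (assumed)
frames = 6 * 24
size = cache_size_bytes(1_000_000, frames, 16)
print(size / 2**30)  # roughly 8.6 GiB, in the ballpark of the 10 gigs quoted
```

The same formula makes it obvious why dropping unneeded attributes before caching matters: every float you strip saves about half a gigabyte over this sequence.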
Ratman Posted September 3, 2009 Am I the only one who finds that a bit ridiculous? I've run PFlow sims emitting 40 million particles on my i7 with 6GB of ram, and it took less than 5 minutes and consumed no more than 1.5GB of ram. I know that SESI is working on multithreading and all that good jazz, and I hope it's the only priority right now. I don't think Houdini needs a new feature right now, just a boost in speed, and I'll be a happy camper.
deecue Posted September 3, 2009 10 gigs cached to disk (bgeo files, uncompressed).. not ram.. those ten 100,000-particle sims (1 mil) rendered in mantra fairly quickly on my very old machine and took up ~700mb of mem.. i think mantra handled it fine and would probably tear through a ton more, you'll just want to manage it more carefully once you get into the 50 million particle range (esp regarding multiple point attrs)..
Solitude Posted September 3, 2009 I just want to point out that this one PRT sequence (Krakatoa cache) I have here is 20 x 350,000-particle partitions, totaling ~7 mil particles on any given frame for about 60 or so frames, and it's only taking up 13 gigs of disk space. One of the really cool things, and again I'm sure you can do this in Houdini, is that you can make reference copies of this loader in the scene and have more "partitions" rendering just by offsetting the original one with a noise modifier on the fly. In Houdini it should be as simple as creating a file node, feeding its output through several different "noises", and merging them back together before the render. It'll be cool when Ratman writes that PRT exporter/importer for Houdini... might save some disk space.
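The "reference copy plus noise modifier" trick boils down to: load one cached point set, jitter a duplicate, merge. A minimal plain-Python sketch of the idea (not Houdini node code; `noise_duplicate` is an illustrative name):

```python
import random

def noise_duplicate(points, amplitude, seed=0):
    """Return the original points plus one jittered copy: twice the
    particle count from the same cache, no resim needed."""
    rng = random.Random(seed)
    jittered = [tuple(c + rng.uniform(-amplitude, amplitude) for c in p)
                for p in points]
    return list(points) + jittered
```

Calling it k times on the same cache gives (k + 1)x the particle count for the cost of one simulation; the jitter amplitude just needs to stay small relative to the particle spacing so the copies read as extra density rather than ghosting.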
MENOZ Posted September 3, 2009 Hello. Does anybody know a quick way to render, say, 20 partitions using Mantra delayed load? By "quick way" I mean something different from creating 20 (or more) Mantra delayed load nodes and setting the partition file name on each, etc. Any ideas?
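One way to avoid building 20 delayed-load setups by hand is to generate the per-partition cache paths (and, inside a live Houdini session, the nodes themselves via the `hou` Python module, not shown here) in a loop. The path template below is hypothetical, just to show the expansion:

```python
def partition_paths(template, num_partitions, frame):
    """Expand a path template like 'sim_part{part:02d}.{frame:04d}.bgeo'
    into one cache path per partition for a given frame."""
    return [template.format(part=i, frame=frame)
            for i in range(num_partitions)]

paths = partition_paths("sim_part{part:02d}.{frame:04d}.bgeo", 20, 1)
# 20 entries: sim_part00.0001.bgeo .. sim_part19.0001.bgeo
```

With the paths generated this way, a short script can loop over them and create one delayed-load geometry container per partition instead of clicking through 20 node setups.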