CSN0309 Posted December 29, 2009
Hi guys. Can I distribute particles from a surface that has already been displaced by a (VEX) displacement shader? I have applied a displacement shader to my surface, but I want to emit particles from the displaced surface, not from the original one. Any suggestions? Thank you.
Hazoc Posted December 29, 2009
I suppose scattering points onto your surface and displacing them in a VOP SOP, using the same displacement pattern as your displacement shader, could work. Using these displaced points as your particle source should do the trick. I can make you an example scene when I'm less busy.
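To make the suggestion above concrete, here is a minimal VEX sketch of the SOP-side displacement (written as wrangle-style code; the same thing can be built node-by-node in a VOP SOP). The amplitude, frequency, and plain noise pattern are placeholder assumptions; they must be kept in sync with whatever pattern your actual displacement shader uses.

```vex
// Scatter points on the surface first, then displace them with the
// SAME pattern the displacement shader uses (amp/freq are placeholders).
float amp  = 0.1;   // must match the shader's displacement amplitude
float freq = 10.0;  // must match the shader's noise frequency

float d = amp * noise(@P * freq);
@P += d * normalize(@N);   // push each point along its normal
```

The displaced points can then feed a POP network as the emission source. Keep in mind this is only a SOP-level approximation of the render-time displacement.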
quinniusmaximus Posted December 29, 2009
Could you use the VOP SOP as a way to create groups from which to birth your particles? Or, conversely, to group and delete the particles that are not displaced?
Hazoc Posted December 29, 2009
Could you use the VOP SOP as a way to create groups from which to birth your particles? Or, conversely, to group and delete the particles that are not displaced?
Was this assigned to me? Anyhow, sure, it's possible, and it should be done, since all the non-emitting points would just be useless geometry. This will also give you a better idea of the shape of the emitter.
CSN0309 Posted December 29, 2009 (Author)
Thank you both! Hazoc, could you explain it more specifically? I'm waiting for your example.
petz Posted December 29, 2009
Within your shader you could write out a point cloud of the displaced geometry and then use it as the particle source.
petz
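A minimal sketch of the approach petz describes: a VEX displacement shader that also records every shaded point into a point cloud with pcwrite(). The shader name, parameters, and the simple noise pattern here are illustrative assumptions, not the contents of petz's actual file.

```vex
// Displacement shader that writes the displaced positions to a .pc file.
// Every micropolygon vertex shaded at render time lands in the point
// cloud, which can then be loaded back in SOPs as a particle source.
displacement disp_with_pc(float amp = 0.1;
                          string pcfile = "disp_points.pc")
{
    P += amp * noise(P * 10) * normalize(N);  // placeholder pattern
    N = computenormal(P);                     // recompute the normal
    pcwrite(pcfile, "P", P, "N", N);          // record the shaded point
}
```

Because the points come from render-time dicing, this captures the full displacement detail, unlike the SOP-level approximation.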
edward Posted December 29, 2009
I'm with petz. The potential problem with displacing at the SOP level is that you won't get all the displacement detail that you get when displacing at render time.
CSN0309 Posted December 29, 2009 (Author)
petz, can you give me an example of how to use a point cloud from my displacement shader in a VEX network? I was thinking I could use point clouds to get SSS effects, but I never thought a point cloud could be used to emit particles.
petz Posted December 29, 2009
File is attached.
petz
disp_particles1.hip
Hazoc Posted December 30, 2009
I'm with petz. The potential problem with displacing at the SOP level is that you won't get all the displacement detail that you get when displacing at render time.
Sure. The approximation is coarse compared to the point-cloud way, and it's nice to have everything in one shader. BTW, is there any neat way to link the SHOP side with the VOP SOP in this case, if someone were to use the SOP method? Instead of maintaining two separate yet nearly identical shading networks.
edward Posted December 30, 2009
BTW, is there any neat way to link the SHOP side with the VOP SOP in this case, if someone were to use the SOP method?
No. If you have code, though, some preprocessor tricks could probably be used to minimize the code duplication.
CSN0309 Posted December 30, 2009 (Author, edited)
Thank you Hazoc, I already got it working with the method you suggested, but I have to keep the two separate displacement shaders in sync, which is not very convenient. Anyway, thank you, and thank you all for your suggestions. Thank you for your example, petz, I'm studying it.
Edited December 30, 2009 by CSN0309
ranxerox Posted January 5, 2010
Hey petz, very cool example, thanks. I took a look at your scene and was trying to make it work with an animated displacement. I was thinking I could just put a '$F' into pcwrite1's filename parameter, but it says you cannot have channels which depend on time. So I thought I'd add a parameter to the shader to take the current frame, and then compose the filename string out of this, but I'm a little stuck. I was looking for something like a 'sprintf' VOP but I can't find one. Any ideas? Or is there a better way?
thanks
-ranxx
petz Posted January 5, 2010
To use $F you need to feed a Parameter VOP into the filename of pcwrite. Have a look at the attached file.
petz
disp_particles_anim.hip
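In code form, the fix looks roughly like this: the frame number comes into the shader as a parameter (driven by $F where the shader is assigned, since the filename field itself cannot depend on time), and VEX's sprintf() builds the per-frame filename. The shader and parameter names are assumptions for illustration, not what petz's file actually uses.

```vex
// 'frame' is a shader parameter; bind it to $F on the material,
// because pcwrite's filename string cannot reference time directly.
displacement disp_pc_anim(float amp = 0.1;
                          float frame = 1;
                          string pcdir = ".")
{
    // build e.g. "./disp_0042.pc" for frame 42
    string pcfile = sprintf("%s/disp_%04d.pc", pcdir, int(frame));

    P += amp * noise(P * 10) * normalize(N);  // placeholder pattern
    N = computenormal(P);
    pcwrite(pcfile, "P", P);
}
```

This is the code equivalent of wiring a Parameter VOP into the filename input in the VOP network.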
ranxerox Posted January 5, 2010
Thanks a lot petz!
-ranxx
pclaes Posted January 6, 2010
Interesting use of pcwrite! And a nice way of reloading the File SOP in the ROP net! It seems camera-dependent, though. (So rendering the point cloud from a better-positioned camera that covers more of the surface will give you a better point cloud.) What would be the best way to get points on the entire surface? Is there perhaps a render property that will force the entire object to be rendered (like render backfaces or something similar)? Currently it seems only the polygons visible from the camera will generate points. Some other things I was wondering about:
- multiple cameras with multiple point clouds: merging them all together into one big point cloud and performing some filtering would probably solve most scenarios.
- or perhaps rendering in a different space, like UV space?
- what about an adaptive point cloud, with more points where there is more detail?
mawi Posted January 6, 2010
What would be the best way to get points on the entire surface? Is there perhaps a render property that will force the entire object to be rendered?
I think disabling vm_hidden on the Mantra node (add it first) will do this.
petz Posted January 6, 2010 (edited)
What would be the best way to get points on the entire surface? Is there perhaps a render property that will force the entire object to be rendered?
The Mantra ROP used for point-cloud creation has vm_hidden disabled, so it generates a point cloud for the entire geometry instead of just the visible part. You can find it under the Mantra render properties. In the example file you need to increase the shading rate to see points on the entire surface. To get a uniform distribution, add scanline measuring to your dicing parameters. But in any case, point-cloud generation is camera- and resolution-dependent, since the points are based on micropolygons as long as you are using micropolygon rendering.
What about an adaptive point cloud, with more points where there is more detail?
One possibility would be reducing the point cloud based on curvature in a post-process. The other one I can think of is using the irradiance cache to generate the point cloud, so you would get more points in areas with more detail. If you expose min and max spacing, you have great control over how the point cloud gets generated. The problem is that this needs one more step.
Example file is attached: disp_particles2.hip
Edited January 6, 2010 by petz
Raymond Chua Posted February 7, 2010
I'm sorry for this, I got lost. How did petz generate the point cloud file?
pclaes Posted February 7, 2010
How did petz generate the point cloud file?
It's done in the shader. He extended it by adding a pcwrite, so every part of the object that is being diced (every part of the object that is within the render region) will create a point cloud.