
How to emit particles from a displaced surface.


CSN0309


Hi guys,

Can I distribute particles from a surface that has already been displaced by a displacement shader (VEX)? I have applied a displacement shader to my surface, but I want to emit particles from the displaced surface, not from the original one. Any suggestions? (screenshot attached: post-4466-126207249199_thumb.jpg)

Thank you


I suppose scattering points onto your surface and displacing them with a VOP SOP, using displacement patterns similar to those of your displacement shader, could do the trick. Then use these displaced points as your particle source. I can make you an example scene when I'm less busy.
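A minimal sketch of that idea, written as a Point Wrangle running over points scattered onto the undisplaced surface (a VOP SOP with the equivalent nodes would do the same). The `freq` and `amp` parameters here are hypothetical placeholders; they must match whatever your actual displacement shader does, or the points will not land on the displaced surface.

```vex
// Point Wrangle over points from a Scatter SOP.
// freq/amp are placeholder channels -- they must match the values
// used in your VEX displacement shader, or the points won't line up.
float freq = chf("freq");
float amp  = chf("amp");

// Same pattern as the shader: noise evaluated at the rest position.
float d = amp * (float(noise(@P * freq)) - 0.5);

// Push the point out along its normal, just as the shader displaces P.
@P += normalize(@N) * d;
```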


Could you use the VOP SOP to create groups from which to birth your particles?

Or, conversely, to group and delete those particles that are not displaced?

Was this addressed to me? Anyhow, sure, it's possible and should be done, since all the non-emitting points would just be useless geometry. This will also give you a better idea of the shape of the emitter.
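A sketch of that grouping idea, again as a Point Wrangle with hypothetical `freq`/`amp`/`threshold` parameters: tag the points whose displacement exceeds a threshold, then delete everything outside the group with a downstream Blast/Delete SOP.

```vex
// Point Wrangle sketch: keep only points that are actually displaced.
// The threshold is a made-up parameter; tune it to taste.
float freq = chf("freq");
float amp  = chf("amp");
float d = amp * (float(noise(@P * freq)) - 0.5);

// Put displaced points into an "emit" group; a Blast/Delete SOP
// downstream can then remove everything not in that group.
if (abs(d) > chf("threshold"))
    setpointgroup(0, "emit", @ptnum, 1);
```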


I'm with petz. The potential problem with displacing at the SOP level is that you won't get all the displacement detail that you get when displacing at render time.

Sure. The approximation is coarse compared to the point-cloud way. And it's nice to have it all in one shader.

BTW, is there any neat way to link the SHOP side with the VOP SOP in this case, if someone were to use the SOP method? Instead of maintaining two separate yet pretty much identical shading networks.


BTW, is there any neat way to link the SHOP side with the VOP SOP in this case, if someone were to use the SOP method? Instead of maintaining two separate yet pretty much identical shading networks.

No. If you have code though, some preprocessor tricks could probably be used to minimize the code duplication.
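One sketch of that preprocessor idea, assuming you are writing VEX code rather than wiring VOPs: put the displacement function in a shared header and #include it from both the SHOP-side shader and a SOP-side Inline VOP or wrangle. The header name and function are hypothetical.

```vex
// my_displace.h (hypothetical shared header)
float my_displace(vector pos; float freq, amp)
{
    return amp * (float(noise(pos * freq)) - 0.5);
}

// In the displacement shader:
//   #include "my_displace.h"
//   P += normalize(N) * my_displace(P, freq, amp);
//
// On the SOP side (Inline VOP / wrangle):
//   #include "my_displace.h"
//   @P += normalize(@N) * my_displace(@P, freq, amp);
```

This way the pattern is defined once, and both networks stay in sync by construction.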


Thank you Hazoc, I already made it work with the method you suggested, but I have to keep the two separate displacement shaders identical, which is not very convenient. Anyway, thank you.

And thank you all for your suggestions :rolleyes:. Thank you for your example, petz, I'm studying it...


Hey petz, very cool example, thanks. I took a look at your scene and was trying to make it work with an animated displacement. I was thinking I could just put '$F' into the filename parameter of pcwrite1, but it says you cannot have channels which depend on time. So I thought I'd add a parameter to the shader to take the current frame, and then compose the filename string from that... but I'm a little stuck. I was looking for something like a 'sprintf' VOP but I can't find one. Any ideas? Or is there a better way?

thanks

-ranxx

file is attached
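For reference, VEX (the language, as opposed to the VOP node set) does have a sprintf() function, so if you can drop down to an Inline VOP or code, something like the following could build the per-frame filename. The `frame` parameter here is the hand-added shader parameter ranxx describes.

```vex
// "frame" is the shader parameter ranxx describes adding by hand.
// sprintf() is a VEX function; an Inline VOP can call it even if
// there is no dedicated sprintf VOP.
string fname = sprintf("pc_%04d.pc", int(frame));
// ...then pass fname to pcwrite() instead of a hard-coded path.
```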

petz


Interesting use of pcwrite! And a nice way of reloading the File SOP in the ROP net!

It seems camera dependent though (so rendering the point cloud from a better-positioned camera that covers more of the surface will give you a better point cloud).

What would be the best way to get points on the entire surface? Is there perhaps a render property that will force the entire object to be rendered (like render backfaces or something similar)? Currently it seems only the polygons visible from the camera will generate points.

Some other things I was wondering about:

- multiple cameras with multiple point clouds: merging them all together into one big point cloud and performing some filtering would probably solve most scenarios.

- or perhaps rendering in a different space, like UV space?

- what about an adaptive point cloud? More points where there is more detail.


What would be the best way to get points on the entire surface? Is there perhaps a render property that will force the entire object to be rendered (like render backfaces or something similar)? Currently it seems only the polygons visible from the camera will generate points.

I think disabling vm_hidden on the Mantra node (add it first) will do this.



The Mantra ROP used for point-cloud creation has vm_hidden disabled, so it generates a point cloud for the entire geometry instead of just the visible part. You can find it under the Mantra render properties. In the example file you need to increase the shading rate to see the points on the entire surface.

To get a uniform distribution, add scanline measuring to your dicing parameters.

But anyway, the point-cloud generation is camera and resolution dependent, since the points are based on micropolygons, as long as you are using micropolygon rendering.

- what about an adaptive point cloud? More points where there is more detail.

One possibility would be reducing the point cloud based on curvature in a post-process. The other one I can think of is using the irradiance cache for generating the point cloud, so you would get more points in areas with more detail. If you expose min and max spacing, you get great control over how your point cloud gets generated. The problem is that one more step is needed for this.

example file is attached

disp_particles2.hip



I'm sorry for this. I got lost. How did petz generate the point cloud file?

It's done in the shader; he extended it by adding a pcwrite.

So every part of the object that gets diced (every part of the object within the render region) will write points into the cloud.
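A rough sketch of that setup, assuming a simple noise-based VEX displacement shader. This is illustrative, not petz's actual code: displace P as usual, then record each shaded (diced) point into the cloud with pcwrite().

```vex
// Hypothetical displacement shader: displace, then dump each diced
// point into a point cloud file.
displacement disp_and_write(float freq = 1; float amp = 0.1;
                            string pcfile = "emit.pc")
{
    float d = amp * (float(noise(P * freq)) - 0.5);
    P += normalize(N) * d;
    N = computenormal(P);

    // One point per shaded micropolygon point ends up in the cloud,
    // which is why the result is dicing (camera/resolution) dependent.
    pcwrite(pcfile, "P", P, "N", N);
}
```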

