
Volume density from texture?


caskal

Hello magicians,

I saw a great image by Lee Griggs the other day and tried to replicate it. Here is the original effect:

[image: Lee Griggs' original volume render]

 

He did it with Arnold (I think in Maya) by projecting a texture into the volume density. Here is some explanation: https://support.solidangle.com/pages/viewpage.action?pageId=55710284

 

I didn't know how to project a texture into density, so I projected a texture onto geometry, then deleted by color and converted that into VDB > fog, but that way I don't get any depth. Any ideas? I've read about a rest field that works something like UVs on volumes, but I can't figure out how to use it.

Here are my attempts:

[images: attempts 1, 2 and 3]

 

Hip file attached,

Cheers!

 

volnoise3.hip

caskal

Figured it out. Not sure if this is the best option, but it seems to work: I scattered points and used a point cloud lookup to drive the density. I'm glad I finally understood the point cloud stuff :D
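Roughly, the kind of Volume Wrangle this ends up being (a sketch only; the exact names and setup in the hip may differ, and "radius"/"maxpoints" are just spare channels to tune):

// Volume Wrangle over the fog volume, second input = scattered points carrying Cd
int handle = pcopen(1, "P", @P, ch("radius"), chi("maxpoints"));
vector col = pcfilter(handle, "Cd");   // filtered average of nearby point colours
@density *= luminance(col);            // darker areas of the texture thin out the fog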

Hip attached in case someone finds it useful.

Cheers!

vden.jpg

voldensed.hip

ch3

There are many ways you can project onto a volume. The rest field is one of them and, as you mentioned, it can be used as UVs. I tend to skip that step and directly use P, fitted to the bounding box of the desired projection.

It's easier to try that in a Volume VOP to begin with. Let's say you want to project along the Y axis between X and Z values of -10 to 10. All you need to do is fit the X and Z values within that range so you get a 0-1 value, and feed that to the UVs (st) of the texture node. You can even have a second object as input and automatically get its bounds to calculate your fit range. If you want the projection to be on an arbitrary axis, you will have to do some extra maths to rotate P, project, and rotate back within VOPs, or, if it's easier, you can do it at the SOP level.
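A minimal Volume Wrangle sketch of that idea, projecting along Y with the -10 to 10 range from above (the texture path is a placeholder, and using luminance to drive density is just one choice):

// fit world-space X/Z into 0-1 UVs for a projection along the Y axis
float u = fit(@P.x, -10, 10, 0, 1);
float v = fit(@P.z, -10, 10, 0, 1);
vector col = texture("$HIP/tex/projection.jpg", u, v);   // placeholder path
@density *= luminance(col);

The same few steps translate directly to a fit/texture chain inside a Volume VOP or a volume shader.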

What is important to keep in mind is that a Volume VOP operates at the voxel level, so you will never get detail sharper than the voxel size. But once you have this working, you can easily transfer the same nodes/logic onto a volume shader, which operates on rendered samples, which means you can go as sharp as your texture. Of course, if you move your camera away from the projection axis, the texture will look blurred along that axis.

 

But then again, that's just one approach; there may be other ways that give you more control and better results.

caskal

@ch3 thanks for the info! About the rest field, how do I add that manually? The Rest SOP has nothing to do with it, right? I read that when you do a pyro sim the rest fields can be added by checking a box, but how do I add them manually in the SOP context, i.e. if I create my own VDB?

About the Volume VOP, I understand about half of it :D but I will keep playing with Houdini until I get there. Thanks again for the direction!

Cheers

ch3

I may be wrong about the rest volume, but can't you just manually make three volumes, one for each axis, and use a Volume Wrangle to populate the values like this?

@restX = @P.x;
@restY = @P.y;
@restZ = @P.z;

I believe this makes sense when you advect the rest fields together with density, so you have a reference to a "distorted" coordinate to drive noises with. Otherwise, using the above rest fields is the same as using world-space P in the shader (P transformed from screen space to world space).
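One practical note, as far as I know: a Volume Wrangle only writes into volumes that already exist, so restX/restY/restZ would need to be created first (e.g. with a Volume SOP matching the density field's resolution). A rough sketch of how the rest fields could then replace raw P in the texture lookup (same placeholder range and path as the earlier example):

// later, in a Volume Wrangle or shader: use the (possibly advected) rest
// position instead of raw P to look up the texture
float u = fit(@restX, -10, 10, 0, 1);
float v = fit(@restZ, -10, 10, 0, 1);
@density *= luminance(texture("$HIP/tex/projection.jpg", u, v));   // placeholder path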

caskal
On 14/7/2017 at 0:49 PM, ch3 said:

I may be wrong about the rest volume, but can't you just manually make three volumes, one for each axis, and use a Volume Wrangle to populate the values like this? [...]

Thank you! Will give it a shot!

