Dense cloud shader (wdas cloud hype)


sergio


Hi

 

I've been playing around with the WDAS cloud data set and I've noticed that we Houdini users (who are not lucky enough to have access to shaders developed by giant companies) lack an overall dense (almost pyroclastic) cloud shader that shows the distinct features of puffy cumulus clouds. After a thorough search of both the web and odforce, I've seen that a few people have inquired about the same subject, but it was inconclusive. I've decided to use this post as an idea dump, to possibly implement a new shader or gather your opinions on the subject.

 

During the search I realized that the distinct features of dense clouds are achievable either by using a very high number of internal light bounces or by faking them. Dark edges are the most prominent feature of dense clouds, since the other effects, such as transmittance and high forward scattering, are already well studied and implemented in shaders.

 

To assess the current state of volume rendering other than Mantra, I've done a couple of tests with several renderers in Maya and also tried the new Terragen 4.2. The most beautiful (simulating real cloud lighting) is the Hyperion render. Terragen 4 clouds are also very realistic and detailed (http://terragen4.com/first-look/). Arnold seems to be the best commercial option out there and is very good at the edge darkening, but fails at details in dark areas (which are highly visible in Hyperion); I think those areas can be compensated for with additional lights. Redshift is blazing fast but nowhere near the realism.

 

In Houdini I have used PBR with a volume limit of 50, 5x5 samples, and volume quality at 2 with a standard volume cloud shader. I had just started to see dark edges, but even rendering a very small portion of a 1920x1080 image took me about 10 minutes (2x E5-2680 v3, 48 cores total). For speed comparison, Redshift was the fastest at a couple of minutes, Arnold took about 10 minutes, and Terragen is said to be around an hour for a very heavy cumulus cloud. No information about Hyperion, but since WDAS has a 50,000-core farm it shouldn't be a problem. Mantra was the worst, with a projected total render time of a couple of hours.

 

Below are the sources I found, for my own future reference.

 

 

With the directions I gathered from Mike Lyndon's post, I have started implementing the photon map approach. I have already implemented a Henyey-Greenstein phase function and added the contribution from radiance photons. Next I will try to implement the ideas presented in Antoine Bouthors' thesis, with a focus on cumulus cloud rendering (ch. 7).
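For reference, the Henyey-Greenstein phase function is simple enough to sketch in a few lines. This is an illustrative Python mock-up of the math (the real shader would be VEX, and the g values here are just examples):

```python
import math

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function.
    g in (-1, 1): g > 0 favours forward scattering, g < 0 backward,
    g = 0 is isotropic. Normalized so it integrates to 1 over the sphere."""
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * math.pi * denom ** 1.5)

# Cloud droplets are strongly forward scattering (g around 0.8-0.9):
print(henyey_greenstein(1.0, 0.85))    # strong forward peak
print(henyey_greenstein(-1.0, 0.85))   # much weaker backward lobe
```

In the shader this weights each light's contribution by the cosine between the view ray and the light direction.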

Attached is the current state of the shader (cloud_shader.rar).

 

I am also interested in GLSL implementations for the viewport and cloud modelling tools, but I guess this post will be mostly about a SHOP-type implementation.

Anyway, here is a simple GLSL implementation, and I am open to any feedback:

https://forums.odforce.net/topic/32197-glsl-pipeline-in-houdini/?tab=comments#comment-190945

 

I am definitely not an expert on this subject, so all input, ideas, criticism, and sources are welcome. Thank you.

 

Hyperion Render

 

wdas_cloud_hyperion_render.png

(wdas_cloud_hyperion_render.png is Copyright 2017 Disney Enterprises, Inc. and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License.)

 

Terragen 4 Render

terragen_4_render.tif

 

 

Houdini PBR Render (the little part on top took 10 minutes to render, so I left it unfinished)

pbr.jpeg

 

Arnold Render

arnold.jpeg

 

Redshift Render

redshift.jpg

Edited by sergio

To get dark edges you need forward scattering in the shader and many bounces of light. Even with 8 bounces you can see the effect of dark edges. The more bounces, the brighter it gets, because it is more likely that the ray bounces back toward your eye. From what I can see the math is already there and working, but it is a bit slow.
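The point that more bounces make the cloud brighter, because energy eventually finds its way out, can be illustrated with a toy Monte Carlo random walk in a homogeneous slab. This is a hypothetical Python demo (isotropic scattering and made-up constants, nothing like a production shader):

```python
import math, random

def escaped_fraction(max_bounces, sigma_t=4.0, albedo=0.99, slab=1.0, n=20000):
    """Fraction of photons that leave a homogeneous slab within a given
    bounce budget. Isotropic scattering keeps the demo short; a cloud
    shader would sample the forward-peaked HG lobe instead."""
    random.seed(0)                               # deterministic demo
    escaped = 0
    for _ in range(n):
        z, mu = 0.0, 1.0                         # enter at the top, heading in
        for _ in range(max_bounces):
            # sample a free-flight distance from Beer's law
            t = -math.log(1.0 - random.random()) / sigma_t
            z += mu * t
            if z < 0.0 or z > slab:              # left the slab: photon escapes
                escaped += 1
                break
            if random.random() > albedo:         # absorbed inside the cloud
                break
            mu = 2.0 * random.random() - 1.0     # new (isotropic) direction
    return escaped / n

for bounces in (1, 2, 8, 64):
    print(bounces, escaped_fraction(bounces))    # fraction grows with the budget
```

With a bounce limit of 1 almost nothing gets through the slab; raising the budget lets more and more energy escape, which is the brightening effect described above.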

Does Redshift support multiple volume bounces? Is it capable of simulating internal volume scattering?

Edited by MENOZ

I think that slowness is what makes micropolygon rendering attractive, but for a realistic result a very complex shader is required. I know multiple scattering comes for free with PBR, but at a cost in time.

 

And no, Redshift doesn't seem to have multiple bounces in volumes. I cranked up the GI bounces and tried the photon map, but it has no effect.


I have been trying to achieve the look, with the darkening, myself. I tried the method presented in this presentation on Horizon Zero Dawn, using only a directional light, because I hate long render times :P The image I have added takes around 20 seconds to render on a 1950X Threadripper.

More links

https://www.guerrilla-games.com/read/nubis-realtime-volumetric-cloudscapes-in-a-nutshell

 

My setup isn't perfect yet, and I will have to work on it some more. It gets the darkening, but when I blend it with the normal render, a lot of the detail gets lost. I can post the setup later, but I only tried to replicate what was in the Horizon Zero Dawn presentation. All the information I know is there.
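For anyone following along, the "powder" effect from the Nubis presentation is, as I understand it, Beer's law multiplied by an inverted exponential, which darkens the low-depth (edge) regions. A hedged Python sketch of that curve (constants as commonly quoted; the presentation may use slightly different ones):

```python
import math

def beer_powder(depth):
    """'Beer's-Powder' energy curve: plain Beer's law attenuation times a
    'powder' term that suppresses low optical depths, scaled by 2 so the
    curve peaks near 1. depth is optical depth along the light ray."""
    beer = math.exp(-depth)
    powder = 1.0 - math.exp(-2.0 * depth)
    return 2.0 * beer * powder

for d in (0.0, 0.25, 0.5, 1.0, 2.0, 4.0):
    print(f"depth {d:4.2f} -> energy {beer_powder(d):.3f}")
```

Plain Beer's law is brightest at zero depth; the powder term forces the energy to zero there instead, which is exactly the dark-edge look.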

 

 

Normal Render

cloud_01.PNG

Powder effect, mentioned in presentation.

cloud_powder_01.PNG

Edited by Heileif

Hi Eilef,

 

That image looks fantastic for a 20-second render. I've read that paper over and over and downloaded the full presentation. From what I can tell they have done many hacks to make the clouds look good. No idea why the edge darkening ("in-scatter", as the Pixar papers call it) makes details get lost; it should only be a probability multiplier.

 

I've recently read many light transport and Monte Carlo path tracing papers, and maybe path tracing is the only way, since it is the most physical approximation. This is the most recent paper from Pixar: Production Volume Rendering, SIGGRAPH 2017 course.

 

Anyway, I would love to see your setup. Thanks for your input.


Hi Eilef, 

 

I've taken a look at your file, and it seems that you are feeding the output of the volume core shader's BSDF back into itself with a ramp. I think what the Horizon Zero Dawn real-time clouds are about is that they do very few ray-marching samples and try to approximate everything with very few shader calls.
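To make the "very few shader calls" idea concrete, here is a hypothetical Python mock-up of a coarse front-to-back ray march with an early-out (step counts, sigma_t, and the density field are placeholders, not the actual setup from the file):

```python
import math

def march(density_fn, ro, rd, steps=16, step_len=0.25, sigma_t=2.0):
    """Coarse front-to-back ray march: few, large steps, as in the
    real-time approach. Returns (scattered energy, transmittance).
    density_fn maps a 3D point to a density value."""
    T, light = 1.0, 0.0
    for i in range(steps):
        p = [ro[k] + rd[k] * step_len * (i + 0.5) for k in range(3)]
        d = density_fn(p)
        if d <= 0.0:
            continue                            # skip empty space
        a = math.exp(-sigma_t * d * step_len)   # extinction over this step
        light += T * (1.0 - a)                  # energy scattered at this step
        T *= a
        if T < 1e-3:                            # early out once nearly opaque
            break
    return light, T

# Hypothetical spherical density field for testing:
sphere = lambda p: max(0.0, 1.0 - math.sqrt(sum(c * c for c in p)))
print(march(sphere, (0.0, 0.0, -2.0), (0.0, 0.0, 1.0)))
```

Sixteen samples per pixel is nothing compared to an offline ray march, which is the whole trick: every per-step quantity (light, phase, powder) has to be cheap.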

 

Being offline, we have much more time than the real-time guys (but not that much, apparently).

 

Anyway, here is my implementation of the real-time clouds. I've bumped up the sample count and there are no optimizations, but as a proof of concept it works.

dense_cloud_shader_v006.hipnc

 

TODO: take cone samples towards the light; modify the in_scatter contribution with depth and a vertical probability.
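For reference, the cone-sampling item in that TODO is described in the Nubis presentation as marching a handful of jittered samples inside a cone aimed at the sun and applying Beer's law to the accumulated optical depth. A hypothetical Python sketch (every parameter here is an assumption, not the presentation's numbers):

```python
import math, random

def light_march_cone(density_fn, p, light_dir, steps=6, step_len=0.3,
                     cone_spread=0.3, sigma_t=2.0):
    """Estimate transmittance towards the light with a few cone samples:
    each sample is jittered inside a cone that widens with distance,
    so one short march also captures some of the neighbourhood's density."""
    random.seed(1)                               # deterministic demo jitter
    optical_depth = 0.0
    for i in range(steps):
        t = step_len * (i + 1)
        # jitter the sample point; the jitter radius grows with distance
        q = [p[k] + light_dir[k] * t +
             (random.random() - 0.5) * cone_spread * t for k in range(3)]
        optical_depth += density_fn(q) * step_len
    return math.exp(-sigma_t * optical_depth)    # Beer's law transmittance
```

The widening cone is a cheap stand-in for the light's angular spread: a single straight march would miss density just off the axis, which the jittered samples pick up.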

 

Here is a test render. It just took a couple of seconds.

nubis_cloud.jpg


Here's a simple but costly setup. Density is used to modulate the phase function so that it's highly forward scattering (values over 0.9) where the cloud volume is thin, and slightly more diffuse (values closer to 0.7) where the cloud is denser. The scattering lobe changing its shape as the ray travels inside the cloud was one of the main observations made by Bouthors. My solution is a very simple mimic of the phenomenon, but it already does a lot.
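If I read the density-to-phase trick right, it could be as simple as a clamped linear blend of the HG anisotropy g. A Python sketch (the 0.9/0.7 endpoints are from the post above; d_max is an assumed density normalization):

```python
def phase_g_from_density(density, g_thin=0.9, g_dense=0.7, d_max=1.0):
    """Blend the HG anisotropy from strongly forward scattering at thin
    edges towards a more diffuse lobe in the dense core."""
    t = min(max(density / d_max, 0.0), 1.0)     # clamp to [0, 1]
    return g_thin + (g_dense - g_thin) * t

print(phase_g_from_density(0.0))   # thin edge: strongly forward
print(phase_g_from_density(1.0))   # dense core: more diffuse
```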

It has 64 orders of multiple scattering, which might be more than enough. It also uses a photon map to accelerate the light scattering. No Mie scattering LUT is used. Render time: 3.5 hours.

It's not identical to the Hyperion image but certainly has some nice features emerging. Some parts look even better IMO. I've also tone mapped the image with an ACES LUT and tweaked the exposure a bit to preserve the whites.

 

EDIT: Oh, by the way, I haven't checked how it looks when the sun is behind the cloud. There could be some extreme translucency that needs to be handled.

Cheers!

mantra_cloud_material.hip

wdas_cloud_mantra.JPG

Edited by Hazoc

Hi hazoc, 

Thank you for the hip file; now I know it is possible to achieve good results with Mantra. Sure, 3.5 hours for a single image of a single cloud seems too much for production, but I think recent improvements in denoising algorithms may come in handy. I've been leaning towards real-time solutions, but I guess there is no escape from reality: we need path tracing for dense volumes.

Actually, the developers of Hyperion also state that without a denoiser, render times are not suitable for a path tracer (WDAS ACM TOG paper: https://www.yiningkarlli.com/projects/hyperiondesign.html).

 

I have no access to an OptiX denoiser at the moment, but I have done a test with the COP denoiser. Here is your shader rendered with the sun coming from behind.

 

IPR render time: 1h 07m.

cloud.jpeg

  

 

Denoised with the COP denoiser; Cineon log with an S curve.

 

cloud_denoised0000.jpeg


  • 2 weeks later...
On 8/3/2018 at 11:26 AM, Hazoc said:


Damn! Looks great!


On 8/3/2018 at 1:49 PM, sergio said:


The denoise looks really good :)

 

But let's hope the NVIDIA OptiX denoiser comes to Mantra in the next version, and has been trained to denoise clouds! :P


  • 2 weeks later...

Interesting topic on shading dense clouds; the above renders look great.

I realized that in the latest Redshift build for H16.5 you had to unhide the volume type option, so I gave this a stab. A while ago I rendered some cloud patterns with Arnold and it was good as well. So I gave this a shot with the default Houdini cloud tool, with some tweaks mainly to the volume shader, and the render time was 40 seconds. I have attached the file for anyone interested.

cloud.png

cloud-rnd.hiplc


  • 3 years later...

Hey guys, I was looking through this thread and the files, but I can't for the life of me see what I'm doing wrong: I can't get Redshift to render. I set up my file the same way and nothing shows up. Is there a setting I need to change to get this to work?


  • 5 weeks later...
On 12/20/2021 at 5:39 AM, whks said:


Hey Eric, if you can upload your test file I can take a look. 

