sergio Posted July 18, 2018

Hi, I've been playing around with the WDAS cloud data set and I've noticed that we Houdini users (who are not lucky enough to have access to shaders developed by giant companies) lack an overall dense (almost pyroclastic) cloud shader that shows the distinct features of puffy cumulus clouds. After a thorough search of both the web and odforce I've seen that a few people have asked about the same subject, but it was inconclusive. I've decided to use this post as an idea dump, to possibly implement a new shader and to gather your opinions on the subject.

During that search I realized that the distinct features of dense clouds can be achieved either by using a very high number of internal light bounces or by faking them. Dark edges are the most prominent feature of dense clouds, since the other effects, such as transmittance and strong forward scattering, are already well studied and implemented in shaders.

To assess the current state of volume rendering outside Mantra, I did a couple of tests with several renderers in Maya and also tried the new Terragen 4.2. The most beautiful (closest to real cloud lighting) is the Hyperion render. Terragen 4 clouds are also very realistic and detailed (http://terragen4.com/first-look/). Arnold seems to be the best commercial option out there and is very good at the edge darkening, but it fails at detail in dark areas (which is highly visible in Hyperion); I think those areas could be compensated for with additional lights. Redshift is blazing fast but nowhere near the same realism. In Houdini I used PBR with a volume limit of 50, 5x5 samples, volume quality at 2, and a standard volume cloud shader. I had just started to see dark edges, but rendering even a very small portion of a 1920x1080 image took me about 10 minutes (2x E5-2680 v3, 48 threads total). For speed comparison, Redshift was the fastest at a couple of minutes, Arnold took about 10 minutes, and Terragen is said to take around an hour for a very heavy cumulus cloud. No information about Hyperion, but since WDAS has a 50,000-core farm it shouldn't be a problem. Mantra was the worst, with a projected total render time of a couple of hours.

Below are the sources I found, for future reference:
- Oliver also asked about a cloud shader, and Mike Lyndon says he implemented one before (https://forums.odforce.net/topic/17900-cloud-shader/). This is the most relevant thread and the approach I will be implementing.
- The thesis of Antoine Bouthors that Mike says he implemented (http://evasion.imag.fr/~Antoine.Bouthors/research/phd/thesis/thesis.pdf).
- A beautiful Mie scattering solution by Matt, and the base for my shader (http://mattebb.com/weblog/rendering-clouds-with-the-mie-phase-function/).
- Odforce user Andrea was also interested in this topic and has some ideas on it (https://forums.odforce.net/topic/24831-cloud-shader/).
- The modelling side of clouds (https://forums.odforce.net/topic/12923-pyroclastic-noise-demystified/?page=2).
- A SIGGRAPH presentation by Magnus Wrenninge (http://magnuswrenninge.com/content/pubs/VolumetricMethodsInVisualEffects2010.pdf).
- Real-time clouds for Horizon Zero Dawn by Guerrilla Games; real-time, but with some really nice ideas (Horizon Zero Dawn real-time clouds).
- A hefty paper from Hyperion developer Yining Karl Li (https://blog.yiningkarlli.com/2017/07/spectral-and-decomposition-tracking.html).

With the directions I gathered from Mike Lyndon's post I have started implementing the photon map approach.
I have already implemented a Henyey-Greenstein phase function and added the contribution from radiance photons. Next I will try to implement the ideas presented in Antoine's thesis, with a strong focus on cumulus cloud rendering (ch. 7). Attached is the current state of the shader (cloud_shader.rar).

I am also interested in GLSL implementations for the viewport and in cloud modelling tools, but I guess this post will mostly be about a SHOP-type implementation. Anyway, here is a simple GLSL implementation of mine, and I am open to any feedback: https://forums.odforce.net/topic/32197-glsl-pipeline-in-houdini/?tab=comments#comment-190945

I am definitely not an expert on this subject, so all input, ideas, criticism, and sources are welcome. Thank you.

Hyperion render (wdas_cloud_hyperion_render.png is Copyright 2017 Disney Enterprises, Inc. and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License)
Terragen 4 render: terragen_4_render.tif
Houdini PBR render (the little part on top took 10 minutes to render, so I left it unfinished)
Arnold render
Redshift render
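For reference, here is a minimal VEX sketch of the Henyey-Greenstein phase function mentioned above. It is not taken from the attached cloud_shader.rar; the function name and the commented usage line are illustrative.

```
#include <math.h>

// Henyey-Greenstein phase function for anisotropy g in (-1, 1):
// p(cos_theta) = (1 - g^2) / (4*pi * (1 + g^2 - 2*g*cos_theta)^(3/2))
float hg_phase(float cos_theta; float g)
{
    float g2    = g * g;
    float denom = 1.0 + g2 - 2.0 * g * cos_theta;
    return (1.0 - g2) / (4.0 * M_PI * pow(denom, 1.5));
}

// Usage sketch: cos_theta is the cosine of the angle between the direction
// the light is travelling and the direction from the sample towards the eye
// (both unit vectors). g > 0 gives forward scattering.
// float weight = hg_phase(dot(light_dir, to_eye_dir), 0.8);
```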
sergio Posted July 18, 2018

Just saw this: https://twitter.com/Farmfield/status/1016104271610302464?s=19 I know it's just hype, but it's nice shader practice for me.
MENOZ Posted July 18, 2018

To get dark edges you need forward scattering in the shader and many bounces of light. Even with 8 bounces you can see the dark-edge effect; the more bounces, the brighter it gets, because it becomes more likely that a ray bounces back towards the eye. From what I can see the math is already there and working, but it is a bit slow. Does Redshift support multiple volume bounces? Is it capable of simulating internal volume scattering?
sergio Posted July 18, 2018

I think that slowness is what makes micropolygon rendering attractive, but for a realistic result a very complex shader is required. I know multiple scattering comes for free with PBR, but at a cost in time. And no, Redshift doesn't seem to support multiple bounces in volumes; I cranked up the GI bounces and tried the photon map, but it had no effect.
Heileif Posted July 23, 2018

I have been trying to achieve the darkened look myself. I tried the method presented in the Horizon Zero Dawn presentation, using only a directional light, because I hate long render times. The image I have added takes around 20 seconds to render on a 1950X Threadripper.

More links: https://www.guerrilla-games.com/read/nubis-realtime-volumetric-cloudscapes-in-a-nutshell

My setup isn't perfect yet and I will have to work on it some more, but it does get the darkening. However, when I blend it with the normal render, a lot of the detail gets lost. I can post the setup later, but I only tried to replicate what was in the Horizon Zero Dawn presentation; all the information I know is there.

Normal render
Powder effect, as mentioned in the presentation.
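For anyone following along, here is a small VEX sketch of the "Beer's law plus powder" term from the Nubis slides that the powder image above refers to. Treat it as an assumption-heavy sketch of the idea, not Heileif's actual setup.

```
// Approximate light energy from the Nubis / Horizon Zero Dawn talk:
// plain Beer-Lambert extinction multiplied by a "powder" term that
// darkens thin, front-lit edges. d is the optical depth accumulated
// towards the sun (density integrated along the light direction).
float beer_powder(float d)
{
    float beer   = exp(-d);              // classic extinction
    float powder = 1.0 - exp(-2.0 * d);  // low near thin edges, ~1 deep inside
    return 2.0 * beer * powder;
}
```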
sergio Posted July 23, 2018

Hi Eilef,

That image looks fantastic for a 20-second render. I've read that paper over and over and downloaded the full presentation. From what I can tell they have done many hacks to make the clouds look good. No idea why the edge darkening ("in-scatter", as the Pixar papers call it) makes details get lost; it should only be a probability multiplier. I've recently read many light transport and Monte Carlo path tracing papers, and maybe path tracing is the only way, since it is the most physically grounded approximation. This is the most recent paper from Pixar: Production Volume Rendering, SIGGRAPH 2017 course.

Anyway, I would love to see your setup. Thanks for your input.
Heileif Posted July 25, 2018

Sorry for the late reply. I have made a cleaned-up file for you; all the interesting stuff happens in the shader. Cloud_Shader_01_v016.hiplc
sergio Posted July 26, 2018

Hi Eilef,

I've taken a look at your file, and it seems you are feeding the output of the volume core shader's BSDF back into itself through a ramp. I think the point of the Horizon Zero Dawn real-time clouds is that they do very little ray-march sampling and try to approximate everything with very few shader calls. Being offline, we have much more time than the real-time guys (but not that much, apparently). Anyway, here is my implementation of the real-time clouds. I've bumped up the sample count and there are no optimizations, but as a proof of concept it works. dense_cloud_shader_v006.hipnc

TODO:
- Take cone samples towards the light.
- Modify the in_scatter contribution with depth and vertical probability.

Here is a test render; it took just a couple of seconds.
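Since the first TODO above is cone sampling towards the light, here is a hedged Volume Wrangle sketch of that idea at SOP level (not the shader inside dense_cloud_shader_v006.hipnc): it marches a few jittered samples inside a cone towards the sun and bakes a Beer-Lambert transmittance. The channel names (sundir, stepsize, conewidth, extinction) and the "sunlight" output volume are assumptions.

```
// Volume Wrangle run over a "density" volume. Assumes the input also
// contains an empty float volume named "sunlight" to receive the result.
vector sundir    = normalize(chv("sundir"));  // direction towards the sun
float  stepsize  = chf("stepsize");           // march step, e.g. a few voxels
float  conewidth = chf("conewidth");          // jitter radius per step
float  sigma_t   = chf("extinction");         // extinction coefficient

float depth = 0.0;
for (int i = 1; i <= 6; i++)
{
    // jitter the sample inside a cone that widens with distance
    vector jitter = (rand(@P * 131.07 + i) - 0.5) * 2.0 * conewidth * i;
    vector pos    = @P + sundir * stepsize * i + jitter;
    depth += volumesample(0, "density", pos) * stepsize;
}

f@sunlight = exp(-sigma_t * depth);  // Beer-Lambert transmittance
```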
sergio Posted July 26, 2018

Update:
- Added cone sampling towards the light
- in_scatter now uses depth and vertical probability
- Added sky color

dense_cloud_shader_v007.hipnc
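As a companion to the update above, here is a rough VEX sketch of how the in_scatter term can be shaped by a depth probability and a vertical probability, in the spirit of the Nubis slides. The exponents and the low-resolution density input are illustrative guesses, not the values used in dense_cloud_shader_v007.hipnc.

```
// In-scatter probability: attenuate the scattered light where there is
// little cloud around the sample (depth probability) and near the bottom
// of the cloud (vertical probability).
// lowres_density: density sampled at a coarse level of detail, 0..1
// height01:       normalized height of the sample within the cloud, 0..1
float in_scatter_probability(float lowres_density; float height01)
{
    float depth_prob    = 0.05 + pow(clamp(lowres_density, 0.0, 1.0), 0.8);
    float vertical_prob = pow(clamp(height01, 0.0, 1.0), 0.7);
    return clamp(depth_prob * vertical_prob, 0.0, 1.0);
}
```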
Hazoc Posted August 3, 2018

Here's a simple but costly setup. Density is used to modulate the phase function so that it's highly forward scattering (values over 0.9) where the cloud volume is thin, and slightly more diffuse (values closer to 0.7) where the cloud is denser. The scattering lobe changing its shape as the ray travels inside the cloud was one of the main observations made by Bouthors. My solution is a very simple mimic of the phenomenon, but it already does a lot. It uses 64 orders of multiple scattering, which might be more than enough, and a photon map to accelerate the light scattering. No Mie scattering LUT is used. Render time: 3.5 hours. It's not identical to the Hyperion image, but it certainly has some nice features emerging; some parts look even better, IMO. I've also tone-mapped the image with an ACES LUT and tweaked the exposure a bit to preserve the whites.

EDIT: By the way, I haven't checked how it looks when the sun is behind the cloud; there could be some extreme translucency that needs to be handled. Cheers! mantra_cloud_material.hip
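A minimal VEX sketch of the density-driven anisotropy described above, assuming a simple linear blend of the Henyey-Greenstein g between roughly 0.9 (thin) and 0.7 (dense). The normalization by a user-chosen maximum density is an assumption, not necessarily how mantra_cloud_material.hip does it.

```
// Map local density to the phase-function anisotropy g: thin regions get a
// strongly forward-scattering lobe, dense regions a more diffuse one.
float density_to_g(float dens; float dens_max)
{
    float t = clamp(dens / max(dens_max, 1e-6), 0.0, 1.0);
    return lerp(0.9, 0.7, t);  // g fed to the Henyey-Greenstein lobe
}
```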
sergio Posted August 3, 2018

Hi Hazoc,

Thank you for the hip file; now I know it is possible to achieve good results with Mantra. Sure, 3.5 hours for a single image of a single cloud looks like too much for production, but I think recent improvements in denoising algorithms may come in handy. I've been leaning towards real-time solutions, but I guess there is no escape from reality: we need path tracing for dense volumes. Actually, the Hyperion developers also state that without a denoiser the render times are not suitable for a path tracer (WDAS ACM TOG paper: https://www.yiningkarlli.com/projects/hyperiondesign.html).

I have no access to an OptiX denoiser at the moment, but I have done a test with the COP denoiser. Here is your shader rendered with the sun coming from behind. IPR render time: 1h 07m. Denoised with the COP denoiser, cineon log with an S curve.
Heileif Posted August 14, 2018

On 8/3/2018 at 11:26 AM, Hazoc said: [quoting the setup post above]

DAMN! LOOKS GREAT!
Heileif Posted August 16, 2018

On 8/3/2018 at 1:49 PM, sergio said: [quoting the denoising post above]

The denoise looks really good. Let's hope the NVIDIA OptiX denoiser comes to Mantra in the next version, and that it has been trained to denoise clouds!
pxptpw Posted August 28, 2018

Interesting topic on shading dense clouds; the renders above look great. I realized that in the latest Redshift build for H16.5 you have to unhide the volume type option, so I gave that a stab. A while ago I also rendered some cloud setups with Arnold and they were good as well. So I gave this a shot with the default Houdini cloud tool plus some tweaks, mainly to the volume shader; render time was 40 seconds. The file is attached for anyone interested: cloud-rnd.hiplc
whks Posted December 20, 2021

Hey guys, I was looking through this thread and the files, but I can't for the life of me see what I'm doing wrong that stops Redshift from rendering. I set up my file the same way and nothing shows up. Is there a setting I need to enable to get this to work?
sergio Posted January 18, 2022

On 12/20/2021 at 5:39 AM, whks said: [quoting the Redshift question above]

Hey Eric, if you upload your test file I can take a look.