
BSDF Bonanza: GGX, Microfacets, Disney BRDF and More


Wolfwood


Ahh, my Houdini was crashing constantly yesterday while trying to test the Disney mixer. I didn't take much note of the console errors though.. I will try again when I get a moment, or just wait for an update and then try again :)

 

First post has been updated with 1.1.1 versions of the OTL that work around the VEX bug.


"Calculation of albedo needs some thought.  Currently the albedo returned is the normalization factor for the distribution function.  While this matches how phong() and blinn() are setup, it should instead return the full reflectivity over the hemisphere taking into account frensnel (and masking?)"

 

My guess is masking shouldn't be part of it. My reasoning is that masking represents shadowed light, and the absence of light shouldn't weight whether there is more or less diffuse?

In the meantime I suppose we could do a white furnace render of a sphere for reference and come up with some approximation that fits a separate microfacet-less Fresnel based on roughness, by raising/lowering it to some power plus a fit range, or some crazy hack like that :)

 

I've been using this scene as part of my testing process, and following the rule of "everything is 100% reflective at glancing angles", you can see how the diffuse goes overly dark at glancing angles due to the lack of microfacet Fresnel to weight it. I'll try the hack this weekend.

 

[attached image: glancing-angle test render]

 

 


The Disney BSDF has an intentional remapping of the masking-shadowing roughness term that prevents full reflectivity at grazing angles, so I mimicked that. HOWEVER, just last week Bruce amended the implementation details to say that may not have been a good idea. I've already removed this restriction from our internal version and I'll push out a 1.2 shortly.

:D It's a noticeable difference. (For the better, IMHO.)


I'm trying to wrap my head around the cvex_bsdf() functions by deciphering your stuff.

 

While going through the Disney GTR2 eval and sample functions, I noticed you seem to have left out the 1/PI and 1/(alphaX*alphaY) scaling terms in the D component, as per Eq. 13 in Appendix B of the Disney paper.

 

Looks like the pdf is divided by rho (which is alphaX*alphaY/2) in the sample function, but I would have expected to see the scaling happen in the eval value returned by the eval function. I suppose this is because I don't quite yet grasp the interplay between the sample and eval functions, but I wanted to make sure it wasn't an oversight on your part :)

The output looks fine so I don't expect it to be an error, I just want to understand everything..

 

 

PS. Your private message mailbox seems to be full and does not accept new messages.

 


The interplay between eval/sample is a massive pain; instead of having straightforward sample(), pdf() and eval() functions like in Arnold or PBRT, you have these interdependent exports. (For example, from what I can tell the eval export in the sample function is used for indirect bounces, while the eval in the eval shader is used for direct.)
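To make that concrete, here is a rough sketch of what a cosine-weighted diffuse sample shader might look like under those conventions. The parameter and export names are assumptions based on my reading of the cvex_bsdf help (as is the bouncemask() helper), so double check them against the docs:

#include <math.h>

// Hypothetical cosine-weighted diffuse sample shader (a sketch, not the
// shipped GGX code). All bindings below are assumed from the cvex_bsdf help.
cvex lambert_sample(
    vector u = 0;               // incoming direction (unused for Lambert)
    float sx = 0;               // uniform random samples in [0, 1)
    float sy = 0;
    int bounces = 0;
    export vector refl = 0;     // reflectivity
    export vector v = 0;        // sampled outgoing direction
    export int bouncetype = 0;  // label of the sampled bounce
    export float pdf = 0;
    export vector eval = 0;     // used for the indirect bounces, per above
    vector nml = {0, 0, 1})     // shading normal, passed as a shader argument
{
    // Cosine-weighted sample in a local frame where +z is the normal.
    float z   = sqrt(1.0 - sx);                 // z == cos(theta)
    float r   = sqrt(sx);
    float phi = 2.0 * M_PI * sy;
    vector dir = set(r * cos(phi), r * sin(phi), z);

    v    = dir * dihedral(set(0.0, 0.0, 1.0), nml); // rotate into the shading frame
    refl = 1.0;
    eval = z;               // cosine folded in, no 1/PI (see the Pi discussion below)
    pdf  = 2.0 * z;         // integrates to 2*PI over the hemisphere; with
                            // refl == 1 the factor-of-refl difference vanishes
    bouncetype = bouncemask("diffuse");         // assumed helper; check pbr.h
}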

 

The stock Houdini BSDFs (phong, blinn, etc.) use unnormalized distributions, then outside of the BSDF use 1/luminance(albedo) to normalize it. This has been the standard pattern I've seen in almost all shader setups, and it works if your BSDF is a straight distribution (D). It kind of falls apart though when doing a full microfacet model, which is what my note in the known issues alludes to. Basically what I'd like to do is normalize the D inside the BSDF, then have the albedo function return exactly what it's supposed to: the average reflectance over the surface. But that would mean you shouldn't divide by luminance(albedo), as the BSDF would now be taking care of it. (I had it set up this way originally, but since it was a slight departure from the standard BSDFs I changed it to its current form, and it's been bugging me ever since.)
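For illustration, the standard pattern looks something like this in a plain surface shader (a minimal sketch with made-up names and parameter values):

surface naive_spec(float exponent = 50.0)
{
    vector nf = normalize(frontface(N, I));
    bsdf spec = phong(nf, exponent);            // unnormalized stock distribution
    // The normalization happens outside the BSDF, as described above.
    F = spec * (1.0 / luminance(albedo(spec)));
}

What I'd prefer is to bake that factor into the BSDF itself, so albedo() can return the true average reflectance.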

 

The pdf is divided by rho (aka albedo aka refl) in the sample function because that's the way mantra expects it.   "note that this differs from the pdf produced by the evaluation function by a factor of refl"

 

As for the Pi factor, the PBR engine is taking care of this for you indirectly. (PDF values by definition should integrate to 1, not the 2*PI which Mantra expects, so there is definitely some trickery going on behind the scenes.) While Mantra's end result is the same, the implementation is a bit different from other renderers, so various factors need to be taken into account when porting from one to another. For example, the diffuse eval() function in PBRT just returns your diffuse reflectance divided by Pi. In Mantra's eval(), the Pi is left out and there is an additional cos() factor tossed in for good measure. I think this is because the light transport equation is Lo = Li * fr * cos(theta_wi), where fr is your BSDF. Mantra doesn't have that cos factor in its pathtracing code, so it's up to you to do it inside the BSDF. It doesn't mean Mantra is wrong, since multiplication is associative, but it does make porting harder as you have to understand the intent.
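Put together, a diffuse eval shader under Mantra's conventions might look roughly like this (again, the bindings are my assumptions from the cvex_bsdf help, not gospel):

// Hypothetical diffuse eval shader illustrating Mantra's conventions.
cvex lambert_eval(
    vector u = 0;            // incoming direction (assumed binding)
    vector v = 0;            // outgoing direction (assumed binding)
    int bounces = 0;
    int reverse = 0;
    export vector refl = 0;  // reflectivity over the hemisphere
    export vector eval = 0;  // the BSDF value itself
    export float pdf = 0;
    vector nml = {0, 0, 1})  // shading normal, passed as a shader argument
{
    float cos_theta = max(dot(normalize(v), normalize(nml)), 0.0);
    refl = 1.0;
    eval = cos_theta;        // PBRT would return 1/PI here; Mantra leaves Pi
                             // to the engine and wants the cosine folded in
    pdf  = 2.0 * cos_theta;  // cosine pdf scaled so Int[pdf] == 2*PI
}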

 

If you are inspecting the code to see how everything relates, microfacet_eval and microfacet_sample might be better places to look, as the distribution aspects have been abstracted out and the bulk of the code is just handling the expectations of Mantra.

 

 

Ultimately I could be wrong, and if there is a convincing argument as to how something should be implemented, I'm all ears. Effectively I had an initial explanation from Mario, the cvex_bsdf help, and then a lot of trial and error and comparisons to other renderers. :P


Had a crack at faking a GGX albedo output. It takes IOR and Roughness inputs.

 

Here's a pair of balls: the one on the left is 1.2 GGX microfacet reflecting a white env ball, whereas the one on the right is my crazy fudgetastic hack. Matches pretty closely with varying IOR and Roughness settings. Well... I expect it's a lot more right than complementing diffuse lighting with the microfacet-less Fresnel as I'm currently doing.

 

[attached image: GGX microfacet vs. fake albedo comparison]

 

OTL and hip attached; hardly any testing was done...

 

GGX_Fake_Albedo.otl

GGX_Fake_Albedo.hip

 

And I've just realized I matched against a BSDF that has a shadowing term, which I don't think was a good idea... Well I guess I can now find out whether it is or not by rendering something with reflections! :D


"PDF values by definition should integrate to 1, not the 2*PI which Mantra expects, so there is definitely some trickery going on behind the scenes."

 

2*pi probably comes from the fact that Mantra, like other render engines, expects probabilities expressed over the hemisphere.

The surface area of a full sphere is 4*pi*r^2, and the surface area of the hemisphere above the shading point is half of that, 2*pi*r^2.

Since it is the unit hemisphere, r^2 = 1, so it is just 2*pi.
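You get the same 2*pi by integrating solid angle directly over the unit hemisphere:

Int[hemisphere] dw = Int[phi: 0..2*pi] Int[theta: 0..pi/2] sin(theta) dtheta dphi = 2*pi * 1 = 2*pi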


"2*pi probably comes from the fact that Mantra, like other render engines, expects probabilities expressed over the hemisphere [...] so it is just 2*pi."

 

Yulp yulp. I just meant that when you're solving for D()'s normalization factor you need to solve for Int[D] == 2*pi instead of Int[D] == 1. (Papers generally use == 1, whereas Mantra wants == 2*pi.)
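In other words, if a paper derives a normalization constant c such that Int[c*D] == 1, the Mantra-side constant is just 2*pi times that:

Int[c*D] == 1   =>   Int[(2*pi*c)*D] == 2*pi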


Hey Guys,

 

I'm keen to implement GGX into the standard 'Physically Based Specular' node.

 

The one thing I'm confused by is where the cvex bsdf lives:

$f = cvex_bsdf(
                   "oplib:/com.shadeops::Shop/ggx_eval::1.1?com.shadeops::Shop/ggx_eval::1.1",
                   "oplib:/com.shadeops::Shop/ggx_sample::1.1?com.shadeops::Shop/ggx_sample::1.1",
                   "label", $label, [...] )

If I was to add this to another node, how do I reference this path? I'm fairly new to the idea of cvex, so any extra info would be tops!

Thanks,

 

Nick

 

www.nickdeboar.com



 

The way I extracted the code was by first creating the CVEX nodes with Python:

hou.node("/shop").createNode("com.shadeops::ggx_eval::1.1.1")
hou.node("/shop").createNode("com.shadeops::ggx_sample::1.1.1")

Then you copy the code from the code section of these nodes into two files, ggx_eval.vfl and ggx_sample.vfl. You can clean the code up a bit by replacing the inlined include parts with actual include statements:

#include <pbr.h>

Then you compile the files with:

vcc ggx_eval.vfl
vcc ggx_sample.vfl

and place the resulting .vex files in a vex/CVex folder that Houdini knows about (i.e. you have set the corresponding environment variables).

 

Now you can simply call them via

    $f = cvex_bsdf("ggx_eval","ggx_sample","label", $label, [...] )

This worked for me :)
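For instance, wrapped in a bare-bones surface shader (a sketch; the extra parameters the GGX shaders expect are omitted here):

surface ggx_wrapper(string label = "reflect")
{
    F = cvex_bsdf("ggx_eval", "ggx_sample", "label", label);
}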

 

If you have more questions about the process don't hesitate to ask.



forgive the noob question...

just going by the Houdini surface model...

how would one implement things like emission, refraction, and all the other SSS parameters (scattering, phase, point cloud, etc.)...

I've been using eetu's basic example and adding to it, but getting stuck pretty fast...



Hey Dennis,

 

Quick question about precompiled CVEX shaders: what if a new version of Houdini/Mantra comes out? Would I need to recompile them? Trying to work out how this fits into the pipeline at work.

 

It should work without a problem as long as the specific functionality you are using is not changing (which would mean you have to change the code and recompile anyway). As VEX is compiled to bytecode and interpreted at runtime, it is much more forgiving in that regard.

As for a new version of Houdini/Mantra, you probably have separate repositories for your major versions anyway, so I don't see a problem there.

 

I'm just getting into more advanced VOPs/VEX shading and am wondering: is there a simple way to 'mix' BSDFs in PBR?

You can use the Mix VOP to blend BSDFs. (I feel like this answer is too obvious and you might mean something else :huh: )
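For the VEX-curious, the same blend is just BSDF arithmetic (a quick sketch):

surface mix_example(float blend = 0.3)
{
    vector nf = normalize(frontface(N, I));
    // blend == 0 gives all diffuse, blend == 1 all specular
    F = (1.0 - blend) * diffuse(nf) + blend * phong(nf, 50.0);
}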

