Wolfwood Posted August 20, 2014 (Author)

Ahh, my Houdini was crashing constantly yesterday while trying to test the Disney mixer. I didn't take much note of the console errors though. I'll try again when I get a moment, or just wait for an update and then try again.

The first post has been updated with 1.1.1 versions of the OTL that work around the VEX bug.
Serg Posted August 21, 2014

"Calculation of albedo needs some thought. Currently the albedo returned is the normalization factor for the distribution function. While this matches how phong() and blinn() are set up, it should instead return the full reflectivity over the hemisphere, taking into account fresnel (and masking?)"

My guess is that masking shouldn't be part of it. My reasoning is that it represents shadowed light, and the absence of light shouldn't weight whether there is more or less diffuse.

In the meantime, I suppose we could do a white furnace render of a sphere for reference and come up with some approximation that fits the separate microfacet-less fresnel based on roughness, by raising/lowering it to some power plus a fit range, or some crazy hack like that.

I've been using this scene as part of my testing process, and following the rule of "everything is 100% reflective at glancing angle", you can see how the diffuse goes overly dark at glancing angles due to the lack of microfacet fresnel to weight it. I'll try the hack this weekend.
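A minimal VEX sketch of the complement Serg describes: weight the diffuse lobe by whatever energy a plain, microfacet-less fresnel leaves behind. The ior and diff_clr parameters, the eta convention, and the mirror specular() standing in for the GGX lobe are all illustrative assumptions, not the OTL's actual code:

    // Sketch: diffuse weighted by the complement of a plain dielectric
    // fresnel (no microfacet distribution involved).
    float kr, kt;
    vector nI = normalize(I);             // eye-to-surface direction
    vector nN = normalize(N);             // shading normal
    fresnel(nI, nN, 1.0 / ior, kr, kt);   // eta = 1/ior entering the surface (assumed convention)
    F = (1.0 - kr) * diff_clr * diffuse(nN)   // diffuse loses what fresnel reflects
      + kr * specular(reflect(nI, nN));       // mirror lobe standing in for GGX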
Wolfwood Posted August 22, 2014 (Author)

The Disney BSDF has an intentional remapping of the masking-shadowing roughness term that prevents full reflectivity at grazing angles, so I mimicked that. HOWEVER, just last week Brent Burley amended the implementation details to say that may not have been a good idea. I've already removed this restriction from our internal version and I'll push out a 1.2 shortly. It's a noticeable difference (for the better, IMHO).
jordibares Posted August 22, 2014

Looking forward to playing with the new 1.2 version. I'm swamped right now, but I want to test it properly.
Wolfwood Posted August 22, 2014 (Author)

Difference between the remapped roughness in v1.1 (top) and the unmapped version in v1.2 (bottom). Added the link to the first post.
jordibares Posted August 22, 2014

Awesome… thanks a lot Jim, the results of v1.2 are truly great.
eetu Posted August 23, 2014

I'm trying to wrap my head around the cvex_bsdf() functions by deciphering your stuff. While going through the Disney GTR2 eval and sample functions, I noticed you seem to have left out the 1/PI and 1/(alphaX*alphaY) scaling terms in the D component, as per eq. 13 in appendix B of the Disney paper. It looks like the pdf is divided by rho (which is alphaX*alphaY/2) in the sample function, but I would have expected to see the scaling happen in the eval value returned by the eval function.

I suppose this is because I don't quite grasp the interplay between the sample and eval functions yet, but I wanted to make sure it wasn't an oversight on your part. The output looks fine, so I don't expect it to be an error; I just want to understand everything.

PS. Your private message mailbox seems to be full and does not accept new messages.
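For reference, the distribution eetu is referring to — eq. 13 in appendix B of the Disney course notes, the anisotropic GTR2 — reads:

\[
D_{\mathrm{GTR2,aniso}}(h) \;=\; \frac{1}{\pi\,\alpha_x \alpha_y}\cdot\frac{1}{\left((h_x/\alpha_x)^2 + (h_y/\alpha_y)^2 + h_z^2\right)^{2}}
\]

where h is the half-vector in tangent space; the 1/(pi*alphaX*alphaY) prefactor is the scaling term in question.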
Wolfwood Posted August 23, 2014 (Author)

The interplay between eval/sample is a massive pain. Instead of having straightforward sample(), pdf() and eval() functions like in Arnold or PBRT, you have these interdependent variables. (For example, from what I can tell, the eval export in the sample function is used for indirect bounces while the eval in the eval shader is used for direct.)

The stock Houdini BSDFs (phong, blinn, etc.) use unnormalized distributions, then outside of the BSDF use 1/luminance(albedo) to normalize it. This has been the standard pattern I've seen in almost all shader setups, and it works if your BSDF is a straight distribution (D). It falls apart, though, when doing a full microfacet model, which is what my note in the known issues alludes to. Basically what I'd like to do is normalize the D inside the BSDF, then have the albedo function return exactly what it's supposed to: the average reflectance over the surface. But that would mean you shouldn't divide by luminance(albedo), as the BSDF would now be taking care of it. (I had it set up this way originally, but since it was a slight departure from the standard BSDFs I changed it to its current form, and it's been bugging me ever since.)

The pdf is divided by rho (aka albedo, aka refl) in the sample function because that's the way Mantra expects it: "note that this differs from the pdf produced by the evaluation function by a factor of refl".

As for the Pi factor, the PBR engine is taking care of this for you indirectly. (PDF values by definition should integrate to 1, not the 2*PI that Mantra expects, so there is definitely some trickery going on behind the scenes.) While Mantra's end result is the same, the implementation is a bit different from other renderers, so various factors need to be taken into account when porting from one to another. For example, the diffuse eval() function in PBRT just returns your diffuse reflectance divided by Pi. In Mantra's eval(), the Pi is left out and there is an additional cos() factor tossed in for good measure. I think this is because the light transport equation is Lo = Li * fr * cos(theta_wi), where fr is your BSDF. Mantra doesn't have that cos factor in its path-tracing code, so it's up to you to apply it inside the BSDF. That doesn't mean Mantra is wrong, since multiplication is associative, but it does make porting harder, as you have to understand the intent.

If you are inspecting the code to see how everything relates, microfacet_eval and microfacet_sample might be better places to start, as the distribution aspects have been abstracted out and the bulk of the code is just handling the expectations of Mantra.

Ultimately I could be wrong, and if there is a convincing argument for how something should be implemented I'm all ears. Effectively I had an initial explanation from Mario, the cvex_bsdf help, and then a lot of trial and error and comparisons to other renderers.
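To make those conventions concrete, here is a minimal sketch of a Lambert eval shader written to the rules Wolfwood describes: cos folded into eval, no 1/PI, and a pdf that integrates to 2*PI. The parameter list follows the cvex_bsdf documentation pattern, but treat the exact names and signature as an assumption rather than the OTL's actual code:

    // Sketch of a Lambert eval shader under Mantra's cvex_bsdf conventions.
    cvex lambert_eval(
        vector u;            // direction toward the viewer (per cvex_bsdf docs)
        vector v;            // direction toward the light
        int bounces;         // bounce-type mask requested by Mantra
        int reverse;         // tracing from the camera (0) or a light (1)
        export vector refl;  // albedo: average reflectance over the hemisphere
        export vector eval;  // reflectance, with the cos factor folded in
        export float pdf)    // sampling pdf; integrates to 2*PI, not 1
    {
        vector nn = {0, 0, 1};               // shading-space normal (assumption)
        float cosine = max(dot(nn, v), 0.0); // cos(theta_wi)
        refl = 1.0;                          // perfectly white Lambert surface
        eval = cosine;                       // note: no 1/PI, unlike PBRT
        pdf  = 2.0 * cosine;                 // Int[2*cos] over hemisphere == 2*PI
    }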
Serg Posted August 24, 2014

Had a crack at faking a GGX albedo output. It takes IOR and Roughness inputs. Here's a pair of balls: the one on the left is a v1.2 GGX microfacet reflecting a white env ball, whereas the one on the right is my crazy fudgetastic hack. It matches pretty closely across varying IOR and Roughness settings. Well... I expect it's a lot more right than complementing the diffuse lighting with the microfacet-less fresnel, as I'm currently doing.

OTL and hip; hardly any testing was done... GGX_Fake_Albedo.otl GGX_Fake_Albedo.hip

And I've just realized I matched against a BSDF that has a shadowing term, which I don't think was a good idea... Well, I guess I can now find out whether it is or not by rendering something with reflections!
ssh Posted August 26, 2014

"PDF values by definition should integrate to 1, not the 2*PI that Mantra expects, so there is definitely some trickery going on behind the scenes."

The 2*pi probably comes from the fact that Mantra expects probabilities expressed over hemispheres, as other render engines do. The surface area of a full sphere is 4*pi*r^2, so the surface area of the hemisphere above the shading point is half of that, 2*pi*r^2. Since it is the unit hemisphere, we can ignore r^2, so it is just 2*pi.
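Written out, the solid angle of the unit hemisphere is

\[
\int_{H^2} d\omega \;=\; \int_0^{2\pi}\!\int_0^{\pi/2} \sin\theta \, d\theta \, d\phi \;=\; 2\pi,
\]

so a pdf that is uniform over the hemisphere integrates to 2*pi under Mantra's convention rather than 1.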
Wolfwood Posted August 26, 2014 (Author)

"The 2*pi probably comes from the fact that Mantra expects probabilities expressed over hemispheres..."

Yulp yulp. I just meant that when you are solving for D()'s normalization factor, you need to solve for Int[D]==2*pi instead of Int[D]==1. (Papers generally use ==1, whereas Mantra uses ==2*pi.)
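As a worked example of how the two conventions shift the constant, take a Phong-style distribution D(θ) = c·cosⁿθ under the usual cosine-weighted microfacet normalization integral:

\[
\int_{H^2} c\,\cos^{n}\theta\,\cos\theta \, d\omega \;=\; 2\pi c \int_0^{\pi/2} \cos^{n+1}\theta\,\sin\theta \, d\theta \;=\; \frac{2\pi c}{n+2}.
\]

Setting this to 1 (the papers' convention) gives c = (n+2)/(2π); setting it to 2π (Mantra's convention) gives c = n+2, which is why constants in the OTL can look like they are missing a 1/π compared to the published formulas.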
thekenny Posted August 28, 2014

Jim, you are still wicked dangerous. Thanks for sharing! -k
stevegh Posted September 2, 2014

(ノಠ益ಠ)ノ彡┻━┻ Amazing!
ndeboar Posted September 5, 2014

Hey guys, I'm keen to implement GGX into the standard 'Physically Based Specular' node. The one thing I'm confused by is where the cvex bsdf lives:

    $f = cvex_bsdf(
        "oplib:/com.shadeops::Shop/ggx_eval::1.1?com.shadeops::Shop/ggx_eval::1.1",
        "oplib:/com.shadeops::Shop/ggx_sample::1.1?com.shadeops::Shop/ggx_sample::1.1",
        "label", $label,

If I were to add this to another node, how do I reference this path? I'm fairly new to the idea of cvex, so any extra info would be tops!

Thanks, Nick
www.nickdeboar.com
dennis.albus Posted September 5, 2014

The way I extracted the code is by first creating the cvex nodes with Python:

    hou.node("/shop").createNode("com.shadeops::ggx_eval::1.1.1")
    hou.node("/shop").createNode("com.shadeops::ggx_sample::1.1.1")

Then you copy the code from the code section of these nodes into two different files, ggx_eval.vfl and ggx_sample.vfl. You can clean up the code a bit by replacing the include parts with actual include statements:

    #include <pbr.h>

Then you compile the files with

    vcc ggx_eval.vfl
    vcc ggx_sample.vfl

and place the .vex files somewhere in a vex/CVex folder that is known by Houdini (i.e. you have set the relevant environment variables). Now you can simply call them via

    $f = cvex_bsdf("ggx_eval", "ggx_sample", "label", $label, [...])

This worked for me. If you have more questions about the process, don't hesitate to ask.
ndeboar Posted September 15, 2014

Super nice, thanks, I'll give it a go!
michael Posted September 15, 2014

Forgive the noob question... just going by the Houdini surface model, how would one implement things like emission, refraction, and all the other SSS parameters (scattering, phase, point cloud, etc.)? I've been using eetu's basic example and adding to it, but I'm getting stuck pretty fast...
ndeboar Posted October 8, 2014

Hey Dennis, quick question about precompiled CVEX shaders: what if a new version of Houdini/Mantra comes out? Would I need to recompile them? Trying to work out how this fits into the pipeline at work.
el_diablo Posted October 9, 2014

I'm just getting into more advanced VOPs/VEX shading and am wondering: is there a simple way to 'mix' BSDFs in PBR?
dennis.albus Posted October 10, 2014

@ndeboar: It should work without a problem as long as the specific functionality you are using is not changing (which would mean you'd have to change the code and recompile anyway). As VEX is compiled to bytecode and interpreted at runtime, it is much more forgiving in that regard. As for a new version of Houdini/Mantra, you probably have separate repositories for your major versions anyway, so I don't see a problem there.

@el_diablo: You can use the Mix VOP to blend BSDFs. (I feel like this answer is too obvious and you might mean something else.)
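In VEX terms, the Mix VOP's blend is effectively a lerp: scaling and adding the two bsdfs, which the bsdf type supports directly. A minimal sketch, reusing the compiled shader names from dennis's post above (the blend amount is an illustrative assumption):

    // Sketch: blending two bsdfs in VEX, effectively what the Mix VOP does.
    float blend = 0.3;                             // illustrative mix amount
    bsdf a = diffuse(normalize(N));                // stock Lambert lobe
    bsdf b = cvex_bsdf("ggx_eval", "ggx_sample");  // compiled GGX from above
    F = (1.0 - blend) * a + blend * b;             // linear mix of the lobes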