dispersion bsdf


any ideas on how to approach (refractive) dispersion from a bsdf point of view (suitable for PBR, that is). will it help to compute dispersion the old way and simply convert the color to a bsdf and add that to the trace bsdf?

mario? jason? stu? symek?

cheers


Just guessing here, because I haven't tried any of this in PBR yet, but I would think that:

1. Given that the angular spread of the dispersion is meant to lie on the plane of incidence (in the transmission hemisphere), it would seem that you could use the specular() bsdf (over multiple samples in controlled directions, much like the multiple traces of the raytraced method). A broader bsdf would just blur transmission, which is fine, but you'd still need to repeat it over the same number of samples along the dispersive "fan" of directions.

2. If I'm right about #1, then you end up with N bsdf's 'F' in N refracted directions with N ior's (one for each wavelength 'wl'), so:

F(wl1,ior1) + F(wl2,ior2) + ... + F(wlN,iorN).

Which sounds reasonable, except that each F in the summation represents a whole spectrum (as an RGB sample), and *not* a monochromatic intensity, which is what you really want (i.e: you want the result of each F to be the "weight" at wavelength 'wl' after transmission, not the weight of all wavelengths encoded as an RGB triplet). So you need to somehow "reinterpret" each F as a monochromatic distribution (single intensity in this case) before you can do the reconstruction.

3. If I'm right about #2, then it would seem that the only way to force each F's contribution to be monochromatic would be to "block out" all wavelengths except the one you want -- i.e: a filter. But you're a little limited as to what you can do with F. Multiplying it by some RGB triplet which you deem to be representative of the target wavelength would seem to be the obvious answer... except that RGB is a truly awful model for defining a spectral filter... but given that you can't directly manipulate F as though it were a color (it isn't, so you can't go RGB->XYZ->Spectrum->Filter->RGB), then some crude RGB "tinting" would appear to be the only option here.

4. Finally, you have the issues of recursive tracing that you need to watch out for: you only want the very first instance of the dispersive shader for that surface (the first time the surface gets hit) to generate the dispersion "fan". All other instances should simply extend the initial monochromatic rays. This can get tricky and has nothing to do with the global ray levels -- you need a "recursion level" for a particular object|primitive+material pair, not a global counter. This is doable in raytracing via the new ray import/export mechanism (thank god for those!), but I'm not sure what you have available in PBR mode.
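Points 2 and 3 above can be sketched numerically (a toy Python illustration rather than VEX, and only my reading of the idea): build an RGB "filter" per wavelength sample, accumulate the filtered contributions, and normalize by the sum of the filters so that the filters themselves add no net tint. The HSV-based filter here is a stand-in, not a real spectral model.

```python
import colorsys

N = 15
VS_START, VS_END = 380.0, 780.0

def rgb_filter(wl):
    """Crude RGB stand-in for a monochromatic filter at wavelength wl (nm).
    Spectral hues span only 2/3 of the HSV hue circle (no purples)."""
    t = (wl - VS_START) / (VS_END - VS_START)
    return colorsys.hsv_to_rgb((1.0 - t) * 2.0 / 3.0, 1.0, 1.0)

filters = [rgb_filter(VS_START + i * (VS_END - VS_START) / (N - 1))
           for i in range(N)]
wsum = [sum(f[c] for f in filters) for c in range(3)]

# Reconstruct: each sample contributes its filtered value; dividing by wsum
# guarantees that if every sample returned the same spectrum, the result is
# that spectrum unchanged (no net tint from the filters themselves).
white = [sum(f[c] for f in filters) / wsum[c] for c in range(3)]
print(white)  # -> [1.0, 1.0, 1.0]
```

The normalization step is the part that keeps a "white" transmission white even though each per-wavelength filter is heavily colored.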

I'm home sick at the moment, so if any/all of the above sounds like feverish ramblings, I blame my strep throat

HTH.

> I'm home sick at the moment, so if any/all of the above sounds like feverish ramblings, I blame my strep throat

... like the insane rantings of a madman!

> ... like the insane rantings of a madman!

OMG! That explains the padded walls!

But they left me a tiny window...

... and I can see stars...

(a very faceted window with 25 samples per spectrum, that is)


Oh I truly can not wait for your technical evening Mario! :notworthy:

> Oh I truly can not wait for your technical evening Mario! :notworthy:

Thanks old_school! Just please remind Jenny to clear it with the staff here at the Asylum first -- they get really mad when I talk to strangers outside visiting hours.

In the meantime, here's a better shot of last night's moonrise (as seen through my little window).

15 spectral samples (jittered), smaller ior spread than the last one: 1.9->1.7 (linear, which is not realistic) mapped to the 380nm->780nm wavelength range, 8 reflective bounces, using combined Fourier and Gaussian spectral representations, with no absorption/extinction, and testing the sRGB color space (previous image used CIE-E space).
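For reference, the linear ior mapping described above can be written out as follows (a Python sketch of the mapping only, not the render code; real glasses follow a Cauchy/Sellmeier-style curve rather than a straight line, as noted):

```python
def ior_for_wavelength(wl, ior_short=1.9, ior_long=1.7,
                       wl_short=380.0, wl_long=780.0):
    """Linearly interpolate ior from the short-wavelength end (380nm)
    to the long-wavelength end (780nm), clamping outside the range."""
    t = (min(max(wl, wl_short), wl_long) - wl_short) / (wl_long - wl_short)
    return ior_short + t * (ior_long - ior_short)

print(ior_for_wavelength(380.0))  # -> 1.9
print(ior_for_wavelength(580.0))  # midpoint, ~1.8
```

Shorter wavelengths bend more (higher ior), which is what produces the familiar blue-to-red dispersion fan.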


mario, your presence is humbling. and frustrating. and inspiring. more power to you now to find out exactly what you're talking about..

> mario, your presence is humbling. and frustrating. and inspiring. more power to you now to find out exactly what you're talking about..

Hey anamous,

I really wasn't trying to be obscure in my suggestions, so apologies if it came off that way. It's hard sometimes (especially in forums and with people you haven't spoken with before) to know how much information is "common ground", so you end up being either tediously verbose, or sounding like a madman. Add to this the fact that I've coincidentally been doing quite a bit of research in this area of spectral representations lately, and I'm likely to unintentionally consider something "obvious" when it's nothing of the kind (even to me, a few months back).

But as old_school mentioned, I will be (again coincidentally) doing a talk on implementing spectral colors in VEX/Mantra (on the 30th) at SESI here in Toronto. I will post the slides and all relevant code up on the Exchange once it's done. These will cover everything from "what is RGB?" to some of the gobbledygook I was talking about in my madman post.

However. In regards to your original question, I still want to emphasize that I *haven't* tried this stuff in PBR yet, so everything is conjecture on my part at this point. I do however have an idea of the kinds of issues you'll need to solve in order for dispersion to work (whether in PBR or in any other engine), and these were the items I was counting off in my first post. Looking back at that list now, I realize that the order was probably wrong (not that I was trying to sort them in any order at the time, mind you): item number 4 should be first in the list. If, after your surface has been hit for the first time, you can't somehow broadcast to secondary hits that you're now tracing a specific wavelength with a specific ior... then that's pretty much a deal-breaker for PBR right there.

The other points had to do with the fact that dispersion is an exercise in breaking up light into as many tiny components (wavelengths) as you can, and then reassembling (reconstructing) them back into a form that the renderer and your monitor can understand (RGB). This whole color-related stuff is the part that ultimately lands you here at the Asylum with all the "happy people"

But before I go completely the opposite way of my first post and become tediously verbose, why don't you tell me which parts make you go WTF?!?!, and then we can talk about those.

Cheers.

> Hey anamous,
>
> I really wasn't trying to be obscure in my suggestions, so apologies if it came off that way. It's hard sometimes (especially in forums and with people you haven't spoken with before) to know how much information is "common ground", so you end up being either tediously verbose, or sounding like a madman. Add to this the fact that I've coincidentally been doing quite a bit of research in this area of spectral representations lately, and I'm likely to unintentionally consider something "obvious" when it's nothing of the kind (even to me, a few months back).

i hope you didn't misunderstand me. no way do you need to apologize, i was just describing in overly poetic words that i (and i'm sure not just me) admire your work and dedication.

> However. In regards to your original question, I still want to emphasize that I *haven't* tried this stuff in PBR yet, so everything is conjecture on my part at this point. I do however have an idea of the kinds of issues you'll need to solve in order for dispersion to work (whether in PBR or in any other engine), and these were the items I was counting off in my first post. Looking back at that list now, I realize that the order was probably wrong (not that I was trying to sort them in any order at the time, mind you): item number 4 should be first in the list. If, after your surface has been hit for the first time, you can't somehow broadcast to secondary hits that you're now tracing a specific wavelength with a specific ior... then that's pretty much a deal-breaker for PBR right there.

not being able to attach ray labels etc. to rays (in fact, not being able to access rays at all, just modifying BSDFs) seems to me to be one of the main issues with PBR right now. if at least we could define our own BSDFs, either by supplying some sort of reflection/transmission lobe description (i naively imagine a 3D representation with a NURBS surface) or by being allowed to have a PBR callback function that sets its own bsdf info... that would completely rock. couple that with a choice between RGB and internal spectral shading with a definable amount of spectral samples... man can dream.

> The other points had to do with the fact that dispersion is an exercise in breaking up light into as many tiny components (wavelengths) as you can, and then reassembling (reconstructing) them back into a form that the renderer and your monitor can understand (RGB). This whole color-related stuff is the part that ultimately lands you here at the Asylum with all the "happy people"

i anxiously await your slide show

cheers


Hey anamous,

I thought I'd give those whacky ideas of mine a try.

Barring all the problems I already mentioned, it works pretty much as I expected. Mind you, I still don't think PBR is a viable choice for this kind of thing (yet?), but just for the fun of it, here are the results of applying the steps I mentioned in the madman post:

```
#define CZERO {0,0,0}
#define CONE  {1,1,1}

#define VSstart 380     // First wavelength in our visible spectrum
#define VSend   780     // Last wavelength in our visible spectrum
#define VSlen   (VSend-VSstart+1)  // Length of our visible spectrum

// Fetch the ior for a given wavelength.
// A simplified version using a linear distribution. Real glasses have a
// gentle slope across the mid-to-low frequencies and ramp up quickly in
// the high frequencies.
float getior(float wl, ior_hi, ior_lo) {
    float wave = clamp(wl,VSstart,VSend);
    return fit(wave,VSstart,VSend,ior_hi,ior_lo);
}

// Fetch the rgb filter for a given wavelength.
// This hack uses HSV space to create the wavelength filters. Best we can
// do without a full wavelength model. Alternatively you could look up a ramp.
// No matter what though, you end up "filtering" bsdf's instead of integrating
// across all wavelengths, so it's always going to be a pale approximation.
vector getfilter(float wl) {
    // normalized wavelength range
    float wave = fit(clamp(wl,VSstart,VSend),VSstart,VSend,0,1);
    // hacky approximation to the hue spread of the color matching functions
    float hue  = pow(smooth(0.2,0.9,wave),0.8);
    // hacky approximation to photopic sensitivity (the Y curve)
    float val  = 0.01+0.99*(smooth(0.1,.3,wave)*(1.0-smooth(.95,1,wave)));
    // spectral hues don't include purples so we want 2/3 of normal HSV range
    return hsvtorgb((1.0-hue)*2.0/3.,1,val);
}

// As mentioned, due to the fact that we can't alert secondary instances
// to the fact that we may be computing a monochromatic ray, this puppy just
// blindly creates a full dispersion fan at every intersection.
surface pbr_dispersion (
    float  ior_mean   = 1.55;
    float  ior_spread = 0.2;  // total ior variation across the spectrum
    int    samples    = 15;
    float  Ktrans     = 1;
    float  Krefl      = 1;
)
{
    // assume surface normals are correct (no forced front-facing)
    vector wo   = -normalize(I);
    vector n    = normalize(N);

    // fresnel variables
    float kr,kt; vector wr,wt;

    // init loop vars
    float  wl   = VSstart;
    float  dwl  = (float)VSlen/(float)(samples-1);
    bsdf   Ft   = 0;
    bsdf   Fr   = 0;

    // loop over wavelengths accumulating filtered specular bsdf's
    vector w    = 0;     // will hold the filter (rgb weight) per wavelength
    vector wsum = 0;     // will accumulate all the rgb filter weights
    int    i;            // loop|sample counter
    for(i=0;i<samples;i++) {
        // ior decreases linearly from the short- to the long-wavelength end
        float ior = getior(wl, ior_mean+ior_spread*0.5, ior_mean-ior_spread*0.5);
        fresnel(-wo,n,1.0/ior,kr,kt,wr,wt);
        w = getfilter(wl);
        // transmit
        Ft += specular(wt)*kt*w;
        // reflect
        Fr += specular(wr)*kr*w;

        wsum += w;
        wl   += dwl;
    }

    vector iwsum = CONE / wsum;
    F  = Ft*iwsum*Ktrans + Fr*iwsum*Krefl;
}
```

The PBR result (the green bias is due to an inaccurate approximation to the hue distribution of the color matching functions -- it can be tweaked to match better):

The raytraced result, for reference. Similar settings to the PBR one with the same number of spectral samples, but actually operating at the wavelength level.

All images have had the standard (sRGB, CIE-E, etc) 2.2 gamma correction applied.
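For anyone reproducing this: the 2.2 "gamma correction" mentioned above is just a power curve applied to the linear-light values on output. (A simplification on my part: true sRGB encoding uses a piecewise curve, but 2.2 is the customary approximation.)

```python
def to_display(c, gamma=2.2):
    """Encode one linear-light channel value for display; clamp negatives
    (which can occur after an XYZ->RGB transform) before the power curve."""
    return max(c, 0.0) ** (1.0 / gamma)

print(to_display(0.0), to_display(1.0))  # endpoints are unchanged: 0.0 1.0
print(round(to_display(0.18), 3))        # mid-grey brightens noticeably
```

Note that this is for display only; the rendering math itself should stay in linear space.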

I suppose the good thing about having a PBR approximation is that you could presumably use it to generate caustics, though I haven't tried it yet.

Cheers!



oh mario, you just had to go and ruin the fun obviously, as ever, amazing stuff.

I had a go at this, and here's my result. however, instead of using my brain (or rather, using your brain) I ended up cheating my way through: simply sampling around the main eta and using a ramp texture as a "colorizer" for the samples. looks ok at first glance, nothing fancy. renders pretty quick, too, this one used 15 samples. of course, not being able to control the secondary rays means that from "glossy bounces > 1" on, everything underneath the glass starts to look really funky and takes ages to sample.

thank you so much for sharing. will try it out as soon as I get to it.


Hey! Cool!

Let me know if you have any luck with colorful caustics. I'm going to give that a whirl next. Given that we no longer have a photon shading context, it seems that I should try to make at least that part of it work in PBR... I think I'm starting to miss the photon context...

Cheers.


Bah. I don't think this will ever work in PBR without some means of tracking monochromatic bounces. Here's a nasty hack which allows dispersive fans only at global ray level (grl) 0 on the object itself, but up to grl 1 for photon generation. Those ugly big circles are the result of too many dispersive fans.

But I just noticed an auto-generated IFD boolean property: photon:photonshaders which is presumably set to true when objects use photon-context shaders instead of surface shaders for photon generation. Maybe there's still hope for ye olde photon context after all!


that looks so nice! look, i made disco lights, too on the left my hack, right your shader.

what do you mean by global ray level? i thought this doesn't exist in PBR... and what I don't get is why the photon context got canned, it was the best of its class. I hope it or something even more flexible resurfaces soon.


Yay! Disco balls!

Maybe I spoke too soon about the photon context... maybe...

It appears that the photon context itself is still alive (compiles and can be assigned to objects, etc). But they've either changed the way it works, or I'm doing something stupid. Here's what I'm seeing:

1. Pre-generated photon maps only render when using a PBR engine (unless you load them directly from your shader using photonmap(), I guess, but I haven't tried that yet). Valid photon maps (ones that show up in a PBR render) do not show up when rendering with a non-PBR engine. A PITA.

2. Assigning a photon shader (no surface shader) to an object and rendering with the photon engine will produce a photon map. However, I haven't yet been able to make this map "visible" -- tried a few things but the energies are way too low, and the actual photon locations are wrong too... it just seems to be doing the wrong thing no matter what I try.

3. Multiple-bounce dispersion in PBR is a bust because of the lack of "ray labels", yet we're forced to generate these photon maps in the PBR context. Trapped! ... but wait...

I accidentally left the ray-tracing version of the shader (no assignments to F) on my object while generating one of these photon maps. I expected a black map but I tried it anyway.... the map was valid and lacked all the ugly artifacts of the PBR version -- it even worked with an explicit assignment of F=0;. My reaction: Jesus Christ! WTF?!?!?

OK. So...

It appears that the old photon context is, uhmm, if not dead, then comatose. it would seem that we can escape the PBR limitations (for doing dispersion) when generating photon maps, because we can use a ray-based shader. Yup... I'm still scratching my head.

Anyhoo... here's a version of that little shader that won't "go green" on you

```
//#include <spectrum.h>

#define CZERO {0,0,0}
#define CONE  {1,1,1}

#define PVSstart 380     // First wavelength in our visible spectrum
#define PVSend   780     // Last wavelength in our visible spectrum
#define PVSlen   (PVSend-PVSstart+1)  // Length of our visible spectrum

// Fetch the ior for a given wavelength (linear distribution, as before)
float getior(float wl, ior_hi, ior_lo) {
    float wave = clamp(wl,PVSstart,PVSend);
    return fit(wave,PVSstart,PVSend,ior_hi,ior_lo);
}

// Fetch the rgb filter for a given wavelength: a user ramp if given,
// otherwise the HSV hack
vector getfilter(string ramp; float wl) {
    vector ret = 0;
    float wave = fit(clamp(wl,PVSstart,PVSend),PVSstart,PVSend,0,1);

    if(ramp!="") ret = texture(ramp,wave,.5,wave,.5,wave,.5,wave,.5);
    else ret = hsvtorgb((1.0-wave)*2.0/3.,1,1);

    return ret;
}

surface pbr_dispersion (
    string ramp       = "";
    float  ior_mean   = 1.55;
    float  ior_spread = 0.2;  // total ior variation across the spectrum
    int    samples    = 15;
    float  Ktrans     = 1;
    float  Krefl      = 1;
    int    level      = 1;
)
{
    vector wo   = -normalize(I);
    vector n    = normalize(N);

    float kr,kt; vector wr,wt;

    if(getglobalraylevel()<level || level<0) {
        // init loop vars
        float  wl   = PVSstart;
        float  dwl  = (float)PVSlen/(float)(samples-1);
        bsdf   Ft   = 0;
        bsdf   Fr   = 0;

        // loop over wavelengths accumulating filtered specular bsdf's
        vector w    = 0;
        vector wsum = 0;
        int    i;
        for(i=0;i<samples;i++) {
            float ior = getior(wl, ior_mean+ior_spread*0.5, ior_mean-ior_spread*0.5);
            fresnel(-wo,n,1.0/ior,kr,kt,wr,wt);
            w = getfilter(ramp,wl);
            //w = maxv(CZERO,xyz2rgb(CSNDX_CIE,cmf(wl)));
            // transmit
            Ft += specular(wt)*kt*w;
            // reflect
            Fr += specular(wr)*kr*w;

            wsum += w;
            wl   += dwl;
        }

        vector iwsum = CONE / wsum;
        F  = Ft*iwsum*Ktrans + Fr*iwsum*Krefl;
    } else {
        // at or above 'level': a single distribution at the mean ior
        fresnel(-wo,n,1.0/ior_mean,kr,kt,wr,wt);
        bsdf Fr = specular(wr)*kr;
        bsdf Ft = specular(wt)*kt;
        F  = Ft*Ktrans + Fr*Krefl;
    }
}
```

It takes a ramp, so here is the ramp of the color matching functions, in the [380nm,780nm] range, transformed to RGB (in equal-energy space CIE-E). These are the raw values in linear space (for display you'd add a gamma of ~2.2, but leave it linear for rendering), including the negative values that result from the transformation. You might want to clamp at zero for shading, but I thought I'd give it to you raw so you can do whatever you want.

CMFramp_linear.exr.gz
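A note on using that ramp, as described above: the negative lobes are legitimate output of the XYZ->RGB transform, but most shading math wants them clamped, and the data should stay linear until display. Something along these lines (a Python sketch with hypothetical helper names, just to pin down the handling):

```python
def prep_for_shading(rgb):
    """Clamp negative lobes from the XYZ->RGB transform; keep linear."""
    return tuple(max(c, 0.0) for c in rgb)

def prep_for_display(rgb, gamma=2.2):
    """Only for viewing: clamp, then apply the ~2.2 power curve."""
    return tuple(max(c, 0.0) ** (1.0 / gamma) for c in rgb)

print(prep_for_shading((-0.05, 0.8, 1.1)))  # -> (0.0, 0.8, 1.1)
```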

I also added a "level" parameter: the shader will produce a fan only when the global ray level is below this value (yes, a hack, but useful). Any instance at or above the given level will produce a single distribution at the mean ior. This lets you do multiple bounces (10 in the image below) without it exploding. When "level" is set to -1, every instance will generate a fan, as before.
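The "level" logic reduces to a one-line predicate; isolating it (Python, illustrative only) makes the -1 convention explicit:

```python
def wants_dispersion_fan(ray_level, level):
    """Fan out into per-wavelength lobes only below 'level';
    level < 0 means every instance fans out (the old behaviour)."""
    return level < 0 or ray_level < level

print(wants_dispersion_fan(0, 1))   # -> True  (first hit fans)
print(wants_dispersion_fan(1, 1))   # -> False (deeper hits use the mean ior)
print(wants_dispersion_fan(7, -1))  # -> True  (always fan)
```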

Here's a test using the CMF ramp, with level=1 on the object, 10 bounce limit, and a photon pass generated with a raytracing version of the shader (I removed the light's contribution to see the caustics better).

Cheers!



wow. what you're saying is that in photon generation mode, I can use a normal surface shader and output Cf and Of, and the photons will transport those as "energy"? And afterwards I'd set up a PBR pass to perform final gathering with the photon map? I have to try this out immediately.


I tried to figure out photons in H9 for a while with no success. Great to see someone else is interested in it too (if this is Mario, we have a good chance to solve some puzzles). As far as I understand this topic, photons generated in PBR mode should work in standard mode's photonmap() call as usual. And they do, except I've hardly seen any result of that -- meaning I wasn't able to produce any proper image with a photonmap() call. Looks like photons from PBR have different values than standard mode expects.

Photons are still needed. I don't know why SESI left it behind.

> wow. what you're saying is that in photon generation mode, I can use a normal surface shader and output Cf and Of, and the photons will transport those as "energy"?

I sent off a question to support to see if this whole photon issue can get cleared up, but yes, that's what I'm getting in my tests. That last image I posted was done exactly that way. (As a matter of fact, I explicitly set F to zero to see if I could break it, but it went ahead and computed a valid/plausible photon map anyway.)

@SYmek: yes, I haven't been able to get anything approaching the old behaviour with those VEX functions either (photon_transmit(), photon_reflect(), etc.). I haven't tried photonmap(), but I'm not surprised that you've had no luck with it. All this leads me to believe that the old photon interface is, if not dead, then strolling through purgatory...

Mario Puzzled In Toronto Marengo.


hi guys

i'm trying to get color caustics going here but with no luck,

Mario: i'm using your PBR shader as is, it should work right? i take it i don't need the raytraced version?

thx

jason

EDIT: sorry i guess attaching my hip would actually help!

PBR_caustics_setup_01.hip


> hi guys
>
> i'm trying to get color caustics going here but with no luck,
>
> Mario: i'm using your PBR shader as is, it should work right? i take it i don't need the raytraced version?
>
> thx
>
> jason
>
> EDIT: sorry i guess attaching my hip would actually help!
>
> PBR_caustics_setup_01.hip

Hey Jason,

Sorry I don't have time to look through your file right now, but I did a quick test with a similar shader, and the hack still appears to work as far as dispersion goes. Caustics... well, IIRC, I had done those with the raytrace version of the shader, so maybe try that.

Anamous?