
Linear Workflow problems


abvfx


Hey Hoodniks,

I've been having aesthetic problems working in a linear workflow for a little while now. I've searched every topic I can find and it seems no one else has this problem.

I've done the whole Preferences > Color Settings > gamma set to 2.2 routine (for MPlay and all the other viewers) and done the crude 0.45 gamma correction on sRGB textures, but my gripe is about shading.

When rendering even basic Lambert shaders, or more complex ones like the Mantra Surface, I am unable to get normal diffusion of light across a surface, or to get dark shadows or bright highlights without affecting the entire image. It appears incredibly flat.

[Attached image: linearWorkflow.jpg]

Better Example

http://docs.unity3d.com/Documentation/Images/manual/LinearLighting-0.jpg

On the right is a simple render, rendered at gamma 1.0. I attempted to recreate this picture in 2.2 space by just turning up the intensity, and the result is on the left. I was mainly trying to get the area in between the specular and the Lambertian shading looking right, because it is too close to the surface color, and the spec highlight is about half as intense as in the previous render.

The specular highlight isn't as bright and, more importantly, the lighting doesn't fall off like it did in the previous render. No matter the settings, I'm unable to get the light to fall off properly; it is just really flat.

I must be doing something wrong, because I know people have been working in a linear color space with Houdini for years. What is going on?

By the way, this also affects shadows, but I just wanted to put up a simple example to see if there is clearly something I'm doing wrong.

Edited by phrenzy84

But the falloff isn't going to look like the gamma 1 render. Just doing a quick Google search, you'll find that a gradient at gamma 2.2 is quite different from one at gamma 1.

So your gamma 2.2 render actually has the correct falloff.

http://freesdk.crydev.net/download/attachments/131627/gammaref2.png?version=1&modificationDate=1313159971000
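As a rough illustration (using a plain 2.2 power curve rather than the exact sRGB transfer function): a linear value of 0.5 seen through the 2.2 viewer transform displays as 0.5^(1/2.2) ≈ 0.73, and 0.2 displays as 0.2^(1/2.2) ≈ 0.48. Midtones and shadows get lifted, so a linear gradient reads as flatter and brighter than a gamma 1 one, even though it is the physically correct result.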


So what can I do? Not that lighting is my primary field, but now I feel unable to judge light in my renders.

Lighting and shading in linear space look different because all of the internal calculations you're used to, done on gamma-corrected values, were incorrect (wrong, flawed, however you want to say it). In a sense you have to unlearn what you have learned about lighting and shading. It takes some getting used to, but at the compositing stage there's a lot more flexibility to achieve a broader range of looks. It also makes it easier to match plates and reality, since the calculations more closely resemble the way color and light actually behave.


When I switched to a linear workflow, everything felt strange at the beginning, but stick with it. Your light setups get simpler, and if you want to use something like physically correct lighting, it is only possible this way. Soon after switching I had to shade a white object; with the flawed workflow it was very tricky to keep it from looking dull, but with the linear workflow it was much easier.


Linear workflow it is. I remember watching one of the light/shade/render videos from SESI; they recommended this video for a better understanding.

Guess it's one of those things to get used to. Thanks for the fast replies. :)


I am also new to this linear workflow in rendering. So are we supposed to de-gamma every texture (except displacement and bump maps, etc.) before rendering? If so, is there a setting for the images that we can set when bringing them in, or a Mantra-specific render setting that will de-gamma all the appropriate textures automatically?

Also, if this is the correct way of rendering, then why is it not the default in Houdini? I saw Peter Quint's pill tutorial, and he was setting the gamma to 2.2 in the Color Settings himself.

Hope phrenzy84 doesn't mind me posting it here :)


You have to de-gamma them yourself: just apply a gamma of 0.454545 to your images before you write them to disk. But be careful, proper HDRIs are already correct, and the same goes for most EXRs. If you have textures from Photoshop, though, you can be quite sure that you have to de-gamma them.

You can also create a custom Texture VOP that has the de-gamma inside, or add a Color Correct node after the texture, but this will run multiple times per sample, per frame during rendering, so doing it up front is better.

Displacement maps should also be linear already, but it never hurts to check where they come from.

In the preferences you can make the 2.2 setting the default, as it is probably the only way you will want to work.
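For what it's worth, a minimal VEX sketch of the "de-gamma inside the shader" variant could look like this, assuming an 8-bit sRGB texture and approximating the sRGB curve with a plain 2.2 power function (the file name is just a placeholder):

    // Inside a surface shader (or an Inline VOP with s and t wired in):
    // read an sRGB-encoded texture and linearize it before shading with it.
    vector clr = texture("basecolor.jpg", s, t);   // placeholder path to an 8-bit texture
    clr = pow(clr, 2.2);                           // de-gamma: approximate sRGB -> linear

As noted above, baking the correction into the texture file once is cheaper, since this pow() runs for every texture lookup on every sample.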


Also, if this is the correct way of rendering, then why is it not the default in Houdini? I saw Peter Quint's pill tutorial, and he was setting the gamma to 2.2 in the Color Settings himself.

The correct way of rendering is to render in linear color space, not to linearize every texture. The only way an application might guess what color space an image was saved in is its file extension, as there is no standardized way of storing this data in a file (some formats have a special tag, some don't). This is what Nuke does, but obviously it can easily be fooled by a JPEG saved from Mantra (which will be linear by default, and Nuke will treat it as sRGB).

Since Mantra reads textures with VEX, the place this logic would have to live is the texture() function. It would be a complete mess if texture() treated textures differently according to their extension; that's no way to go. Another possibility is a callback at the VOP GUI level (easy for anyone to make), but it won't work with compiled shaders.

Besides having a texture() call with a "gamma" or LUT variadic argument (which would have to be controlled by the user anyway), I don't see any elegant solution for doing auto-magic color space management.
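Just to make the idea concrete, a user-side wrapper along those lines might look something like this in VEX; the function name and the gamma argument are hypothetical, not an existing API:

    // Hypothetical wrapper: texture lookup with an explicit gamma argument.
    vector texture_gamma(string map; float u, v; float g)
    {
        vector clr = texture(map, u, v);
        return pow(clr, g);   // g = 2.2 linearizes an sRGB texture, g = 1.0 leaves it untouched
    }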


Besides having a texture() call with a "gamma" or LUT variadic argument (which would have to be controlled by the user anyway), I don't see any elegant solution for doing auto-magic color space management.

That's what Arnold does. It works very well.


Thanks, guys, for the insights. I was hoping for some sort of automatic assumption, as Symek said, via a setting that you turn on in your Mantra render node when you are 100% sure your textures need de-gamma.

Is Nuke doing this by default with no ability to turn it off? Where does it store the de-gammaed versions of the files? In memory?

Thanks again :)


Is Nuke doing this by default with no ability to turn it off? Where does it store the de-gammaed versions of the files? In memory?

It can be disabled or changed as it's just a knob on a Read node. Conversion most probably happens on the fly, so yes, corrected pixels are kept in memory.


Thanks Symek, it makes a lot of sense now. To replicate the same thing in Houdini at a global level, the Image parameter type would need to have a Gamma option then, right?

If that existed, do you think this would be a reasonable way to handle the issue in an optimal workflow? To me this seems like an improvement, but I'm not sure whether it would slow down the rendering enough to lose the benefits.


It would be a complete mess if texture() treated textures differently according to their extension; that's no way to go.

... <snip> ...

Besides having a texture() call with a "gamma" or LUT variadic argument (which would have to be controlled by the user anyway), I don't see any elegant solution for doing auto-magic color space management.

What about both? If you didn't explicitly specify, then it uses the Nuke-style way, else it uses the explicit override.
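A rough VEX sketch of that combined behavior, where the function, the "g <= 0 means guess" convention, and the extension-based guess are all hypothetical:

    // Hypothetical: an explicit gamma wins; otherwise guess from the file extension.
    vector texture_auto(string map; float u, v; float g)
    {
        float gamma = g;
        if (gamma <= 0)   // no explicit override given
            gamma = (match("*.exr", map) || match("*.rat", map)) ? 1.0 : 2.2;  // assume float formats are linear
        return pow(texture(map, u, v), gamma);
    }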


I would love to see the Nuke way of handling images implemented in Houdini. I do find it frustrating that SESI hasn't invested more time in the whole linear/physically correct way of working. I feel it would all be less confusing to users if they didn't have to try to implement a linear workflow themselves.

H15 maybe?


Yeah, I think I would've had a slightly easier time had it been the default, but then how would this conversation have started? I've already kind of got used to working this way, but it still feels a little strange when trying to make very soft falloffs.


On a side topic, does anyone know how color management in Photoshop works? From what little I know, if you save out a 16-bit TIFF from Photoshop, it will always be linear, but if you save out an 8-bit TIFF, it is in sRGB?

It seems to apply policies and profiles based on preferences and on tags found in a file. When converting from float to integer (8- or 16-bit) it forces you to choose an exposure/gamma, but besides that, files saved as 32/16/8-bit will look the same (nothing is done on write). Most of the management/conversion seems to happen on read: all integer files are treated as gamma-corrected sRGB images, while floats are treated as linear (and converted to sRGB on load). It's quite messy, as we can see (and most Photoshop artists don't care).

What about both? If you didn't explicitly specify, then it uses the Nuke-style way, else it uses the explicit override.

Yes, it might work; I'm not sure how this would play with parameters in a Texture VOP promoted to the SHOP level. What I'm sure about is that I wouldn't be happy having 20+ Color Correct VOPs inside the UberTextureVOP in our shaders.

As a side note, it would be really helpful to upgrade tools like icp and iconvert to read LUT and gamma parameters.

Edited by SYmek
