Showing results for tags 'colorspace'.

Found 1 result

  1. Hi all, I stumbled across this video about using filmic LUTs in Blender: https://www.youtube.com/watch?v=m9AT7H4GGrA It led me down a rabbit hole about how color works and is managed in Houdini.

First: what colorspace does Houdini work with natively? I would expect it to be linear sRGB, that is, using the sRGB primaries but working in a linear space. Textures that come in as, say, ACES would then need to be transformed internally to linear sRGB to be workable? Since Houdini has been around much longer than ACES, I don't suppose it uses ACES internally. As an example: if I create a shader with an emission of R:1, G:0, B:0, the renderer displays an sRGB value of 1,0,0, and if I save that as a PNG, which applies the sRGB transform when the colorspace conversion is active, the same values are there, which is why I think it uses the same primaries.

Secondly (related to the video above): at work we use OCIO ACES color management with an "sRGB" display LUT, but that doesn't specify what colorspace goes in. If I'm right that Houdini works in linear sRGB, then the color management would have to convert linear sRGB into ACES, and ACES back into (non-linear) sRGB for display, right?

That leads me to another point from the video, where he uses ACES-based LUTs to preview his renders in Blender. He has several LUTs to choose from that give him different looks (high contrast, low contrast, ...). As a lighter I can see how this has a huge influence on how I light a scene. In the video he shows that without a LUT the preview in Blender looks too contrasty, because values above 1 get clipped immediately. Since we don't want to produce clipped-looking images (even if the data isn't clipped, the artist sees a clipped preview), artists will use dimmer lights and overuse fill lights to compensate for the missing bounce light.
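For what it's worth, the "same primaries, different transfer function" observation above can be checked with a minimal Python sketch (not Houdini's actual code, just the standard sRGB encode from IEC 61966-2-1): a pure primary like 1,0,0 comes out unchanged because only the per-channel transfer curve is applied, never a gamut conversion.

```python
# Minimal sketch: the standard linear -> sRGB encode (IEC 61966-2-1).
# It changes the transfer curve per channel; the primaries are untouched,
# which is why a scene-linear (1, 0, 0) still reads as (1, 0, 0) after
# the display transform.

def linear_to_srgb(x: float) -> float:
    """Encode a scene-linear value for an sRGB display (no gamut change)."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

print(round(linear_to_srgb(1.0), 6))   # -> 1.0  (a full primary stays full)
print(linear_to_srgb(0.0))             # -> 0.0
print(round(linear_to_srgb(0.18), 2))  # -> 0.46 (mid-grey is lifted a lot)
```

The mid-grey line shows why the encode matters at all: scene-linear 0.18 would look far too dark if displayed raw, but the extremes 0 and 1 map to themselves.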
But with a LUT you can squash a much higher contrast range into the 0-1 display range, so the sunlight could have huge values, there would be plenty of bounce light, and the highlights still wouldn't clip. I guess that's closer to what our eyes, and a camera, see. How can I be sure that the standard "sRGB" LUT we use at work is right for the scene/show I'm working on? Is there a one-size-fits-all, or should we, as I know they do on set, create different preview LUTs for different shows? Does anybody have experience with using different LUTs? I could even imagine taking one from a real camera, for example whatever a Canon DSLR uses to convert its raws into 8-bit (display-referred) images. Or I could create one myself, but then how would I know what is physically plausible and what isn't? Cheers, M
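The clipping-versus-squashing behavior described above can be illustrated with a toy comparison. This is not the video's LUT or any ACES view transform, just the simplest possible global tone map (Reinhard's x/(1+x)) standing in for "a curve that compresses HDR into display range":

```python
# Toy comparison: naive clipping vs. a simple tone-mapping curve.
# Reinhard (x / (1 + x)) is used purely as the simplest compressive curve;
# real filmic/ACES view transforms are more sophisticated.

def clip_display(x: float) -> float:
    """Naive display: anything over 1.0 is lost."""
    return min(max(x, 0.0), 1.0)

def reinhard(x: float) -> float:
    """Simple global tone map: compresses all HDR values into [0, 1)."""
    return x / (1.0 + x)

# A highlight at 4.0 and a sun at 16.0 both clip to 1.0 naively,
# but stay distinguishable under the tone map.
for value in (0.5, 1.0, 4.0, 16.0):
    print(value, clip_display(value), round(reinhard(value), 3))
```

The point for lighting: under the clipped preview, anything above 1 looks identical, which is exactly what pushes artists toward dim keys and extra fill; under the compressive curve, a very hot key light still reads as brighter than a highlight without blowing out the preview.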