
Rendering an image sequence with alpha channel and 8bpp



Hi all, I'm having some trouble rendering an image sequence with an alpha channel.  FYI I'm writing to the alpha in my Houdini shader.

 

So if I render to mPlay, the alpha is present and correct.  If I save the file from mPlay it is also correct.  If I load that file in Photoshop it has a separate Alpha channel below the RGB channels, and each channel is 8 bits.

 

This is what I've tried (Quantization was set to 8 bit in every case):

 

  • If I render a single frame to Tiff or PNG I get the alpha channel, but in Photoshop it makes the base layer transparent instead of creating an extra Alpha channel in the "Channels" panel.
  • If I render a single frame to Exr, I get a correct Alpha channel, but each channel is 32 bits instead of the 8 bits I set in the render options.
  • If I render a sequence of frames to Tiff or PNG, no alpha is present at all.
  • If I render a sequence of frames to Exr, I get a correct Alpha channel, but again each channel is 32 bits instead of 8.

 

Is there a way to render a sequence of frames to an image format that will load into Photoshop as 8 bits per channel with a separate Alpha channel?  Ideally I'd like to render directly to a DDS volume texture, but that's not going to happen  :)
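For what it's worth, EXR is a float/half format by design, so an 8-bit quantization setting can't apply to it; one workaround is to post-convert the rendered frames yourself. A minimal sketch, assuming OpenImageIO's Python bindings and Pillow are available (the paths and function names are just placeholders):

```python
# Post-convert rendered EXR frames to 8-bit RGBA TGAs (hypothetical paths).
import numpy as np

def to_uint8(linear):
    """Quantize linear float pixels straight to 8-bit -- no gamma curve."""
    return np.rint(np.clip(linear, 0.0, 1.0) * 255.0).astype(np.uint8)

def convert_frame(exr_path, tga_path):
    # Assumption: OpenImageIO ("oiio") and Pillow are installed.
    import OpenImageIO as oiio
    from PIL import Image
    rgba = oiio.ImageBuf(exr_path).get_pixels(oiio.FLOAT)  # H x W x 4 floats
    Image.fromarray(to_uint8(rgba), "RGBA").save(tga_path)
```

Run over the whole sequence in a loop and you get an 8-bit sequence with the alpha intact.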

 

Thanks!

 

Geordie

Edited by GeordieM

Marty, thanks for that link!  LOL, that looks like a typical Adobe thread.  I once had an argument about them not natively supporting DDS textures in Photoshop.  Textures are an array of data, nothing more; I should be able to store my grandmother in the alpha channel if I want to.

 

This is what I'm doing:

  • Render the volume gradient (processed into normals) out to RGB
  • Render the volume density out to A

 

The final images need to be linear 8-bit with A kept as a separate channel, and preferably TGA so they import correctly into our game engine.  I'm animating a slicing plane through the volume to give me slices of the density and normals.

 

The problem with Exr is that it's HDR, which means I need to downsample the bit depth to 8 bits, which means I have to apply a corrective gamma curve to keep it linear, which means my pixel values have been altered after rendering.  I need to be able to output unaltered vector data directly from Houdini into images.
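To make that concrete, here's a toy comparison (plain Python, values are illustrative) of straight linear quantization versus a display-style gamma curve on the same pixel value:

```python
# Quantizing the same linear pixel value with and without a gamma curve.
def linear_code(v):
    """Straight linear quantization to 8-bit -- the data stays linear."""
    return round(v * 255)

def gamma_code(v, g=2.2):
    """Display-referred quantization (approximate 2.2 gamma)."""
    return round((v ** (1.0 / g)) * 255)

print(linear_code(0.5))  # 128 -> round-trips back to the original value
print(gamma_code(0.5))   # 186 -> remapped for display, vectors are broken
```

For normals packed into RGB, only the first mapping decodes back to the original vectors.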

 

I guess the fix for now will be to render the RGB normals and the A density out to two different images, but that makes me sad :(
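If you do go the two-image route, stitching them back together in post is pretty painless. A minimal sketch with Pillow (file names are placeholders, and it assumes both passes come out at 8-bit):

```python
# Merge two separately rendered passes back into one RGBA TGA.
from PIL import Image

def merge_passes(normals_path, density_path, out_path):
    rgb = Image.open(normals_path).convert("RGB")  # normals pass
    a = Image.open(density_path).convert("L")      # density pass as grayscale
    rgb.putalpha(a)                                # attach as straight alpha
    rgb.save(out_path)                             # Pillow writes RGBA TGA
```

A straight alpha in a 32-bit TGA is what Photoshop typically shows as an extra channel in the Channels panel rather than layer transparency.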

 

Or I stop relying on the render pipeline and write something to output the data directly into textures.

 

It would still be good to get clarity on my original post; it seems a bit weird that rendering a sequence would change the image data.

 

TNKS!

 

G


Thanks edward.  I just finished a 3ds Max pipeline where I wrote animation data out to textures in MaxScript, so I think I'll eventually do the same for this in Houdini.  I've split the volume normals and density out to separate image planes for now; it works fine, I just have to do some post-render processing.

 

TNKS!

 

G

