
Arnold deep EXR output problem



Hello,

I am working on a school project, rendering with Arnold in Houdini, and I would like to try a deep workflow. But I have a problem with the deep sample count in the rendered EXRs. I am using custom ID AOVs, and when I try to isolate certain objects in Nuke there are artefacts on the edges (which is ironic, because that is exactly what deep data should prevent).

 

I think the problem is in deep sample merging (compression), which is controlled by the tolerance parameters. But modifying these parameters has no effect on the exported EXR.

When sample merging is disabled, the EXRs work as expected in Nuke, but the file size increases drastically (about 2 GB per frame).

Any ideas on how to find a balance between EXR size and sample count on the edges?

 

I don't know whether it is a bug or I am missing something. I have tested it with htoa 1.3.1 and 1.4.0.

I am really stuck here.
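For reference, this is roughly what the deep driver section looks like in the exported .ass (parameter names as I remember them from the driver_deepexr node, so please double-check against your own export):

```
driver_deepexr
{
 name deep_driver
 filename "render.deep.exr"
 subpixel_merge on          # the merging toggle I am referring to above
 alpha_tolerance 0.01       # changing these has no effect for me
 depth_tolerance 0.01
}
```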

 

merge disabled: [screenshot attachment]

merge enabled: [screenshot attachment]


Hi, thanks for your reply.

The problem is that adjusting the tolerance values (beauty, deep, alpha, per-AOV) has no effect on the EXR size.

I have also tried very extreme values (both large and small), but the output size is always the same, and so is the deep sample count.

I also tried setting all tolerance values to 0, which should be equivalent to disabling deep sample merging, but the output EXR was about 50 MB (with deep sample merging disabled, the EXR is about 2 GB).
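To make sure I understand what the tolerances are supposed to do, here is a toy model of depth-based sample merging (my own sketch, not htoa's actual algorithm): with a tolerance of 0, no two samples at distinct depths should merge, so the sample count should match the merge-disabled output, which is not what I am seeing.

```python
def merge_samples(samples, depth_tol):
    """Toy model of deep sample compression: greedily fold a sample into
    the previous one when their depths differ by less than depth_tol."""
    merged = []
    for depth, value in sorted(samples):
        if merged and depth - merged[-1][0] < depth_tol:
            prev_depth, prev_value = merged[-1]
            merged[-1] = (prev_depth, prev_value + value)  # accumulate value
        else:
            merged.append((depth, value))
    return merged

samples = [(1.00, 0.2), (1.02, 0.3), (5.00, 0.5)]
print(len(merge_samples(samples, 0.0)))  # 3 -> tolerance 0 merges nothing
print(len(merge_samples(samples, 0.1)))  # 2 -> nearby samples are folded
```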


I am attaching 4 ASS cases:

tol_0 - merge subpixel samples enabled, all tolerances set to 0, produces artefacts, 73 MB

tol_def - merge subpixel samples enabled, all tolerances set to default, produces artefacts, 72.8 MB

tol_small - merge subpixel samples enabled, all tolerances set to a small value (0.0001), produces artefacts, 72.9 MB

tol_dis - merge subpixel samples disabled, artefact-free, 1.56 GB

 

The forum refused to accept ZIP or ASS files, so the attachment is available here: http://we.tl/wdry5XRayt

 

Thank you for your help.

Edited by Juraj Tomori

Thanks for the repro. Could you send me the maps as well?

 

We've noticed your obj_id and list_id AOVs are of type FLOAT. They should be of type INT, as that prevents any merging of samples with different IDs. Maybe that is what is causing your issue. Could you verify?
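To illustrate why FLOAT is problematic for IDs: when two subpixel samples are merged or filtered, a FLOAT channel gets blended, so two valid IDs can produce a third value that belongs to no object, which shows up as exactly this kind of edge artefact. A minimal sketch (toy arithmetic, not Arnold's merge code):

```python
def blended_float_id(id_a, id_b, weight=0.5):
    """FLOAT ID AOV: filtering interpolates the two IDs, producing a
    'ghost' ID that matches neither object on the edge pixel."""
    return weight * id_a + (1.0 - weight) * id_b

print(blended_float_id(3.0, 5.0))  # 4.0 -> keying on id==3 or id==5 misses it
```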


Hi,

here are the maps: http://we.tl/VLeLQ5uNa6. Note that the paths to them are not specified at the shading level, but as a string attribute. You may be able to replicate the folder structure if you are running on Windows: put the textures into D:/tomori/korene fx/h/tex/ . If not, it will be better if I prepare a simpler test scene that is easier to share.

 

I have already tried setting the custom AOV to type INT, but I did not manage to get it working. (This is strange, because the INT AOV works in a flat output, and it is even displayed in Houdini's render view when deep output is set.) When I load the deep EXR into Nuke, the INT AOV layer is black.

When I output the obj_id AOV as RGB instead, the edges show the same issue.
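For context, what I am ultimately trying to do in Nuke is the usual deep holdout: keep only the samples whose ID matches and composite their alphas front to back. A toy version of that (my own sketch, not Nuke's implementation), which only gives clean edges when each deep sample keeps its exact ID:

```python
def isolate_alpha(samples, target_id):
    """Composite alpha front-to-back using only the deep samples whose
    obj_id matches target_id (toy deep holdout)."""
    alpha = 0.0
    for depth, sample_alpha, obj_id in sorted(samples):
        if obj_id == target_id:
            alpha += (1.0 - alpha) * sample_alpha  # standard over operation
    return alpha

# one edge pixel: a foreground sample (id 1) over a background sample (id 2)
pixel = [(1.0, 0.4, 1), (2.0, 0.6, 2)]
print(isolate_alpha(pixel, 1))  # 0.4 -> clean partial coverage, no fringe
```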
