Hello,
I am working on a school project and I am using Arnold in Houdini (HtoA) for rendering. I would like to try a deep workflow, but I have a problem with the deep sample count in the rendered EXRs. I am using custom ID AOVs, and when I try to isolate certain objects in Nuke there are artefacts on the edges (which is funny, because this shouldn't happen with deep data).
I think the problem is with deep sample merging (compression), which is controlled by the tolerance parameters. But modifying these parameters has no effect on the exported EXRs.
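To check whether HtoA is even passing my values through, I export the scene to a plain-text .ass file and look at what actually lands on the deep driver node. A minimal sketch, assuming the driver node type is driver_deepexr and that the relevant parameters are named alpha_tolerance / depth_tolerance (the parameter names are my assumption, and the file path is a placeholder):

```python
# Sketch: print the driver_deepexr block from a plain-text .ass export,
# to verify that the tolerance values set on the ROP are actually written out.
# Assumption: the scene was exported uncompressed (plain-text .ass).

def dump_deep_driver(ass_path):
    """Print the driver_deepexr node block from a .ass file."""
    in_block = False
    with open(ass_path) as f:
        for line in f:
            if line.startswith("driver_deepexr"):
                in_block = True
            if in_block:
                print(line.rstrip())
                if line.strip() == "}":  # end of the node block
                    break

dump_deep_driver("/tmp/deep_test.ass")
```

If the values printed there don't match what I set on the ROP, the problem is on the export side; if they do match, then the merging itself seems to ignore them.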
When sample merging is disabled, the EXRs behave as expected in Nuke, but the file size increases drastically (about 2 GB per frame).
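To put numbers on this, I compare total deep sample counts between a merged and an unmerged render with OpenImageIO's Python bindings (a rough sketch; the file names are placeholders for my renders):

```python
# Sketch: compare total deep sample counts of two deep EXRs
# using OpenImageIO's Python bindings.
import OpenImageIO as oiio

def total_deep_samples(path):
    """Sum the deep sample counts over every pixel of the first subimage."""
    buf = oiio.ImageBuf(path)
    if not buf.deep:
        raise ValueError(f"{path} is not a deep image")
    spec = buf.spec()
    total = 0
    for y in range(spec.y, spec.y + spec.height):
        for x in range(spec.x, spec.x + spec.width):
            total += buf.deep_samples(x, y, 0)
    return total

merged = total_deep_samples("deep_merge_on.exr")
unmerged = total_deep_samples("deep_merge_off.exr")
print(f"merged: {merged}  unmerged: {unmerged}  ratio: {merged / unmerged:.3f}")
```

If the totals stay identical no matter what tolerances I set, that would point to the parameters never reaching the file rather than to a tuning problem.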
Any ideas on how to find a balance between EXR file size and the sample count on edges?
I don't know whether it is a bug or I am missing something. I have tested it with HtoA 1.3.1 and 1.4.0.
I am really stuck here.
[screenshot: merge disabled]
[screenshot: merge enabled]