First post here so please be gentle. We are using Unreal Engine as part of our lighting department in our VFX pipeline. We plan to produce ACEScg EXRs from Unreal, which will then be brought into Nuke for compositing, much like a traditional VFX workflow.
The problem we are having is with the ACES filmic tonemapper. The Unreal docs say there is support for exporting HDR data to EXR on custom passes, and there is an option to change the capture gamut when storing the captured HDR data.
I assumed this would equate to the working/rendering colourspace, as in an offline renderer such as Arnold or Mantra, but maybe that assumption is incorrect.
The way ACES has been integrated into Unreal seems different to other DCCs. There seems to be no way to change the view transform for different display devices (Rec. 709, P3, sRGB, etc.). They also expose the curves for the ACES tonemapping, which doesn't seem to align with the ACES standardization. There is also no OCIO support.
When we rendered out an image sequence as EXRs using the ACEScg capture gamut, the result didn't look as expected. For example, if I export an EXR with these settings:
If I view the image without a view transform (raw), the result seems to match the viewport in Unreal.
Nuke viewing raw
This implies that the image has been exported with the tonemapper applied, rather than as the linear data I would expect an offline renderer to produce.
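A quick way to confirm this, independent of any viewer: tonemapped (display-referred) pixels sit in [0, 1], while a scene-linear render usually contains values well above 1.0 (speculars, emissive surfaces, skies). A minimal sketch, assuming you have already read the EXR's channel values into a flat list of floats with whatever EXR library you use:

```python
def looks_tonemapped(pixels, hdr_threshold=1.0):
    """Heuristic: if no channel value exceeds the threshold, the image has
    probably had a display transform/tonemapper baked in already.
    pixels: flat iterable of float channel values from the EXR."""
    return max(pixels) <= hdr_threshold

# A linear render with a bright emissive area vs. a tonemapped frame:
print(looks_tonemapped([0.18, 0.5, 14.2]))  # False -> likely scene-linear
print(looks_tonemapped([0.18, 0.5, 0.97]))  # True  -> likely tonemapped
```

It is only a heuristic (a very dark linear frame could also stay under 1.0), but it quickly separates the two cases on typical shots.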
We have found a way to disable the tonemapper for the export and apply it only as a view on the exported EXRs, which seems to produce a correct result:
This means we can render out the EXRs as linear data and continue with a regular VFX pipeline.
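For anyone wanting to preview roughly the same look outside the engine: a widely used single-curve approximation of the ACES filmic tonemap is Krzysztof Narkowicz's fit. To be clear, this is an approximation of the ACES look, not Unreal's exact implementation (UE uses the full ACES RRT/ODT fit with its exposed curve parameters), so treat it as a sketch:

```python
def aces_filmic_approx(x):
    """Narkowicz's rational-curve fit to the ACES filmic tonemap.
    Input: a scene-linear channel value (exposure already applied).
    Output: display-referred value in [0, 1], before the display
    encoding (sRGB/Rec.709 gamma) is applied."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)

# Middle grey (0.18 linear) maps to roughly 0.27 display-linear:
print(aces_filmic_approx(0.18))
```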
Now we just want to make sure the viewing experience in Unreal and Nuke match.
Has anyone found a way of creating/modifying any current OCIO ACES view transforms to match the UE4 viewport?
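One direction we have been considering (a sketch, not a tested setup): bake Unreal's viewport display chain into a 3D LUT and expose it as an extra view in the OCIO config so Nuke can show the same look. The config fragment below is hypothetical; `ue4_filmic.cube` is a placeholder LUT you would have to generate yourself (e.g. by pushing an identity cube/ramp through the UE tonemapper and capturing the result), and the colorspace name is made up:

```yaml
# Hypothetical addition to an ACES OCIO config: a "UE4 Filmic" view on the
# sRGB display, backed by a baked scene-linear -> display LUT.
displays:
  sRGB:
    - !<View> {name: UE4 Filmic, colorspace: ue4_filmic}

colorspaces:
  - !<ColorSpace>
    name: ue4_filmic
    family: Output
    # ue4_filmic.cube is a placeholder: a LUT baked from Unreal's
    # tonemapped output, mapping the config's reference space to display.
    from_reference: !<FileTransform> {src: ue4_filmic.cube, interpolation: tetrahedral}
```

If someone has already baked such a LUT (or matched the UE curve parameters analytically), I would love to hear how well it holds up across the gamut.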
Any help would be much appreciated.