I would say that if, for your projects, the target deliverable is always Rec.709 broadcast/web, it's a totally valid choice in order to remove the above-mentioned hurdles.
Unreal Engine 5 lets the user choose the working space though, with ACEScg being the default, if I'm not mistaken.
Which View Transform / Image Formation Pipeline / DRT are you planning on using with the working colourspace Linear Rec.709 instead of ACEScg?
A downside could be… imagine you get sent a plate/HDRI in ACEScg and you want to use it in linear Rec.709. After a colourspace transform from ACEScg to linear Rec.709 you could end up with negative values, as an example.
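A quick way to see this is to push a saturated ACEScg colour through the AP1-to-Rec.709 conversion. The sketch below is illustrative: the matrix coefficients are the commonly published approximate values (including chromatic adaptation from the ACES D60 white to D65), and the sample colour is made up for the demonstration.

```python
import numpy as np

# Approximate linear ACEScg (AP1, D60) -> linear Rec.709 (D65) matrix,
# Bradford-adapted. Coefficients as commonly published; treat as illustrative.
ACESCG_TO_REC709 = np.array([
    [ 1.7050510, -0.6217921, -0.0832589],
    [-0.1302564,  1.1408047, -0.0105483],
    [-0.0240033, -0.1289689,  1.1529724],
])

def acescg_to_linear_rec709(rgb):
    """Convert one linear ACEScg RGB triplet to linear Rec.709."""
    return ACESCG_TO_REC709 @ np.asarray(rgb, dtype=float)

# A saturated green that fits inside AP1 but falls outside the
# Rec.709 gamut: the red channel comes out negative.
out = acescg_to_linear_rec709([0.1, 0.8, 0.1])
print(out)  # red channel is negative
```

Any pixel whose chromaticity lies outside the Rec.709 triangle will produce at least one negative channel after this conversion.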
We would want the ability to deliver in AP0, and the reference space would not be linear Rec.709 but most likely linear CIE-XYZ E. Only the working/rendering space would be linear Rec.709.
The problem with Rec.709 as a working space is potentially having negative values at that particular point in the chain, because that's where you do your operations, regardless of what transforms happen before or after. Multiplying a negative value by a number larger than 1 pushes it even further out of gamut, so for grading or compositing, some tools/operations can be problematic.
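A toy sketch of that last point, assuming a simple per-channel gain such as an exposure boost (the pixel values are made up, of the kind a saturated colour converted into linear Rec.709 can produce):

```python
# Out-of-gamut linear Rec.709 pixel (illustrative values):
pixel = [-0.335, 0.899, 0.010]

# A +2-stop exposure adjustment is a per-channel gain of 4.
gain = 4.0
boosted = [c * gain for c in pixel]

print(boosted)  # the negative red channel moves even further below zero
```

Operations that are perfectly benign on in-gamut values (gain, multiply, certain keys and mattes) can therefore amplify the out-of-gamut error instead of leaving it harmless.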
Ah, okay, I'm following now! I wonder if it would be possible to render in one space and composite in another? Nuke uses the "scene_linear" role, and Maya uses "rendering", so I could potentially set the config to
rendering: Linear Rec.709 (sRGB)
scene_linear: ACEScg
I like to see the working/rendering space as a variable that we can choose based on the project.
We rendered Lego Batman in P3, Super Mario in ACEScg and Pookoo in Rec.709.
If you are going down the road of not "matching" the rendering and scene_linear roles, I would be super cautious and check how all your DCCs behave (Substance Painter/Designer, for instance).
Right, and Substance Painter uses the scene_linear role for the working space, so that would mess things up between SP and Maya. It sounds like it would be best to use one config per project, as you were saying, rather than switching between software, which could end up breaking the colour management.
Just set the necessary roles to "lin_rec709" and you're good to go. In that case, any primaries set (in a colour picker, for instance) would be BT.709 ones, not ACEScg.
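For instance, the relevant roles section of the OCIO config could look like the sketch below. The role names are the standard OCIO ones; the colourspace name is taken from the config fragment earlier in the thread, so adjust it to whatever your config actually calls linear Rec.709:

```
roles:
  rendering: Linear Rec.709 (sRGB)
  scene_linear: Linear Rec.709 (sRGB)
  compositing_linear: Linear Rec.709 (sRGB)
  color_picking: Linear Rec.709 (sRGB)
```

Keeping rendering, scene_linear and color_picking pointed at the same space is what avoids the cross-DCC mismatch discussed above.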
It does mitigate a bit some of the “clipping” issues.
Hair melanin and redness: The standard_hair shader’s melanin and melanin_redness parameters as well as the AidEonBSDF() function’s absorption parameter are now being referenced in a linear sRGB color space so that the direct lighting results match when switching to other rendering color spaces such as ACEScg. The melanin_redness parameter will also now produce a more correct-looking red tint in ACEScg. (ARNOLD-7585)
Yeah, I had wondered about that too. I do know that AgX uses a linear Rec.709 working space. It does pass through Rec.2020 in the image chain, but honestly the particulars of that are over my head, hence my wanting to bring this up here.
Hi,
coming back to your initial idea, now that some of your questions and doubts have been answered…
Is it a good idea to move forward with the smallest working colourspace instead of a bigger one?
Grading tools have very large working-colourspace gamuts (E-Gamut, DaVinci Wide Gamut). As far as I know, RAW photo developers use linear ProPhoto or linear Rec.2020, which are both quite similar to ACEScg, and FCPX uses linear Rec.2020 as its working colourspace.
As long as the DCC apps can manage the big gamuts on the way to the display, I would think it makes sense to try to make use of them?