At this point the conversation is going nowhere. I do have a point (what you requested), already explained in the last two posts, and saying more than that simply adds more noise to the thread.
In the previous posts I made a real effort to explain myself, and now we are nitpicking quotes and arguing over individual words, so this will be my last elaborated post in this conversation.
Given that you keep asking about things I have already explained, I don't think you are making the slightest effort to understand, let alone to explain. But that is expected from an ACES team member whose duty is no more than enforcing the specification (which studios break time and again, within reason, to get the job done). Would you explain how you go from a RAW-acquired HDR ACEScg albedo map to an LDR ACEScg PBR albedo?
Absolutely not; studios across the world have been working with filmic View Transforms for many, many years. This presentation about games (not films, games) by @hpduiker is from 2006.
Yes, more or less around the time we realized that we had to linearize our gamma-encoded textures. Feels like yesterday.
In the scope of look development (the topic under discussion), filmic looks became popular with the implementation of OCIO, such that now we can apply looks in the DCC's framebuffer "on the fly".
I disagree with that: who is to say that all the sRGB materials are authored with a filmic look in the first place? Making incorrect assumptions can be dangerous and might bite back.
I'm not saying all, I'm saying likely. But let me quote your earlier words:
Resources coming from outside, e.g. Google Images, are usually rendered, i.e. likely to have an S-Curve applied…
Referencing material with embedded looks leads to embedding looks to materials.
The clear evidence that Epic and other studios had to compensate for that shows that the authored materials had the film look embedded. It's in your linked "ACES Retrospective and Enhancements" PDF, part III.A: Framestore, Unity, EA… Everybody breaks the hard-coded specification because it makes little sense to throw an established ground truth out of the window.
"Often, when the ACES system is used, the client look transform is concatenated with the inverse RRT and inverse ODT into a LMT so as to completely cancel out the ACES look. Recent experience at Framestore and Eclair has seen projects where this has been the case. Even outside of the motion picture industry, discussions with Unity Technologies have shown that the RRT contrast was deemed too high"
Reading into it, it's obvious they are reusing old assets (sRGB-authored ground-truth materials) directly in ACES with your recommended IDT. I don't find the RRT contrasty or harsh: if you author your material from scratch under the ACES sRGB viewing transform, you have total control over the look of your material and albedo map. Honoring the old sRGB ground-truth material is my personal solution to a common problem.
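The cancellation described in the quoted passage can be sketched as function composition. The curves below are toy, invertible stand-ins (a Reinhard-style tone curve and a gamma), not the actual RRT/ODT, and `client_look` is purely illustrative:

```python
# Toy invertible stand-ins for the RRT and ODT (NOT the real transforms),
# just to show how concatenating a client look with their inverses into an
# LMT cancels the ACES look, as the quoted passage describes.
def rrt(x):      return x / (x + 0.18) * 1.18
def rrt_inv(y):  return 0.18 * y / (1.18 - y)
def odt(x):      return x ** (1 / 2.2)   # toy display encoding
def odt_inv(y):  return y ** 2.2

def client_look(x):  # whatever grade the client wants (illustrative)
    return min(1.0, 1.1 * x)

# LMT = client look followed by the inverse ODT and inverse RRT
def lmt(x):
    return rrt_inv(odt_inv(client_look(x)))

x = 0.35
through_aces = odt(rrt(lmt(x)))  # full ACES output chain with the LMT applied
assert abs(through_aces - client_look(x)) < 1e-9  # RRT/ODT fully cancelled out
```

In other words, the pipeline still nominally runs through the RRT and ODT, but the LMT has pre-undone them, so only the client look reaches the screen.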
To further illustrate how ubiquitous embedded filmic looks are in authored materials, check this GIF I made. It shows the PBR SmartFit filter mathematically correcting (and being generous with the limits) Substance Painter's bundled materials: proof enough that they were not even PBR, much less physically accurate acquired maps.
And this is a recurring pattern in other packages and shared materials. This covers only PBR validity, which can easily be fixed even when the filmic look is embedded, so imagine how many PBR-valid but physically inaccurate materials are floating around.
The complex BxDF and light transport interactions are why I would not consider your suggested workflow in the first place: applying the reverse View Transform to textures will not bring back the previously rendered sRGB View Transform look anyway.
I already explained that.
the material (passed through the sRGB viewing transform) matches our sRGB ground truth, the render doesn’t because the RRT is applied also to the light, something we couldn’t embed when authoring the sRGB material.
To prove it, check the sRGB color charts in the OP: when the light component is removed with a surface shader, you get a match with the ground truth.
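The asymmetry is easy to show numerically with a toy tone curve standing in for the View Transform (a hypothetical Reinhard-style curve, not the real RRT). Un-viewing a texture round-trips on an unlit surface shader, but stops cancelling the moment light multiplies in:

```python
# Toy tone curve standing in for a filmic View Transform (a Reinhard-style
# curve, NOT the actual RRT), normalized so view(1.0) == 1.0.
def view(x):
    return x / (x + 0.18) * 1.18

def inverse_view(y):
    return 0.18 * y / (1.18 - y)  # algebraic inverse of the toy curve

albedo = 0.5  # sRGB ground-truth texel (toy scalar value)
light = 2.0   # scene light intensity above 1.0

# Un-viewing the texture round-trips on an unlit surface shader...
assert abs(view(inverse_view(albedo)) - albedo) < 1e-9

# ...but once the light multiplies in, the nonlinear curve no longer cancels:
shaded_ground_truth = albedo * light              # what we authored against
shaded_aces = view(inverse_view(albedo) * light)  # inverse-VT texture through the VT
print(shaded_ground_truth, shaded_aces)           # 1.0 vs ~0.70: the looks diverge
```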
To your question:
How many times are surface shaders that are not light sources rendered and end up in final frames on shows?
On the "complex BxDF and light transport" and surface shader note: following a diffuse reflection model, the next quote should hold.
Diffuse Albedo: How bright a surface is when lit by a 100% bright white light (...) with 1 in brightness and point it directly on a quad mapped with a diffuse texture, you get the color as displayed in Photoshop. (Sébastien Lagarde)
So pretty common. It is the principle behind Macbeth-chart-based color correction.
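Lagarde's claim follows directly from the Lambertian shading equation; a minimal sketch, with all names illustrative:

```python
# Minimal Lambertian diffuse sketch of the quoted claim: with a 100% bright
# white light pointed directly at the surface, the shaded result equals the
# albedo texel as displayed.
def lambert(albedo, light_intensity, cos_theta):
    return albedo * light_intensity * max(0.0, cos_theta)

texel = 0.42                    # linear albedo value sampled from the texture
lit = lambert(texel, 1.0, 1.0)  # unit white light, quad facing the light
assert lit == texel             # shading reproduces the texture color exactly
```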
I shoot a lot of HDRIs myself, process them with my own code, and I use sRGB as the encoding space very often.
When half of the work is not done correctly (tagging the chromaticities attribute in the OpenEXR), you can hardly do worse. Wide gamuts make sense with the higher bit depths common in high dynamic range images, to prevent quantization errors. A correctly authored HDRI is in a wide (wider than sRGB) gamut and tagged in the file's metadata, if not explicitly stated.
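The bit-depth argument can be checked with a quick round-trip quantization sketch (illustrative and codec-agnostic, not any specific file format):

```python
# Round-trip quantization at two bit depths: the same value stored with more
# bits comes back with less error, which is why wide gamuts (larger per-code
# steps over a bigger color volume) want higher bit depths.
def quantize(x, bits):
    levels = (1 << bits) - 1
    return round(x * levels) / levels

value = 0.123456
err_8bit = abs(quantize(value, 8) - value)
err_16bit = abs(quantize(value, 16) - value)
assert err_16bit < err_8bit  # deeper encodings suffer less quantization error
```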
As to why AdobeRGB and not another space: I did some research, and AdobeRGB was a constant color space in HDRI creation; there were not many more choices than that and sRGB in the processing tools.
A 32-bit float HDR image in sRGB doesn't make much sense, which is another reason why I am using ACES for renders.
Who is to say that the HDRI author did not simply desaturate it? A ton of people online… end up with ACR tone curve embedded
If you start tweaking curves you are doing it wrong, as the image no longer represents the environment's scene-referred linear light. Which takes us back to the beginning, when we built our materials from non-accurately acquired resources (photographs, HDRIs) and simply tried to match the look of the material against the reference by artistically reverse-engineering it. The same is happening with HDRIs: people are fine-tuning them to represent what they consider a good photo, not an accurate representation of spectral data.
(if the HDRI) does not ship with a Colour Rendition Chart, he cannot expect to have physically correct results
A Color Checker or Colour Rendition Chart is not usually representative of physically correct colors and tones, and my next project goes in that direction. Check this video.
- Closing
I'm certainly not enjoying defending myself against your persistence in dismissing my words and hence my work. I'm wasting my time, and I guess you are too. When I see that someone did or said something I don't agree with, I don't go to their presentation thread and insistently try to put the work down; I didn't see you do this to Framestore, Epic Games and so on. I'm presenting a valid point, a solution to a recurrent topic. You can express your well-founded opinion, ignore it, or download it and use "Utility - sRGB - Texture" in the ACEScg filter. I'm not shoving this down anyone's throat; I shared the tools for free to allow any workflow.
An image speaks for itself. I am posting the correct, self-explanatory comparison image that should replace the one in the original post.
1) old sRGB ground truth
2) similar to "Utility - sRGB - Texture" IDT
3) Epic Games-style global look compensation
4) sRGB ground truth honored in an ACES environment (RRT still applies to light/shading)