So I am getting quite a bit of feedback on how the default output-sRGB is “too contrasty” and “too punchy” from people in various departments, mostly lighting.
So I compared it to the ARRI 709 LUT with a 709->sRGB transform added, and yes, the ACES output is a bit “harder”.
Is there a good reason that they are like this?
Are people generally using CDL or LMT adjustments in their OCIO config to lower the contrast while still keeping the versatility of the different ODTs for different displays?
Ideally I would like to have something similar to the ARRI rendering/tonemapping across sRGB, BT.1886 and DCI-P3 projectors. Using an LMT works pretty well for this, but maybe there are better ways.
Simply adjusting the gamma in the viewer sadly does not work because of a limitation with V-Ray’s frame buffer/IPR viewport.
I have the feeling I answer this question every 3 months
Irrespective of the ARRI comparison, a few questions:
Are your displays calibrated to 100 nits and are they using the sRGB EOTF (not gamma 2.2)?
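For anyone unsure why the sRGB-vs-2.2 distinction matters: the sRGB EOTF is piecewise (a linear toe plus a 2.4 power segment), while a pure 2.2 power renders shadows noticeably darker, which directly changes perceived contrast. A small sketch:

```python
def srgb_eotf(v):
    """Piecewise sRGB EOTF (IEC 61966-2-1): encoded signal -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v):
    """Pure power-law 2.2 display: encoded signal -> linear light."""
    return v ** 2.2

# Near black the two curves diverge: a display calibrated to pure 2.2
# shows darker shadows than one using the true piecewise sRGB curve.
for v in (0.02, 0.05, 0.1, 0.5):
    print(f"{v:.2f}  srgb={srgb_eotf(v):.5f}  gamma2.2={gamma22_eotf(v):.5f}")
```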
Is the surround of your artists dim?
Those are the requirements to make a proper assessment, but even with those checkboxes ticked you might still find it too contrasty. The practical and adopted solution is to slightly overexpose the imagery.
Sorry you have to answer this so often! So yes, we have sRGB screens, though from different manufacturers (hundreds of screens across many years of buying screens),
but they are not 100 nits; they are usually brighter, depending on artist preference. And we aren’t in a “dim surround”: it’s a normal office environment/lighting, as people don’t like to sit in the dark all day. Some departments use daylight LEDs, some do not; it’s just the reality of what people like.
Having displays and viewing conditions all over the place and not adopting the required colorimetry is kind of a recipe for disaster, it basically means that you cannot compare anything.
You can make relative comparisons on the same display and find the K1S1 better looking than ACES, but then you are not looking at ACES correctly, which makes the comparison flawed in the first place.
The way I would address the problem would be by introducing hardware-calibrated displays at some of the critical positions, and making critical colour and look decisions at those places. The second step, if deemed necessary, would be to roll an alternative ODT with a gain like Epic does; I linked the required OCIO ColorSpace to do that somewhere on ACEScentral.
So just a question: your link says Epic Games uses a 1.45 gain, does that mean a 1.45 multiply in linear?
So practically I would add that to an OCIO config using a CDL before the RRT+ODT, for something called output-srgb (bright), and then keep the original one and call it output-srgb (dim).
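For reference, the “CDL before the RRT+ODT” colorspace could be sketched like this in an OCIO v1 config. The colorspace names assume the stock ACES 1.0.x OCIO config; treat it as a sketch, not a drop-in:

```yaml
- !<ColorSpace>
    name: output-srgb (bright)
    family: Output
    bitdepth: 32f
    isdata: false
    allocation: uniform
    allocationvars: [0, 1]
    from_reference: !<GroupTransform>
      children:
        # ~0.54 stop push (log2(1.45) ≈ 0.54) applied in scene-linear ACES
        - !<CDLTransform> {slope: [1.45, 1.45, 1.45]}
        # then the stock RRT+ODT path
        - !<ColorSpaceTransform> {src: ACES - ACES2065-1, dst: Output - sRGB}
```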
I need to look more into how to do proper dim/bright surround compensation. Is it really just exposure? I thought it’s a bit more than that.
I agree, and that’s how it’s handled here, with key positions having the best screens. It’s always a money decision to not buy the Eizos and stick with whatever is “cheap” and sorta works; that seems to be the reality in VFX houses from what I can tell.
We do calibrate the screens to “match”, but not to a certain nit level, as again artists really can’t stand those dim monitors and just raise the brightness again… and I can understand that really well; my GUI monitor is currently sitting at 180 nits in a dim surround… I just kinda like it.
I also understand that we can’t really compare anything across monitors and viewing environments in a way that makes sense, and I am not looking for that either. We aren’t a DI/finishing place; most of the day people look at geometry and UIs.
Adding the 1.45 slope seems to get me closer to what I desire, still a bit “harsh” maybe. Just wondering if this would actually get me closer to getting the “same image” (you know, visually) across, let’s say…
a perfectly calibrated high-end Eizo running at 180 nits in a “bright” surround environment with the 1.45 multiply, and a dark/dim-surround cinema projector running DCI-P3, using the default ACES DCI-P3 ODT.
Guess I also need to check those things when I get back to the office; still working from home.
Ah, I forgot that my screen isn’t 100 nits but higher. So when I dial it down to 100 nits and turn on the daylight lamps in my basement office (grading-suite spec, with 18% neutral grey walls and a projector screen as background, etc.), I am getting really nice results from the 1.45 “gain”. But if I dial it up again I am increasing contrast, so I would need both the increase in luminance to fight the brighter surround as well as a lower contrast. Does that sound about right?
Are there any simplified functions for that? Like 100 nits with a dim (10 nit?) surround to 200 nits with a bright (30 nit?) surround? Wondering about the maths here.
Funnily enough, I am really close now to our internal viewing LUT that we made pre-ACES…
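For a rough first-order intuition only (this is a sketch, not a published model; proper answers live in colour appearance models): one common simplification treats the surround change as a gain plus a system-gamma tweak, i.e. more luminance to fight the brighter surround and less contrast. The gain and exponent below are illustrative guesses, not derived values:

```python
def brighter_surround_compensate(y, gain=1.45, exponent=0.9):
    """Toy dim->bright surround compensation on normalized linear
    luminance y: a gain (more light) plus a contrast-reducing power
    (exponent < 1). Both numbers are illustrative placeholders,
    not published values."""
    return (gain * y) ** exponent
```

Note how the exponent below 1.0 lowers overall contrast: doubling the input no longer doubles the output.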
I don’t think that is a proper characterisation of reality; you can certainly find studios where people have thought about the surround and are using hardware-calibrated displays. From my personal standpoint, considering that the display is the first interface to the work, it seems to be a total oversight from company owners to have people losing time, and thus company money, over poor and uncalibrated displays. Talking the same language starts by seeing roughly the same things.
Not really simple maths; it is really the realm of colour appearance models. There are many publications around that: look at CIECAM02 and CAM16 to get an idea of what is involved; we have an implementation of both here. Keep in mind that they were not designed to handle HDR imagery. Baselight’s TCAM is worth looking at because it deals with quite a lot of that.
I just keep hitting a brick wall any time I put this stuff up for discussion about surround… a dim surround and 100 nits lead to psychological issues, apparently… It’s just not possible to have people work 8-hour days in those conditions, I think. If you are in a grading suite for a day, sure… I personally can stand it very well, but many really, absolutely hate it.
People also like colored walls and warm lighting in their office; can I blame them for not wanting cold daylight bulbs? Not really…
I very much agree with the monitor statement though… But I am wondering how a built-in light meter would adjust the image based on lighting factors. Like, can I tell them “you are supposed to be 100 nits at a 5 nit surround, gamma 2.2, with a whitepoint of D65” and it automatically adapts those things (contrast, whitepoint, brightness) based on the surround lighting?
My end goal would be of course to see the same visual image across all monitors and projectors no matter how bright someone set them or in what surround environment they are.
Short term, just fixing the ODT like I did seems to work really well, practically. The visual difference between the P3 projector in a cinema and my gamma 2.2 calibrated screen at 200 nits is definitely closer now, while with the default ODT it was completely out of whack. I ended up not raising the exposure as much as Epic and lowering the contrast a bit; this seems to be a good sweet spot, at least for what we are dealing with right now. You know, first the low-hanging fruit.
Yeah, TCAM has amazing surround adaptation; generally Baselight gets many things right. I will look at CIECAM02 etc.
Why is it clipping off the black below 0.0017? Is there a reason for this? I don’t see how I would want this… ever?
Edit: I remember asking this question before, and it was explained that there is a cinema black set to 0.002 or something along those lines. It really gives us issues in compositing; we can’t see the edges of dirty mattes and stuff. It should never clip anything off; I really don’t like it.
Is there a better way than to make my own OCIO config/LUTs from the modified ACES source? Kinda sucks to lose compatibility with non-OCIO programs… it can never be easy, can it?
Quite obviously, a display does not have infinite dynamic range, unlike the scene, so you need to map a portion of the scene to the display black and white levels. Pure display blacks do not exist either, so…
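To make that clamp concrete: the ACES 1.x ODT CTL normalizes tone-mapped luminance between a cinema white of 48 nits and a cinema black of 0.02 nits, so anything darker than display black ends up at 0. A simplified sketch of that normalization step (not the full ODT, just the black/white mapping):

```python
# Display limits as defined in the ACES 1.x ODT CTL
CINEMA_WHITE = 48.0   # nits
CINEMA_BLACK = 0.02   # nits

def normalize_to_display(y_nits):
    """Map absolute tone-mapped luminance to a 0..1 display signal,
    clamping below display black and above display white."""
    y = (y_nits - CINEMA_BLACK) / (CINEMA_WHITE - CINEMA_BLACK)
    return min(max(y, 0.0), 1.0)
```

Anything below 0.02 nits collapses to the same display value, which is exactly the “cant see the edges of dirty mattes” behaviour described above.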
I’m not sure I understand what you are seeing; surely you are able to stop your comps up and down to assess that they look good at any exposure level? If you don’t do that, the colorist will do it for you and send your CG back to fix.
Yeah, that’s my point: when I raise the exposure level using the output-sRGB as a viewer, it is clipping values, so I am not seeing stuff I should be seeing.
But you know what, I was being stupid: I was using viewer gains after the OCIODisplay node I was using for testing, not the viewer process itself, so I was gaining the output of the viewer, not the input… what a rookie mistake. It’s still clipping values though, but I guess that doesn’t really matter then…
Then I really don’t know what the compers were complaining about anymore.
But here your Viewer is set to Raw, so I’m assuming that you are applying the ODT with a node; thus you are exposing display-referred values, i.e. the values post display rendering transform. Ain’t gonna work.
Yeah, sometimes you spend hours fixing an issue that never existed in the first place.
Anyhow, I even made a “fix” by using ACEScct instead of the linear-to-log2 48-nit shaper (which is the same as ACEScc, apparently)… plus a CDL to map 0 linear to 0 display.
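For reference, the reason ACEScct behaves well near black is its linear toe below the break point; a pure log2 shaper has no such segment, so 0 and negative values are undefined there. A sketch of the encoding, with constants from the ACES S-2016-001 spec:

```python
import math

# ACEScct encoding constants (S-2016-001)
X_BRK = 0.0078125
A = 10.5402377416545
B = 0.0729055341958355

def lin_to_acescct(x):
    """Scene-linear ACES -> ACEScct. The linear toe below X_BRK keeps
    0.0 (and small negatives) finite, unlike a pure log2 shaper."""
    if x <= X_BRK:
        return A * x + B                      # linear toe
    return (math.log2(x) + 9.72) / 17.52      # log segment
```

Note that 0.0 linear lands at roughly 0.073 in ACEScct rather than negative infinity, which is why the shaper itself stops clipping blacks.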
I was this close to changing the CTL “cinema black” in the ACES source and making an OCIO config from source… I had already started building stuff from source…