Using ACES for 360˚ VR

Having used ACES successfully on commercial and feature projects, I’ve found it an obvious colorspace for finding efficiencies in high-end 360˚ VR projects. It has been most valuable for matching cameras within the same array for a more accurate auto stitch. Additionally, we always have a variety of different camera arrays (RED Weapon for nodal, Google Odyssey with GoPros, Z CAMs, etc.), so it has become an effective tool in helping to simplify that process.

Thought this could be a good place to share notes and have a discussion about best practices and other advantages of implementing ACES in this new immersive (standard-less!) space.

Now if I could just convince folks to finish these projects in HDR…

Andrew Shulkind
Director of Photography
VR/AR/MR Consultant

Hi Andrew,

You hit on an important topic within the live-action VR community. The lack of standards in handling color was mentioned early in the first meeting of the new VR Committee a few weeks ago. It would be great to learn more about the VR projects you have done using ACES.

David Morin
Chair, Joint Technology Subcommittee on Virtual Reality

Hi @davidmorin, could you elaborate on:

The lack of standards in handling color was mentioned early in the first meeting of the new VR Committee a few weeks ago.

Cheers,

Thomas

Thanks for the notes. I’ll be doing a VR project soon and would be keen to know more about your experiences. What software were you using for grading, @ashulkind?

Thanks!

We did just one VR project in 2016, with time frames more like those of a commercial. For this reason we did not use ACES at the time, because of the additional time it would have taken to set up a workflow and the lack of VR/AR-specific technologies supporting it.
Actually, ACES is meant to simplify things (and in fact it does), but the software I have seen so far for VR/AR applications has not even reached the maturity of having a meaningful color management system at all.

That said, I am more than keen to start a discussion on this.
For example: would colorimetric camera matching benefit from being done directly on ACES footage, or would it be better done on the original footage using, for example, proprietary tools?
I think that, technically, bringing images into ACES as early as possible (even if just as a working space rather than a full ACES2065 render) is cleaner and more respectful of the ACES guidelines as they were drafted.
Input Transforms, on the other hand, map a “reference” camera of any given model (and possibly shooting settings) into ACES2065: they bring its scene-referred code values into ACES code values, which are still scene-referred. Therefore the code values of footage shot with same-model, same-settings cameras (as in a 360° rig), once brought into ACES, should already all match one another.
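To make that concrete, here is a minimal sketch of the idea, assuming an ACES OCIO config and the OpenColorIO v2 Python bindings; the config path and the “ARRI LogC” colorspace name are placeholders for whatever your config actually provides:

```python
import PyOpenColorIO as OCIO

# Placeholder path to an ACES OCIO config.
config = OCIO.Config.CreateFromFile("aces_1.0.3/config.ocio")

# Processor acting as the Input Transform: camera encoding -> ACES2065-1.
proc = config.getProcessor("ARRI LogC", "ACES - ACES2065-1")
cpu = proc.getDefaultCPUProcessor()

# Two cameras of the same model and settings produce the same code values
# for the same scene exposure, so after the IDT they land on the same
# scene-referred ACES2065-1 values and match without per-camera grading.
camera_a_pixel = [0.39, 0.39, 0.39]  # e.g. a mid-grey patch in camera log
camera_b_pixel = [0.39, 0.39, 0.39]
print(cpu.applyRGB(camera_a_pixel))
print(cpu.applyRGB(camera_b_pixel))  # identical ACES triplets
```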

I am keen to listen to others’ experiences and/or thoughts as well.

We used ACES for two projects. The best-case example was a project we did for The Strain at Comic-Con, which was a normal ACES pipeline; the other was for NCAA football. Honestly, I wish I could use the colorspace even more, but often I come up against a post house that has a different way of working, or we are using cameras that are so new that there isn’t an IDT yet.

Overall, we found significant efficiencies in the process and anticipate even more as these projects begin delivering.

Resolve (12, I think?) and Baselight

Hey guys, just a little heads-up on this matter:

SCRATCH and SCRATCH VR 8.6, which are both in open beta right now, fully support ACES 1.0.3 incl. ACEScct.
SCRATCH VR in particular has dedicated color tools and, as of v8.6, also its own stitcher (still WIP though) for VR work, which can load stitch templates from other apps such as PTGui, Hugin, or Autopano.
If you wanna give it a try, let me know.

Cheers,
Mazze

Yeah Scratch is on top of the VR game!

@ashulkind: are there any specific tools for VR in Resolve? I haven’t found anything…

Thanks!

I have heard rumblings about Assimilate being a potential option for ACES VR. Great suggestion! One significant issue is that there aren’t consistent pipelines for these workflows; every one of these projects is ad hoc, which contributes to the high cost of high-end VR. Assimilate could present a significant efficiency.

A

Hi Thomas - sure. The first meeting of the Joint Technology Subcommittee on Virtual Reality was held at Jaunt Studios in Santa Monica on January 11th. In his opening statement, Miles Perkins, VP of Communications for Jaunt, stated that live-action VR is still in a pioneering phase, with many tools and standards yet to be defined. The first example Miles mentioned to illustrate his point was color management. He did not elaborate on specifics, but there is a sense that if you shoot with a live-action VR rig today you have to handle the output as a VFX shot, with a lot of rig-specific custom work and manual labor taking place between the shoot and the viewing. The hope is that standards can help simplify the process, from color management to stitching to in-goggle viewing, and make this new medium available to more creatives.

Using ACES for a new project with the NBA for the All-Star Game. Still early days, but we need some IDTs for the Jaunt and the new Phantom!

Only just saw this thread.

We used ACES for the Lego Batmersive 360 experience.

It was the obvious choice as we use ACES for our normal feature animation pipeline.

The only thing we did that was unusual was developing a custom 200-nit semi-HDR ODT for the HTC Vive, which also used the wide (but not P3) gamut of its panel. Unfortunately the final deliverables in the wild were handled by another vendor and don’t take advantage of this, but it was cool to see internally.


Well done, Alex! A 200-nit semi-HDR ODT for the Vive? I’m curious: how did you guys determine that standard? Very interesting. Thanks so much for sharing!

Excellent, thanks for sharing, Alex!

The 200-nit semi-HDR tone curve was created by just reworking the SegmentedSplineParams_c9 section of ACESlib.Tonescales.ctl by eye.

We went through a similar process when we were developing a 250-nit ODT for using HP DreamColor displays with the backlight maxed out: just running two side by side, one at 100 nits using the standard transform, the other at 250 with the custom one, and tuning until we got something that felt similar in character to the stock transform, but with an additional stop and a half of headroom.
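As a rough sanity check on those numbers (just arithmetic, not the actual tuning), the raised peak alone accounts for most of that headroom:

```python
import math

# Extra headroom from raising peak luminance from 100 to 250 nits.
print(math.log2(250 / 100))  # ~1.32 stops; the remainder of the "stop and a
                             # half" presumably comes from the re-tuned shoulder.
```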

To tune the Vive ODT, we recalibrated one of the DreamColors to match the primaries, white point, gamma, and brightness of the Vive.

We did that dev work using my Pure Nuke RRT/ODT implementation, which lets you tweak it in more or less real time. Then we copy those values back into the CTL so we can rebuild/rebake our OCIO config from it.

The primaries and white point were given to us by Valve, but you could get them by profiling the headset directly too.
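For anyone doing the same from their own measurements, here is a minimal sketch (placeholder numbers, not the actual Vive values) of what those measurements feed into: turning primaries and a white point into the RGB/XYZ matrices a custom ODT or a monitor calibration needs.

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """CIE xy chromaticity plus luminance -> XYZ tristimulus."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def npm(primaries_xy, white_xy):
    """Normalised primary matrix (RGB -> XYZ), per the usual SMPTE RP 177 recipe."""
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries_xy])
    W = xy_to_XYZ(*white_xy)
    S = np.linalg.solve(P, W)  # scale each primary so R=G=B=1 lands on the white point
    return P * S

# Placeholder panel measurements -- NOT the real Vive numbers.
panel_primaries = [(0.670, 0.327), (0.225, 0.713), (0.147, 0.050)]
panel_white = (0.3127, 0.3290)

rgb_to_xyz = npm(panel_primaries, panel_white)
xyz_to_rgb = np.linalg.inv(rgb_to_xyz)  # the matrix a display-side transform would use
print(xyz_to_rgb)
```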
