Virtual Roundtable: ACES for TV/Broadcast/OTT

**Originally published at: https://acescentral.com/virtual-roundtable-aces-for-tv-broadcast-ott/**

Please join us for a Virtual Roundtable discussion on the topic of “ACES for TV/Broadcast/OTT” on Wednesday, May 20, 2020, at 10:00 am PDT (GoToMeeting instructions below). We’ve had a request to explore some possible use cases and basic technical requirements for using ACES in live broadcasts, multi-cam shoots, and other situations that were formerly described as “broadcast”…

Please feel free to discuss ACES for Broadcast in this thread!

Excellent, Steve. Shall we share the document @nick and I wrote prior to the meeting to start the discussion here?


Here it is. Looking forward to your comments and opinions.

ACES for TV/Broadcast/OTT
Multi-delivery and Archive Challenges of the Broadcast Industry
Virtual Roundtable, May 20th, 2020
Nick Shaw - Pablo Garcia

Introduction:
The broadcast industry is going through one of its biggest transitions in history. The coexistence of so many distribution standards (SDR, UHD-HLG, UHD-PQ, etc.) has created a requirement for the broadcast industry to deal with multiple deliverables (SDR and HDR). This challenge used to be exclusive to scripted and post-produced content.
There have been many different (and very often, very expensive) approaches to solving this need: dual-layer production, single-layer production, fully separate production, scene-referred, display-referred, etc., to name a few.
In an industry heavily led by manufacturers’ marketing and big broadcasters’ influence, an independent, solid and tested standard has become a worldwide request.

Main areas to discuss:
ARCHIVE: DOWNSTREAM COLOUR SPACES.
When television distribution progressed from standard to high definition, many studios were able to go back to an original film negative, with its inherently greater level of image detail, to create and sell new high-definition programming. So when an original film negative is the archived master, it can be used to render new display masters for downstream distribution targets. In contrast, digital workflows tend to finish their masters with the colour limitations of the displays on which they will be presented, and have therefore left us with an unfortunate legacy.
If a motion picture or television program shows signs of longevity, re-releasing will be on executives’ minds. And, like an overgrown digital jungle, digital distribution channels continue to change, and new display opportunities will arise.
Future-proofing of these assets has become a major concern for executives all around the globe.
A discussion of an appropriate file container, specific to this industry, should be considered.

COLOUR MANAGEMENT:
Live broadcasting and non-scripted content are the main fields within the scope of this study.
The jungle of colour converter equipment doing slightly different things (due to the vagueness of HDR standardization), manufacturers adding their proprietary looks in a world where workflows and equipment are often mixed, etc., is putting the whole industry under significant strain in terms of cost (equipment, R&D, etc.).
An independent, industry-approved, standardized approach to colour management should solve most of these problems.
Fields to study and consider would be: graphic content creation, file-based workflows, and OB workflows.

Points of consideration:
Today’s difficulties in implementing ACES in broadcast environments:

  • RRT is too “cinematic”

  • Need for values over 100% IRE and under 0% IRE. Perhaps there could be optional ODTs
    which rolled off to 105% as per EBU R 103? (A sketch of such a roll-off follows this list.)

  • Transforms for playback archive: this could be mitigated by the creation of a pseudo-log
    format (HLG-like? 10-bit) which looked "reasonable" on a Rec.709 display without LUT
    capability. This encoding curve would not need a dynamic range anywhere near as great as
    ACEScct. Maybe an intermediate format of 10-bit ACEScct-proxy until a new HLG-like curve
    is developed. (See the second sketch after this list.)

  • Long-GOP codecs used for archive (10-bit). Container for 10-bit (MXF JPEG 2000?). Long-GOP
    archive: yes or no?

  • IP metadata (AMF stream?). This could be implemented for both industries once ACES 2.0 is
    implemented, as we can see more and more digital cinema cameras outputting ACEScct for
    on-set look management. See SR Live Metadata from Sony.
    Could there be some metadata flag which indicated that a particular image was display-
    referred (and its colour space?) and that it should be passed unmodified? That would be
    useful for colour bars and graphics. But would a dynamic output transform be too hard to
    implement universally?
    If implemented, it could be used to auto-configure colour converter boxes. This could be
    useful throughout ACES, not just for ACES Broadcast.

  • Transforms for archive playback. ODTs to live in equipment (CLF?).

  • Low-end broadcasters might struggle due to infrastructure.
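
The roll-off to 105% mentioned in the list could take many forms; EBU R 103 itself only defines the preferred signal ranges, not a curve. Below is a minimal sketch of one possible shoulder, where the knee position and the exponential shape are purely illustrative assumptions:

```python
import numpy as np

def softclip_to_105(x, knee=0.95, ceiling=1.05):
    """Hypothetical roll-off: linear below `knee`, then an exponential
    shoulder that approaches `ceiling` (105%, the overshoot allowed by
    EBU R 103) but never exceeds it. Signal is normalised so 1.0 = 100%."""
    x = np.asarray(x, dtype=np.float64)
    out = x.copy()
    span = ceiling - knee
    over = x > knee
    # Slope is 1.0 at the knee, so the join is smooth.
    out[over] = knee + span * (1.0 - np.exp(-(x[over] - knee) / span))
    return out

# A 109% overshoot is compressed to about 102.5%; extreme values
# asymptotically approach 105%.
print(softclip_to_105(np.array([0.5, 1.0, 1.09, 2.0])))
```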
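
The “10-bit ACEScct-proxy” idea could be prototyped by combining the published ACEScct curve (Academy spec S-2016-001, whose constants are used below) with a quantisation onto the 10-bit SMPTE legal range; that second step is the hypothetical part, and it also shows why a dedicated curve was suggested:

```python
import numpy as np

# ACEScct encoding constants from Academy spec S-2016-001.
X_BRK = 0.0078125
A, B = 10.5402377416545, 0.0729055341958355

def lin_to_acescct(x):
    """Scene-linear AP1 -> ACEScct (float; 0.0 maps to ~0.073)."""
    x = np.asarray(x, dtype=np.float64)
    return np.where(x <= X_BRK,
                    A * x + B,
                    (np.log2(np.maximum(x, X_BRK)) + 9.72) / 17.52)

def acescct_to_10bit_legal(cct):
    """Hypothetical transport mapping: scale ACEScct 0..1 onto the 10-bit
    legal range (64..940). ACEScct can reach ~1.468 for extreme highlights,
    which would clip here - one reason a curve with a smaller dynamic
    range than ACEScct was suggested above."""
    return np.clip(np.round(64 + cct * (940 - 64)), 64, 940).astype(int)

# Black, mid-grey and 100% white land at roughly CV 128, 426 and 550.
print(acescct_to_10bit_legal(lin_to_acescct([0.0, 0.18, 1.0])))
```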


In my mind the ideal scenario is that all equipment lives within an ACES ecosystem. Cameras apply an IDT internally to get into ACES before their physical output, and graphics are created directly in ACES (or ACEScg with an appropriate transform before output, I suppose). All equipment that handles the data (routers, vision mixers/switchers, record and playback devices [e.g. instant replay]) keeps everything in ACES space, then the RRT and ODTs are applied at final broadcast output, with the recorded archive being kept in ACES.
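
As a minimal sketch of that chain (the identity IDT matrix and the simple tone curve below are stand-ins; real IDTs come from camera manufacturers, and the real RRT/ODTs are published by the Academy as CTL):

```python
import numpy as np

# Stand-in 3x3 IDT matrix (camera-native linear RGB -> ACES2065-1).
# A real IDT is supplied by the camera manufacturer.
IDT_MATRIX = np.eye(3)

def apply_idt(camera_rgb):
    """Camera-native linear RGB -> scene-linear ACES."""
    return camera_rgb @ IDT_MATRIX.T

def output_transform(aces_rgb):
    """Stand-in for RRT + ODT: a crude highlight roll-off plus a display
    gamma, only to show where the conversion sits in the chain."""
    tone_mapped = aces_rgb / (1.0 + aces_rgb)
    return np.clip(tone_mapped, 0.0, 1.0) ** (1.0 / 2.4)

frame = apply_idt(np.array([[0.18, 0.18, 0.18]]))  # mid-grey test patch
# ... routing, switching, record/playback all stay in ACES space ...
display_frame = output_transform(frame)  # applied only at the final output
```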

In this scenario there are (at least) two major hurdles:

  1. You have to have a color corrector/LUT box to perform the RRT and ODT on every single display/monitor that you view images on inside the facility, broadcast truck, etc. This is a substantial amount of hardware.

  2. Current SDI standards are based on 10-bit data (nominally, although some 12-bit use is accepted), which is what the vast majority of broadcasters use, but ACES is 16-bit. As things move to IP-based architecture (particularly ST 2110), the bit depth problem becomes less significant, as the standard includes support for multiple bit depths (including 16-bit), although bandwidth would still be a concern. (A quick illustration of the bit-depth gap follows this list.)
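
To put a number on that gap: if scene-linear light were packed directly into 10-bit integers, almost all of the code values would be spent on highlights and the shadows would band badly, which is why log encodings (and ultimately 16-bit float) exist. A back-of-the-envelope illustration, assuming an arbitrary 0-16 linear range:

```python
# Code values available per stop of exposure for a 10-bit *linear*
# encoding, versus a log encoding that spends a fixed number of
# codes per stop (ACESproxy allocates 50 per stop).
codes = 1024
linear_max = 16.0  # illustrative linear range, 0..16

for stops_below_mid in range(6):
    hi = 0.18 / 2 ** stops_below_mid
    lo = hi / 2
    linear_codes = (hi - lo) / linear_max * codes
    print(f"{stops_below_mid + 1} stop(s) below mid-grey: "
          f"{linear_codes:.2f} linear codes vs ~50 log codes")
```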


Welcome to the ACES Central community, @garrett.strudler!
Many thanks for your message and thoughts; hopefully we’ll have more and more people dropping their ideas and wishes into this (hopefully happening) development.

About your ideal scenario: that’s the basic initial idea (and there are many other aspects of ACES that could be beneficial: metadata, etc.).

We need to remember that this is just a first contact with the broadcast industry to evaluate the needs and the possible development of an ACES Broadcast-specific standard (many notes about it in our previous post).

1. You have to have a color corrector/LUT box to perform the RRT and ODT on every single display/monitor that you view images on inside the facility, broadcast truck, etc. This is a substantial amount.

This could be implemented by manufacturers in many ways (like an ACES EOTF in monitors…), so cost shouldn’t be an issue (as long as manufacturers don’t go crazy charging for a firmware update!). This all very much depends on the industry putting pressure on manufacturers to implement ACES in their products…

2. Current SDI standards are based on 10-bit data (nominally, although some 12-bit use is accepted), which is what the vast majority of broadcasters use, but ACES is 16-bit. As things move to IP-based architecture (particularly ST 2110), the bit depth problem becomes less significant as the standard includes support for multiple bit depths (including 16-bit), although bandwidth would still be a concern.

This is noted, and a 10-bit standard (a new development) should/must be created, as noted in the previous post: “This could be mitigated by the creation of a pseudo-log format (HLG-like? 10-bit) which looked ‘reasonable’ on a Rec.709 display without LUT capability.” Hopefully 12-bit, and especially IP as you mention, will help in the future to expand and improve compatibility with existing ACES workflows.
Looking forward to hearing your comments.

Hi @garrett.strudler

Just another note on point 1: you would not necessarily need a LUT box per monitor. If you had a bank of monitors set up for 709 and a bank set up for 2020 HLG, you could have two LUT boxes total and switch/DA the two LUT box outputs to the two banks, for example.

Monitor manufacturers are slowly adding direct support for different inputs, and in time, as Pablo said, that should take the LUT box out of the equation on the output.


Monitoring

One of the issues is that a large portion of the monitors used in broadcast facilities and trucks (at least the ones I’ve seen/worked in) are not professional video monitors, but often consumer displays (either computer or TV depending on the size), so I’m not holding my breath that they will provide ACES implementation in the future. They do this for cost, obviously, but also because these displays are generally showing multiviews with multiple sources on each display (60"+). The advantage here is that a control room wall may only have 6-8 displays that need conversion (generally more for trucks, because they tend to use a larger quantity of smaller displays). True production monitors (e.g. Sony PVM OLED) that are color accurate tend to be reserved for the Program and Preview monitors for the Director, monitors for the Video Engineer (shader), and engineering, and possibly a few other places, which may very well be able to have in-built LUT processing.

That only works if the monitors are all showing the same source, which is often not the case in live broadcast. Let’s say, for example, they are set up as four “pairs” of monitors, with one showing 709 and the other 2020 HLG. If those are supposed to show four different cameras, then each pair needs its own conversions (either two LUT boxes, or a LUT box with two discrete outputs), since you are dealing with four different sources. As I mentioned above, though, this scenario would likely be pretty rare, as those four cameras would all be in one multiview and most monitoring would only be in one colorspace (let’s say 2020 HLG), so at that point it’s actually only one LUT box. Where you might see the above example is in master control facilities that need to critically monitor multiple feeds.

Dual Production?
Another thought I’ve had (which would be more easily accomplished in an IP infrastructure) is something closer to a dual-production approach, in which sources output both an ACES-compliant format (ACEScc seems like a good candidate) and a display-referred output. Any display used for non-critical monitoring (most of them) gets the display-referred output directly, so no LUT is needed. The ACES output goes to devices in the production pipeline, like vision mixers, video recorders, etc., so that everything stays in HDR and WCG through to archive. Logistically, in baseband video (SDI) this could get messy (you’ve now doubled your source count and have to route the correct sources), but in IP workflows it may be fairly trivial with metadata tagging, as the receiving device (display) could “ask” for the display-referred output from whatever source it was getting. Theoretically, in IP workflows the display-referred output could also be a lower bit depth and/or lower bitrate to save on overall bandwidth.
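
In pseudo-code, the dual-essence idea might look something like this (the stream registry, tag names and addresses below are all illustrative, not fields from ST 2110 or any published registration schema):

```python
# Each source advertises two essences of the same picture.
SOURCES = {
    "cam1": {
        "scene-referred": "239.10.0.1:5004",    # ACES, full bit depth
        "display-referred": "239.10.0.2:5004",  # baked output, lighter
    },
}

def subscribe(source: str, preference: str) -> str:
    """A non-critical display asks for the display-referred essence;
    mixers, recorders and archive ask for the scene-referred one."""
    return SOURCES[source][preference]

print(subscribe("cam1", "display-referred"))  # -> 239.10.0.2:5004
```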

ACESproxy?
Is there a reason ACESproxy doesn’t already “fit the bill” in SDI-based workflows? I don’t know why I didn’t think of it earlier, but that’s precisely why it was created, right? Certainly it’s more limited, being 10-bit integer rather than float, but it would be fairly easy to standardize on, since it’s already out there. Theoretically you could do it as 12-bit (integer) as well, if all the equipment supported it.
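
For reference, the 10-bit ACESproxy encoding (Academy spec S-2013-001) is simple: 50 code values per stop, pivoted so that mid-grey lands near CV 426, within the SMPTE legal range. A straightforward transcription:

```python
import numpy as np

# 10-bit ACESproxy constants from Academy spec S-2013-001.
CV_MIN, CV_MAX = 64, 940  # SMPTE legal range
STEPS_PER_STOP = 50       # code values per stop of exposure
MID_CV_OFFSET = 425
MID_LOG_OFFSET = 2.5

def aces_to_proxy10(lin):
    """Scene-linear ACES (AP1 primaries) -> 10-bit ACESproxy code values."""
    lin = np.asarray(lin, dtype=np.float64)
    cv = (np.log2(np.maximum(lin, 2.0 ** -9.72)) + MID_LOG_OFFSET) \
        * STEPS_PER_STOP + MID_CV_OFFSET
    return np.clip(np.round(cv), CV_MIN, CV_MAX).astype(int)

def proxy10_to_aces(cv):
    """Inverse: 10-bit code values -> scene-linear ACES."""
    cv = np.asarray(cv, dtype=np.float64)
    return 2.0 ** ((cv - MID_CV_OFFSET) / STEPS_PER_STOP - MID_LOG_OFFSET)

# Mid-grey (0.18) -> CV 426; the curve spans about 17.5 stops.
print(aces_to_proxy10([0.0, 0.18, 1.0, 222.0]))  # [ 64 426 550 940]
```

One caveat worth noting: the spec positions ACESproxy as a transport and on-set monitoring encoding that should not be written to files, which is presumably part of why the archive question above is still open.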

Metadata

I think the idea of metadata, particularly in IP workflows, will be very helpful, obviously. The one area for sure where this gets sticky is when signals get processed through a vision mixer/switcher. What happens when you combine two images on screen like a picture-in-picture or split screen? Or even worse if something is semi-transparent like a graphic element and you can see both sources at the same pixel location?
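
To make the compositing problem concrete: a blend is only meaningful after both sources are brought into one common state, and the result can then carry only a single tag. A sketch, where the tag names and the crude inverse transform are assumptions for illustration:

```python
import numpy as np

def inverse_odt_rec709(pixels):
    """Crude stand-in for an inverse output transform: undo only the
    display gamma. A real inverse would also undo the tone curve."""
    return np.asarray(pixels, dtype=np.float64) ** 2.4

def to_scene_referred(pixels, tag):
    """Normalise any tagged source to scene-referred before mixing.
    The tags are hypothetical, not a published metadata schema."""
    if tag == "aces":
        return np.asarray(pixels, dtype=np.float64)
    if tag == "display-referred-709":
        return inverse_odt_rec709(pixels)
    raise ValueError(f"unknown tag: {tag}")

def blend(bg, bg_tag, fg, fg_tag, alpha):
    """Picture-in-picture / semi-transparent overlay: the mix happens
    in one agreed space, and the output is tagged "aces" as a whole;
    per-source metadata no longer applies to the blended pixels."""
    return alpha * to_scene_referred(fg, fg_tag) \
        + (1.0 - alpha) * to_scene_referred(bg, bg_tag)

# Half-transparent 709 graphic over a live ACES camera feed:
out = blend([0.18, 0.18, 0.18], "aces",
            [0.9, 0.9, 0.9], "display-referred-709", 0.5)
```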

Hopefully I don’t sound like too much of a downer, ha! I really like the idea and implications of something like ACES in the broadcast world and what that means for both simultaneous multi-format broadcasts, and archive.

Hi all - sorry it took so long to get a recap of the meeting out; it’s my fault, since I’ve been slammed with other things. For those of you who attended, feel free to add anything I may have forgotten and edit anything I may have gotten incorrect.

Recap of ACES for TV/Broadcast/OTT:

This meeting was to continue the discussion from the HPA Breakfast Roundtable on whether ACES needs any modification to be used by broadcasters. Pablo Garcia from Mission Digital presented his use cases for ACES in live broadcast with multi-camera and multi-outputs and brought up the points that are listed in the discussion above. Alex Frausto from Netflix explained his need for color management for non-live productions which are also multi-camera and multi-outputs, as well.

There were no conclusive results from the discussion, but there do appear to be some challenges with color management in both live broadcast and non-live post workflows. More discussion is needed; an “ACES 101” for people unfamiliar with ACES would be helpful, as would some diagrams comparing the current workflows for live, non-live and ACES/film.

Some of the challenges brought up:

For Live (in addition to the ones that are in Pablo’s posting)

  • “Jungle” of color converter equipment – which is the right way?
  • Looks are determined by the camera manufacturer, but making different ones match is complicated
  • What to archive?
  • Lots of re-use of old/existing and new assets
  • Everyone uses a different monitor
  • Is there an “economic” case for ACES in live broadcast?

For Non-Live

  • Some shows might have 20-30 cameras – not feasible to put LUT boxes on every camera
  • Mix of HDR and SDR monitors on set
  • EXRs are too heavy to use as interchange
  • How would they sell ACES to a DP and director and make sure it’s accurate?

Next Steps
Encourage conversation on ACESCentral and poll the group to see if there is enough interest to call another Roundtable in the next several months.