In the third and final part of Red Shark Technical Editor Phil Rhodes' examination of the complexities of the Academy Color Encoding System (ACES), we look at how it's working out in the real world.
As we've seen, raw ACES data is not intended to be viewed directly, although it can be worked on by colour correction software, so long as that software knows what it's dealing with. Often, grading controls intended for log or Rec. 709 data don't behave quite as expected on linear ACES material, but this is really a user-interface consideration that software engineers can allow for.
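To make the distinction concrete, here's a minimal sketch (in Python, with names and scaling of our own invention, not from any spec) of why the same one-stop exposure change is a multiplication on linear data but a fixed offset on log data:

```python
import numpy as np

def exposure_linear(rgb, stops):
    # On scene-linear data such as raw ACES, one stop is a doubling of
    # light, so an exposure change is a simple multiplication.
    return rgb * (2.0 ** stops)

def exposure_log(log_rgb, stops, stops_per_unit=16.5):
    # On log-encoded data the same doubling becomes a constant offset,
    # because log2(2x) = log2(x) + 1; the 16.5-stop range here is an
    # assumed example, not a property of any particular curve.
    return log_rgb + stops / stops_per_unit

grey = np.array([0.18, 0.18, 0.18])    # scene-linear mid grey
print(exposure_linear(grey, 1.0))      # [0.36 0.36 0.36]
```

A slider wired up as an offset feels natural on log material and wrong on linear material, which is exactly the user-interface consideration mentioned above.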
To enable ACES to be viewed, with all the features people tend to want, the Academy has defined three further stages of modification: the look modification transform (LMT), for creative colour control; the reference rendering transform (RRT), which takes the high dynamic range ACES data and turns it into something suitable for displays in general; and the output device transform (ODT), which takes into account the characteristics of a particular display. In some situations, the reference rendering transform and output device transform are combined into a single operation applying to a particular monitor. It's also important to remember that we're speaking of the theoretical pipeline here; any or all of these stages may take place inside a camera to feed its monitoring outputs.
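As a rough illustration, the pipeline can be thought of as three composed functions. The sketch below uses crude placeholder maths of our own; only the ordering comes from the Academy's design:

```python
import numpy as np

def look_modification_transform(aces):
    # Creative colour decisions, applied while still in ACES space;
    # a mild warm cast stands in for a real grade.
    return aces * np.array([1.02, 1.0, 0.98])

def reference_rendering_transform(aces):
    # Stand-in tone curve taking HDR scene-linear data towards a
    # display-ready range; the real RRT is far more sophisticated.
    return aces / (aces + 1.0)

def output_device_transform(rendered):
    # Per-display encoding; a plain 2.4 gamma stands in for a real ODT.
    return np.clip(rendered, 0.0, 1.0) ** (1.0 / 2.4)

def view(aces):
    # LMT -> RRT -> ODT; in practice the RRT and ODT are often baked
    # into a single output transform for a given monitor.
    return output_device_transform(
        reference_rendering_transform(
            look_modification_transform(aces)))

print(view(np.array([0.18, 0.18, 0.18])))
```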
One key consideration is that some of these things need to work on film sets, where imaging devices such as cameras and monitors are generally connected together with SDI cables, which for the most part carry 10-bit signals. That is not really enough to contain the unadulterated ACES material without risking quantisation noise (that is, banding), so the Academy has defined a further standard, ACESproxy. This logarithmically compresses the dynamic range of the full ACES data to the point where it can reasonably be sent over SDI (or, presumably, 10-bit HDMI), processed through devices such as Fujifilm's IS-Mini box or a monitor with built-in lookup tables, and displayed. The Academy stresses that this approach is designed to make a DIT's life easier and should not be treated as a recordable, final signal.
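For the curious, the 10-bit ACESproxy encoding itself is a simple log2 mapping. The sketch below uses the constants we understand to be in the Academy's S-2013-001 document – codes from 64 to 940, 50 code values per stop, mid grey anchored near 425 – and they should be checked against the spec before being relied upon:

```python
import numpy as np

def aces_to_acesproxy10(lin):
    # Map scene-linear ACES values onto 10-bit ACESproxy codes:
    # 50 codes per stop, mid grey near code 425, legal range 64-940.
    # Values at or below 2**-9.72 land exactly on the minimum code.
    lin = np.asarray(lin, dtype=np.float64)
    cv = (np.log2(np.maximum(lin, 2.0 ** -9.72)) + 2.5) * 50.0 + 425.0
    return np.clip(np.round(cv), 64, 940).astype(np.int64)

print(aces_to_acesproxy10([0.18, 1.0, 16.0]))  # mid grey lands near 426
```

The point of the log curve is visible in the numbers: each doubling of scene light costs a fixed 50 codes, so an enormous linear range fits into 876 usable values.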
So far, so good. Defining a solution to a problem doesn't immediately solve that problem, however, especially when the solution relies so heavily on the involvement of manufacturers. Display manufacturers are, we might reasonably suspect, likely to be extremely cooperative in providing (or building in) output device transforms for their displays, since displays have long been the point at which lookup tables are loaded to solve problems that arose earlier in the chain. They're used to being maximally flexible and solving problems.
Camera manufacturers, on the other hand, have long been keen to emphasise their differing capabilities in pursuit of a competitive edge. Companies building cameras for the very highest end, including Arri and Sony, have so far been keen to get involved, as have Blackmagic, perhaps because they ship both cameras capable of high dynamic range work and the Resolve grading software in which much of the technology we've been discussing is implemented. Canon announced ACESproxy output on the C500 in 2013, although in general, ACES support is less complete on set than it is in postproduction. Still, these are manufacturers catering to informed customers with a need for correct colour management and, hopefully, the expertise to implement it.
It's fair to say, though, that there's been a certain amount of bet-hedging by organisations that must continue to support, and even expand on, non-ACES workflows because those workflows remain popular. For instance, the Sony FS7 provides (or, with an imminent firmware upgrade, will provide) the S-Log3 luma curve and S-Gamut3 colour (a gamut much larger than more traditional ones such as those of sRGB and Rec. 709). S-Log3 is effective, being designed to come close to the Cineon log curve that's been in use for a long time and is widely supported in colour correctors, although that approach has nothing to do with ACES.
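Sony publishes the S-Log3 curve, and a sketch of the encode looks like this (the constants are taken from our reading of Sony's technical summary and should be verified there):

```python
import numpy as np

def slog3_encode(x):
    # S-Log3 encode: linear scene reflectance in, normalised full-range
    # code value out. Above a small linear threshold it's a log10 curve;
    # below, a linear toe. Constants per Sony's published summary.
    x = np.asarray(x, dtype=np.float64)
    return np.where(
        x >= 0.01125,
        (420.0 + np.log10((x + 0.01) / (0.18 + 0.01)) * 261.5) / 1023.0,
        (x * (171.2102946929 - 95.0) / 0.01125 + 95.0) / 1023.0)

# Mid grey (18%) encodes to code value 420/1023, about 41 IRE on a
# legal-range 10-bit signal.
print(slog3_encode([0.0, 0.18, 0.90]))
```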
Input device transforms for these sorts of situations – to put the S-Log3/S-Gamut3 image data into ACES – exist, and can ease concerns over the gradability of wide-colour-gamut material such as S-Gamut3, but there are still at least as many settings to make and buttons to push. The key point is that, ideally, all those settings should simply read “ACES” rather than referring to a host of proprietary standards, even if that isn't quite the case yet.
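Conceptually, such an input device transform is just two steps: undo the log curve, then apply a 3x3 matrix into the ACES primaries. The sketch below follows that shape; both the decode constants and the matrix coefficients reflect our reading of Sony's published documentation and should be checked against it before production use:

```python
import numpy as np

def slog3_decode(code):
    # Inverse of the S-Log3 encode: normalised code value in,
    # linear scene reflectance out.
    code = np.asarray(code, dtype=np.float64)
    return np.where(
        code >= 171.2102946929 / 1023.0,
        10.0 ** ((code * 1023.0 - 420.0) / 261.5) * (0.18 + 0.01) - 0.01,
        (code * 1023.0 - 95.0) * 0.01125 / (171.2102946929 - 95.0))

# S-Gamut3 -> ACES2065-1 (AP0) matrix; each row sums to 1, which
# keeps neutral (equal-RGB) colours neutral through the conversion.
SGAMUT3_TO_AP0 = np.array([
    [ 0.7529825954, 0.1433702162,  0.1036471884],
    [ 0.0217076974, 1.0153188355, -0.0370265329],
    [-0.0094160528, 0.0033704179,  1.0060456349]])

def sony_idt(slog3_rgb):
    linear = slog3_decode(slog3_rgb)    # per-channel linearisation
    return linear @ SGAMUT3_TO_AP0.T    # gamut conversion into AP0

print(sony_idt(np.array([0.41, 0.41, 0.41])))  # mid grey in, ~0.18 out
```

Every camera log format needs its own version of exactly this pair of steps, which is why the IDT is the part of ACES that leans hardest on manufacturer cooperation.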
And yes, even though the Academy has been working on ACES for, arguably, around a decade, it still feels recent, having only just reached version one. Perhaps it's a little optimistic to hope that all the world's equipment manufacturers can be persuaded to provide the information that allows all of their data to be brought into a single, unified manipulation environment. ACES is trying to solve a very broad class of problems, and it isn't yet as simple as it ideally could be. Still, it does at least create a situation where correct results are possible, even if much of the old complexity has to remain because someone, somewhere will need all the options.
Perhaps in the future, especially if high dynamic range display technologies such as Dolby Vision gain traction, we could reach a situation in which all film and television equipment, from acquisition to display in the home, works to a single, unified standard with wide gamut and high dynamic range. It would certainly be nice. Until then, ACES at least offers the capability to solve the interoperability problems of non-709 workflows, although, like any standard, it's only as valuable as the degree to which it's adopted.