Theatrical film exhibitors have always seemed a little scared of TV. As a result, the cinema world has spent much of the last sixty or seventy years attempting to stay far enough ahead of television to maintain its own relevance, giving rise to dubiously successful technologies such as scratch-n-sniff cards, Cinerama, stereo 3D, stereo 3D, and stereo 3D.
On a more positive note, we owe the very existence of things like anamorphic lenses to the desire of 1950s movie moguls to conspicuously outdo the small screen. It's something of an irony, given the decidedly lukewarm reception 3D suffered time after time, that there's now a technology that's widely liked and entirely achievable, but which isn't being taken up in cinema nearly as fast as it is in the home.
The Digital Cinema Initiatives' specification for theatrical HDR actually comes off as rather unambitious at first glance. Back in 2019, DCI released paperwork calling for a mere 500 nits of output, which is less than some computer monitors manage without even promoting HDR as a feature. Even lower levels have been proposed.
It's not even as if more is particularly difficult. This discussion comes at a time when theatrical exhibition is increasingly looking toward direct-view displays – that is, LED video walls – to replace projection. It's been possible for some time to make a 4K LED video display small enough to fit in even quite a compact cinema auditorium. The best modern displays have a pixel pitch of under a millimetre, implying a 4K display under four metres wide, which is barely a worthwhile cinema screen to begin with. Larger displays with a larger pixel pitch still have more than adequate resolution. Size versus pixel count is no longer an issue – assuming we can persuade the studios to actually finish things in 4K.
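For the curious, the arithmetic is simple enough to sketch in a few lines of Python – a back-of-the-envelope illustration only, assuming the DCI 4K container of 4096 horizontal pixels and treating the pitch figures as illustrative rather than quoted from any particular manufacturer:

```python
# Rough sketch: how LED pixel pitch translates to screen width for a 4K wall.
# The 4096-pixel horizontal count is the DCI 4K container; the pitch values
# below are illustrative, not quotes from any particular manufacturer.

DCI_4K_WIDTH_PX = 4096

def screen_width_m(pixel_pitch_mm: float, width_px: int = DCI_4K_WIDTH_PX) -> float:
    """Physical width in metres of a display with the given pitch and pixel count."""
    return width_px * pixel_pitch_mm / 1000.0

for pitch in (0.9, 1.5, 2.5):  # millimetres between LED emitters
    print(f"{pitch} mm pitch -> {screen_width_m(pitch):.1f} m wide at 4K")
# 0.9 mm works out to roughly a 3.7 m screen; 2.5 mm to just over 10 m.
```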
Best of all, LEDs are massively powerful. Dolby's 4000-nit Pulsar display is widely cited as the brightest video display around, although LED advertising hoardings designed for outdoor use are often specified to 5000 nits for viewability in direct sunlight. As ever, we should bear in mind that HDR is not specifically about brightness but about contrast, and LED video walls have much the same absolute-zero black levels as OLEDs. They also offer perfect geometry and freedom from focus problems, and they're popular with architects who'd rather not give up space in the building for a projection room.
So why isn't it being done? Well, cost is one factor. Perhaps the dawn of 2022, after two years of crippling pandemic-driven under-attendance at movie theatres, is not a great time to beg for upgrades. Even in a new facility, a quality LED video wall is still more expensive than projection, although the commoditisation of LED panels has driven costs down massively and is likely to continue doing so. There are also some technical concerns; LEDs don't necessarily cover quite all of the DCI-P3 colourspace, although probably not to an extent that would be massively objectionable. At least two Chinese manufacturers, including Timewaying and Unilumin, have DCI-approved LED panels which are being installed in new-build cinemas.
The other issue is the sheer utility of multi-thousand-nit images, and that's not such an easy point to address. Much as Dolby specified the Vision system to handle up to 10,000 nits, showing such an image in the darkened environment of a movie theatre would invite immediate retina-searing unwatchability. Anyone who's seen the Pulsar in person will understand that even four thousand nits is not a very practical long-term viewing experience in a dimly-lit grading suite.
There's an interesting side issue in that unless the entire environment is painted Vantablack, projected cinema HDR struggles to maintain black level. Light directed at the screen is diffuse-reflected into the environment, bounces off anything it can find – particularly if the guy in the front row is wearing a white shirt – and may return to the screen to contaminate blacks. LEDs can do better here, since systems such as Sony's CLEDIS are designed to have deep black coatings on every part of the display that isn't an LED emitter. An inactive CLEDIS looks like a hole in reality.
DCI's current numbers suggest that cinema HDR will aim for brightness in the low hundreds of nits, extremely low black levels of 0.005 nits, and some interesting new specifications for high frame rate material. That last point is surprisingly relevant to the staggeringly high contrast of which these devices are capable; the judder of low frame rate material becomes more objectionable at higher contrast.
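As a quick sanity check on what those numbers mean – using the 500-nit peak from DCI's earlier paperwork and the 0.005-nit black level above, with the conventional projection row based on the standard 48-nit SDR white point and an assumed, typical 2000:1 sequential contrast – the ratios work out as follows:

```python
# Contrast arithmetic behind the numbers above. Peak and black levels for the
# HDR case are the figures quoted in the article; the SDR projection row uses
# the standard 48-nit DCI white point and an assumed ~2000:1 sequential contrast.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

displays = {
    "DCI HDR proposal (500 nits peak, 0.005 nits black)": (500.0, 0.005),
    "Conventional DCI projection (48 nits, assumed 2000:1)": (48.0, 48.0 / 2000.0),
}

for name, (peak, black) in displays.items():
    print(f"{name}: {contrast_ratio(peak, black):,.0f}:1")
# The HDR case works out to 100,000:1 -- which is why frame rate and judder
# suddenly matter far more than they did on a 2000:1 projected image.
```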
Still, these are not difficult numbers to achieve with any reasonable LED panel. The question remains how it's to be paid for, which ultimately has to come down to increased ticket sales arising from the audience expectation of a superior experience. The challenge is that many people currently enjoy good HDR displays at home, with 4K resolution, high frame rate support and many other features, at consumer prices.
As such, we might wonder whether the overall battle is already lost; whether TV has now simply outdone cinema, and whether what the cinema world is trying to do will arrive soon enough. Again, for an industry that's spent so much of its history trying to competitively out-technologise TV, it still seems odd that film distribution has been so slow to adopt a new development that's well liked and for which the available technology seems to have few drawbacks. That was never the case with 3D.
Anyone with a fondness for the flickering beam of a projector is probably already somewhat concerned at the plight of theatrical exhibition in the Netflix age. The technology exists, though, for cinema to make a massive comeback. Let's meet back here in about 2027 to discuss.