In the real world, light levels vary enormously. Our current screens can't accurately reproduce images that include light sources, or reflections of them: to do so would require a massive increase in brightness. But it needs to be done. Phil Rhodes explains the science needed for this next artistic leap
Let's try a quick experiment: Google up a nice picture of the sun, and then see if the light coming out of your monitor is casting your shadow on the wall behind you. No? Well, Dolby may have the answer. Sunglasses at the ready.
It is no secret that effectively all current display devices will fail that sort of test; the backlight on your 24-inch desktop TFT is probably in the 40-watt range, whereas the sun emits something like 380,000,000,000,000,000,000,000,000 watts. Even though your monitor is slightly closer than the sun and the inverse square law is a great and powerful thing, there's no way that any current display can render a photographic image of the sun with literally correct luminosity. For that matter, we don't even need to go to the extremes of our local star. The filament (or arc tube) of a common lighting instrument, especially one with collimating optics between it and the observer, will be many, many times brighter than any monitor can reasonably display. You would, literally, need sunglasses.
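To put rough numbers on that, here's a quick back-of-the-envelope sketch using the inverse square law. The 40-watt backlight and 60-centimetre viewing distance are assumptions, and a monitor is nothing like a bare isotropic radiator, so treat the result as order-of-magnitude illustration only:

```python
import math

# Back-of-the-envelope only: the sun's output versus a desktop monitor's
# backlight, both treated (crudely) as point sources obeying the inverse
# square law. The backlight wattage and viewing distance are assumptions,
# and a monitor is nothing like an isotropic radiator.
SUN_OUTPUT_W = 3.8e26      # total radiated power of the sun, watts
EARTH_ORBIT_M = 1.5e11     # mean earth-sun distance, metres
BACKLIGHT_W = 40           # assumed 24-inch desktop TFT backlight
DESK_DISTANCE_M = 0.6      # assumed viewing distance

def irradiance(power_w, distance_m):
    """Inverse square law: power spread over a sphere of radius d."""
    return power_w / (4 * math.pi * distance_m ** 2)

print(f"Sunlight at earth's distance: {irradiance(SUN_OUTPUT_W, EARTH_ORBIT_M):,.0f} W/m^2")  # ~1,344
print(f"Backlight at your desk:       {irradiance(BACKLIGHT_W, DESK_DISTANCE_M):.1f} W/m^2")  # ~8.8
```

Two orders of magnitude short even on this crude measure, and that's before we consider that only a fraction of the backlight's power actually emerges from the panel as visible light.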
Given our near-obsessive concern over dynamic range in camera systems, the dynamic range of monitors would seem like a reasonable target for research. Dolby won an Emmy for their current top-of-the-tree reference monitor, the PRM-4220, so it's no surprise that they've come up with a genuinely arresting idea: a monitor around ten times brighter than the current average. This is currently very much a lab toy, because the immensely powerful backlight requires active liquid cooling, but there are already rumblings of consumer versions (questions about anyone's ability to actually make them notwithstanding), as well as talk of new mastering techniques to allow film and TV productions to take advantage of the uprated technology.
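For a sense of scale, here's roughly where such a panel would sit against other familiar luminances. The cinema figure follows from the 16 foot-lambert projection standard mentioned below; the television and sun values are approximate, round-number assumptions:

```python
# Rough luminance comparison in nits (cd/m^2). 1 foot-lambert = 3.426 nits;
# the television and sun figures are approximate round numbers.
FL_TO_NITS = 3.426

luminances = [
    ("Cinema screen (16 fL standard)", 16 * FL_TO_NITS),  # ~55 nits
    ("Typical SDR television",         100),              # assumed figure
    ("Dolby prototype monitor",        4000),
    ("The sun's disc itself",          1.6e9),            # approximate
]

for name, nits in luminances:
    print(f"{name:32s} {nits:>13,.0f} nits")
```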
Anyone who's done any serious thinking about how moviemaking tools interact with the human visual system will already be drawing breath to object at this point, mainly because it's well-known that humans do not see absolute brightness. Our impression of the brightness of an object is subject to our interpretation of its surroundings, and it's a good thing that this relative interpretation appears to work in the context of film and TV pictures because otherwise every technology to date would have seemed unacceptably dull and gloomy. No extant technology, including film with its 16 foot-lambert screen brightness, comes anywhere near a literal representation of reality, but we deal with it without even thinking about it. So is more brightness required?
Well, it can be useful, in the same way that high dynamic range in audio can be useful: the availability of great contrasts between loud and quiet sections (or, for pictures, bright and dim ones) is a tool made available to the artist by the technology. But the ability to convincingly shock and surprise the audience, or to resolve bright specular reflections, is dependent upon contrast: a 4000-nit monitor is pointless if it doesn't have significantly more contrast range than existing types, and there hasn't been much talk of the technology required to make this a practical consumer reality, possibly because that technology doesn't exist. You can make a monitor as bright as you like, but if the black level increases with the white level, the eye will simply adjust out the change.
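To make that concrete with some illustrative figures: scale the white and black levels up together and the contrast ratio, which is what the adapted eye actually responds to, doesn't move at all.

```python
# Illustrative figures only: brightness alone buys nothing if the black
# level rises in step, because the contrast ratio stays identical.
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

sdr       = contrast_ratio(100,  0.01)  # decent current panel: 10,000:1
scaled_up = contrast_ratio(4000, 0.4)   # 40x brighter, blacks scaled too:
                                        # still 10,000:1, nothing gained
true_hdr  = contrast_ratio(4000, 0.01)  # bright whites, OLED-class blacks:
                                        # 400,000:1, which is the real prize

print(sdr, scaled_up, true_hdr)
```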
So the achievement here is not necessarily absolute brightness – you can do that by putting an HMI behind an existing TFT panel, if you can prevent it from catching fire. The issue is contrast, and if, as a side-effect, we get the ability to display 4000-nit highlights, then that's great. This does raise the spectre of how exactly material to suit these displays will be produced and distributed. Current cameras, such as Alexa and F65, already have greater dynamic range than current common displays, though perhaps not quite enough to supply a 4000-nit panel if it really did have black levels similar to the very best current types (meaning OLEDs). Distribution might require extended bit depth to reasonably quantise such a high-dynamic-range signal. We might, if you think about it, end up distributing 10-bit log.
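As a sketch of why a logarithmic encode makes ten bits go such a long way, consider spending code values per stop (per doubling of light) rather than per linear increment. The curve below is a generic one invented for illustration, not any real camera or broadcast transfer function:

```python
import math

# A generic 10-bit log encode, invented for illustration: spread the
# available code values evenly across stops (doublings of light) rather
# than across linear light levels.
STOPS = 16          # assumed dynamic range to be covered
CODES = 1024        # 10-bit signal
BLACK = 1.0         # linear level mapped to code 0 (arbitrary units)

def encode_log(linear):
    """Map [BLACK, BLACK * 2**STOPS] logarithmically onto 0..1023."""
    stops_up = math.log2(max(linear, BLACK) / BLACK)
    return round(min(stops_up, STOPS) / STOPS * (CODES - 1))

# Each stop gets ~64 code values. A linear 10-bit encode would instead
# spend 512 of its 1024 codes on the single brightest stop and leave
# the shadows starved.
for stop in range(0, STOPS + 1, 4):
    print(f"{stop:2d} stops above black -> code {encode_log(BLACK * 2**stop)}")
```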
Exactly how we do that is undefined at the moment, although it's easy enough to extend the specifications that characterise a digital signal, especially when a lot of future material will almost certainly be distributed over the internet, where it's much easier to update the receiving software than it ever was to make everyone replace a television or three. There are already 12-bit modes in HDMI, although it's not entirely clear how many devices (such as graphics cards and Blu-ray players) might support them. Some might be amenable to a firmware update; some might not. The display, of course, must be entirely novel. One crucial advantage is that, from an information-theory standpoint, it ought to be possible to distribute HDR pictures and derive standard-dynamic-range pictures from them automatically, give or take the need for caution when grading to ensure that both the HDR and the automatically derived SDR pictures look reasonable.
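To illustrate that last point, here's about the simplest global tone-mapping operator in the literature, Reinhard's curve. A real mastering pipeline would be considerably more sophisticated, and none of this should be read as Dolby's actual method; it's just a demonstration that the derivation can, in principle, be automatic:

```python
# Reinhard's global operator, L_out = L / (1 + L): a textbook way to fold
# unbounded HDR luminance into the unit range for an SDR display. Purely
# a demonstration; not Dolby's mastering pipeline.
def reinhard(luminance):
    return luminance / (1.0 + luminance)

# Normalised linear luminances: 1.0 = diffuse white, larger = highlights.
for L in (0.05, 0.5, 1.0, 4.0, 40.0):
    print(f"HDR {L:6.2f} -> SDR {reinhard(L):.3f}")
# Dark values pass through almost linearly, while extreme highlights are
# compressed asymptotically toward 1.0 instead of clipping.
```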
Like HD, and unlike 3D, it's difficult to argue against the idea that HDR is an upgrade. HDR displays and distribution systems ought to be backward compatible, so there are no enormous barriers to adoption. The only barrier would appear to be technical: the sheer practicality of building displays with ten times the backlight power, and far more contrast, than we've had to date. In some ways, though, that's quite encouraging: this is a nice example of someone coming up with something that would be nice and then trying to figure out how to build it, as opposed to the distressingly common modern curse of someone slapping a new label on an existing technology so they can sell it more. Top marks, then, to Dolby, for pushing the boat out, and for being a western technology company that's actually innovating. I wonder if they can make it stick in the marketplace; here's hoping.
Read more from Phil: Luminance and the role of Brightness in photography