8K is most certainly here, and it isn't going away any time soon. But its ultimate success or failure could ride on that most basic of things, the humble HDMI cable. Can the standards keep up, and will the solutions being put forward be detrimental to the resolution being craved in the first place?
If there's a central question currently facing film and TV technology, it's the usefulness of ever-larger resolutions. It's actually quite easy to find people who recognise and value the extra capability of 4K, taking advantage of the ability to sit very near a very large display and experience something like the immersion of, say, a sports stadium, without the VR headset. We can't yet make the same statement about 8K, though, because it simply doesn't have the level of adoption we'd need to see in order to form an opinion.
Currently (in December 2018) that adoption doesn't seem to be a foregone conclusion, at least in terms of 8K displays showing 8K material. Quite apart from the widespread caution over the sheer usefulness of that many pixels, sales figures have been, frankly, rather disappointing. Yes, current 8K TVs are expensive, easily reaching into five figures, but that isn't particularly unusual for such a new technology. There are also currently only two players in the market: Samsung and Sharp.
It's worth a quick word about Samsung's QLED branding, which seems rather uncomfortably close, typographically, to OLED. QLED describes a backlight technology using LEDs based on quantum dots, which we talked about back in 2014. Quantum dots are a perfectly valid technology, one which has also been used by Sony, starting in their XBR television series, and in LEDs for lighting. The underlying (or rather, overlying) technology in Samsung's displays is, however, still an LCD. For their part, Sharp were the first company to show an 8K consumer display, their 85-inch model, at CES 2012, though such things have only hit showrooms in the last year.
Making LCD display panels with ever-larger pixel counts is really a matter of scaling an existing process. The high price of current 8K displays is likely at least partly down to the demand for defect-free panels – remember when manufacturers used to make excuses about dead pixels on even tiny LCD displays? An 8K display has 33 million RGB pixels and thus very nearly 100 million individual LCD cells, and some LCD structures might even double that number again, depending how the LCD in question is built. Yet we're now capable of making 4K displays without them costing a fortune, so perhaps the 8K production lines can be made reliable with a bit of time and experience.
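If you want to check that arithmetic yourself, a quick back-of-envelope sketch (assuming the standard 7680 x 4320 8K UHD raster and one LCD cell per red, green and blue subpixel) looks like this:

```python
# Back-of-envelope subpixel count for an 8K UHD LCD panel.
width, height = 7680, 4320
pixels = width * height    # ~33.2 million RGB pixels
cells = pixels * 3         # one LCD cell per red, green and blue subpixel
print(f"{pixels / 1e6:.1f} million pixels, {cells / 1e6:.1f} million LCD cells")
# Panel structures with extra subpixels or a second LCD layer, as noted
# above, push the cell count higher still.
```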
Still, even taking into account the cost issues, sales forecasts have fallen precipitously since April this year, when pundits were pushing the idea that more than eighty thousand units might ship. Small fry, perhaps, compared to a global TV market in the millions, but significantly larger than the 18,000 or so 8K devices which actually sold.
Why?
For a start, there's an almost complete lack of content: YouTube has 8K material, and the super-keen technology enthusiasts at NHK managed to put 8K on air at the beginning of this month, but that's essentially it. There's also the small matter of moving 8K pictures from (say) a computer playing YouTube videos to a display. Most, if not all, of the current displays use some variant of HDMI 2.0, which offers about 14.4Gb/s of usable bandwidth. That's plenty of space for 4K images at up to 60 frames per second, though 8K formats seem to be pushing for even higher frame rates and, of course, involve four times the number of pixels.
HDMI 2.0 can handle 8K at up to 30fps, assuming 4:2:0 colour subsampling. It might be reasonable to ask for at least 60fps, which HDMI 2.0 cannot achieve. Frame rates below 60 make mouse cursors feel laggy, and for the time being, many 8K displays are likely to spend at least some of their time showing a mouse cursor. The interesting point here is that HDMI 2.1, which achieves a maximum data rate of 42.6Gb/s, can't go any higher than 30 frames per second at 8K in full RGB (4:4:4) either.
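The arithmetic behind those limits is easy enough to sketch. The Python below is a rough calculation only: it ignores blanking intervals and link-layer overhead (so real requirements are somewhat higher), assumes 8-bit colour, and uses the nominal 14.4Gb/s and 42.6Gb/s data rates for HDMI 2.0 and 2.1 rather than anything measured.

```python
# Rough uncompressed video data rates versus nominal HDMI payload limits.
# Blanking intervals and protocol overhead are ignored, so real-world
# requirements are somewhat higher than these figures.

def data_rate_gbps(width, height, fps, bit_depth=8, chroma="4:4:4"):
    """Approximate uncompressed data rate in Gb/s for a video format."""
    # Average samples per pixel for the common chroma subsampling schemes.
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * samples_per_pixel * bit_depth / 1e9

links = {"HDMI 2.0 (~14.4 Gb/s)": 14.4, "HDMI 2.1 (~42.6 Gb/s)": 42.6}

formats = [
    ("4K60 4:4:4", 3840, 2160, 60, "4:4:4"),
    ("8K30 4:2:0", 7680, 4320, 30, "4:2:0"),
    ("8K30 4:4:4", 7680, 4320, 30, "4:4:4"),
    ("8K60 4:4:4", 7680, 4320, 60, "4:4:4"),
]

for name, w, h, fps, chroma in formats:
    rate = data_rate_gbps(w, h, fps, chroma=chroma)
    fits = [label for label, cap in links.items() if rate <= cap]
    print(f"{name}: ~{rate:.1f} Gb/s -> {', '.join(fits) or 'needs compression'}")
```

Run that and 8K at 30fps in 4:2:0 squeaks under the HDMI 2.0 figure, 8K30 RGB needs HDMI 2.1, and 8K60 RGB exceeds even that – which is where compression comes in.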
Or rather it can, but only by using something called Display Stream Compression. DSC is described as “visually lossless,” meaning “not lossless,” and it is required to make ever greater resolutions practical over affordable cables. The question is whether the extra resolution is being effectively undone by the compression. It's probably safe to assume that it won't be entirely undone, since DSC is a fairly gentle 3:1 codec, though we should watch closely to see whether the industry prefers to surreptitiously wind up the ratio in order to advertise bigger numbers without doing any real engineering.
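Applying that nominal 3:1 ratio to the same back-of-envelope arithmetic (again ignoring blanking and link overhead, and assuming 10-bit colour) shows why DSC is attractive to the cable and chipset people: it pulls even 8K60 RGB comfortably under the HDMI 2.1 data rate.

```python
# A nominal 3:1 DSC ratio applied to the same back-of-envelope arithmetic
# (7680 x 4320 pixels, 60 fps, 10 bits per channel, RGB 4:4:4).
uncompressed_gbps = 7680 * 4320 * 60 * 3 * 10 / 1e9  # ~59.7 Gb/s, no blanking/overhead
with_dsc_gbps = uncompressed_gbps / 3.0               # ~19.9 Gb/s after 3:1 DSC
print(f"8K60 10-bit RGB: ~{uncompressed_gbps:.1f} Gb/s raw, "
      f"~{with_dsc_gbps:.1f} Gb/s with 3:1 DSC")
```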
The even more interesting question is, as ever, where this all stops. The resolution treadmill cannot go on forever. HDR is a valid attempt at an alternative, but even that has limits. Whether we settle on the idea that 4K is enough, or whether the predictions of million-unit 8K sales by 2020 are accurate, is hard to tell at this point. But the train will stop at some point. What happens then is anyone's guess, but it'll be a bigger upheaval than simply doubling the number before the K.
Image: Talaj / Shutterstock