All this talk of 4K, high frame rates, 3D, and ever more precision and information in video signals is a good thing, right? Well, yes, but at some point an engineering solution has to be designed to let it happen, and there's reason to believe we might be getting a bit ahead of ourselves.
NHK's Super Hi-Vision 8K camera prototypes need an armload of SDI cables to deal with the torrent of data produced by such a high-resolution device, and until recently, 4K was just as difficult. Postprocessing and storing camera data at beyond-2K resolutions can still make for cumbersome workflows. At NAB, Blackmagic released a whole load of 4K-capable gear using the new 6-gigabit SDI standard that had, at the time, not quite been finalised. And finally, right at the end of its press conference, Sony announced its new 4K TVs. Early-adopter prices are no surprise at this point, but they're still quite a lot cheaper than most of the 4K displays – such as Astro Design's DM-3410 series – that have existed to date.
This is therefore an interesting announcement from the perspective of someone whose lounge has a very big blank space on one wall. More relevantly for us, if you happen to be a cameraperson in need of monitoring, with clients who are clamouring for 4K but aren't particularly enthusiastic about paying extra for it, the appeal of this sort of thing is clear.
There are, however, issues. Few current 4K camera-recorder combinations provide HDMI outputs, and even if they did, current 4K consumer displays invariably use version 1.4 of the HDMI standard. Given the torrential amounts of data that 4K images represent, that's something of a problem. HDMI uses three channels of digital signalling, originally intended to carry one RGB channel each. Each channel is clocked at up to 340MHz using an encoding scheme that spends ten bits of signal to transport eight bits of data, in the interests of signal integrity. Each channel is therefore capable of shifting 2.72 gigabits per second of data, for a total of about one gigabyte per second for the whole HDMI connection.
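If you want to check my working, the sums are simple enough to spell out. This little Python sketch just reruns the figures above (a 340MHz clock, eight of every ten bits being payload, three channels) and nothing more:

```python
# Usable bandwidth of an HDMI 1.4 link, from the figures above.
TMDS_CLOCK_HZ = 340e6        # maximum clock per data channel
SIGNAL_BITS_PER_CLOCK = 10   # ten bits on the wire...
DATA_BITS_PER_CLOCK = 8      # ...carry eight bits of payload
CHANNELS = 3                 # originally one per RGB channel

per_channel_raw_bps = TMDS_CLOCK_HZ * SIGNAL_BITS_PER_CLOCK   # 3.4 Gbit/s on the wire
per_channel_data_bps = TMDS_CLOCK_HZ * DATA_BITS_PER_CLOCK    # payload only
total_data_bps = per_channel_data_bps * CHANNELS

print(f"{per_channel_data_bps / 1e9:.2f} Gbit/s of data per channel")  # 2.72
print(f"{total_data_bps / 8 / 1e9:.2f} GB/s for the whole link")       # 1.02
```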
And how does this compare to the demands of a 4K signal? Well, a 4096 by 2160 frame, which is the 4K size the HDMI specification works with, is about 25.3 megabytes of 8-bit RGB data. This means that an HDMI 1.4 connection is capable of transporting 4K frames at an absolute maximum of a little under 40 frames per second, and practical concerns such as blanking intervals will reduce that further. Slightly lower-resolution Quad HD modes at 3840x2160 are supported, but the reduced pixel count only raises the ceiling to about 41fps. People anxious to watch a potential Avatar sequel at 48 or 60fps are still out of luck.
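Carrying that bandwidth through to frame sizes and rates is equally mechanical. This sketch reuses the 8.16Gbit/s total from above and assumes 8-bit RGB with no blanking or protocol overhead, which is exactly why these numbers are absolute ceilings rather than practical figures:

```python
LINK_DATA_BPS = 8.16e9   # total HDMI 1.4 data rate, from above
BITS_PER_PIXEL = 24      # 8-bit RGB

for name, w, h in [("4K", 4096, 2160), ("Quad HD", 3840, 2160)]:
    frame_bits = w * h * BITS_PER_PIXEL
    print(f"{name}: {frame_bits / 8 / 2**20:.1f} MB per frame, "
          f"ceiling of {LINK_DATA_BPS / frame_bits:.1f} fps")
# 4K: 25.3 MB per frame, ceiling of 38.4 fps
# Quad HD: 23.7 MB per frame, ceiling of 41.0 fps
```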
People who are into high frame rate material or certain types of stereoscopy should already be worried. In practice it's worse still: the 4K modes actually specified in HDMI 1.4 top out at 30 frames per second, and at just 24 for the full 4096-pixel-wide frame. This could be increased using some of the subsampled, component YUV modes which are specified in other parts of HDMI, although that would represent an adulteration of the data and raise the spectre of colour gamut issues, as well as inviting compatibility problems with various equipment.
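In raw data terms, the appeal of subsampling is easy to quantify, though how much of the saving actually survives HDMI's signalling depends on how each mode is packed onto the link (4:2:2, as I understand it, is carried at the full pixel clock), so treat this as pixel arithmetic only, reusing the same 8.16Gbit/s figure:

```python
# Average bits per pixel for 8-bit YUV at common subsampling ratios:
# 4:2:2 halves the two chroma channels horizontally, and 4:2:0
# halves them vertically as well.
LINK_DATA_BPS = 8.16e9   # HDMI 1.4 total data rate, from earlier
SCHEMES = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}

for name, chroma_fraction in SCHEMES.items():
    bpp = 8 + 2 * 8 * chroma_fraction   # luma plus two subsampled chroma
    fps = LINK_DATA_BPS / (3840 * 2160 * bpp)
    print(f"{name}: {bpp:.0f} bits/pixel, ceiling of about {fps:.0f} fps")
# 4:4:4: 24 bits/pixel, ceiling of about 41 fps
# 4:2:2: 16 bits/pixel, ceiling of about 61 fps
# 4:2:0: 12 bits/pixel, ceiling of about 82 fps
```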
It's possible to aggregate more than one video link, as some early high-resolution devices have done. Apple's 30-inch Cinema Display used a dual-link DVI connection to allow compatible Macs to drive it at its full resolution and refresh rate. Theoretically, there's also a dual-link version of HDMI, the Type B connector, which puts a total of six (rather than three) data channels down a single connector, but to my knowledge it has never been implemented.
Happily, 2006, the year HDMI 1.3 set that 340MHz limit, is practically prehistoric in the world of cutting-edge audio-visual gear, and HDMI 2.0 should nearly double the maximum speed of the link to 600MHz. Still, televisions sold until HDMI 2.0 is implemented will inevitably use HDMI 1.4 inputs, so don't expect to display 4K on them at 60, let alone 120, frames per second: regardless of the capabilities of the display panel itself, it won't fit down the socket on the back.
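Rerunning the same arithmetic at HDMI 2.0's faster clock shows both the improvement and the remaining ceiling; as before, this assumes 8-bit RGB and ignores blanking:

```python
# Same sums as before, at HDMI 2.0's 600MHz maximum clock.
data_bps = 600e6 * 8 * 3   # data bits per clock x channels = 14.4 Gbit/s

for w, h in [(4096, 2160), (3840, 2160)]:
    print(f"{w}x{h}: ceiling of about {data_bps / (w * h * 24):.0f} fps")
# 4096x2160: ceiling of about 68 fps
# 3840x2160: ceiling of about 72 fps
```

Enough headroom for 4K at 60 frames per second, in other words, but still nowhere near 120.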