Here's another chance to see this article about how there's more to HD than we realise. It might be that we're just not using it properly!
Is HD good enough? It might be. It certainly looked good enough around eight or ten years ago when most of us saw it for the first time. In fact it looked incredible: it was hard to imagine that one day all video would be like this. So what happened? All the talk today is about a standard that's supposedly four times better than HD (in fact 4K has four times the pixels of HD, but only twice the linear resolution).
Well, nothing happened. We just got used to it, and our expectations have climbed. That's if you exclude the apparently still quite large number of viewers watching in SD on their HD sets, blissfully ignorant of the true nature of HD. These people, one imagines, will not be rushing out to buy a 4K TV any time soon.
The only sense in which HD isn't good enough is if you were to buy a TV that's four times the size of your current one. To keep the same visual quality, you'd need 4K instead of 2K. Such comparisons rarely arise in the natural world though, and meanwhile, rather surprisingly, most of us watch films in the cinema in a resolution that just about scrapes the bottom end of HD (unless we're in a theatre that's specifically equipped with a 4K projector showing 4K sources).
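To put rough numbers on that comparison, here's a quick sketch, assuming the usual 1920 x 1080 HD and 3840 x 2160 UHD frame sizes:

```python
# "Four times better" versus "twice the linear resolution", in pixels.
hd = (1920, 1080)
uhd = (3840, 2160)

pixel_ratio = (uhd[0] * uhd[1]) / (hd[0] * hd[1])  # 4.0: four times the pixels
linear_ratio = uhd[0] / hd[0]                      # 2.0: twice the width (and height)

print(f"{pixel_ratio:.0f}x the pixels, {linear_ratio:.0f}x the linear resolution")
# Which is why a screen with four times the area needs 4K to match the pixel
# density you get from HD today.
```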
But none of the above is the real point of this article, which is to argue that there's plenty of mileage left in good old HD. It's just that you have to use it properly.
The first time I ever saw Digital Betacam footage from a TV studio, it was the best thing I'd ever seen on TV, by far. If someone had told me it was HD and not SD, I'd have believed them: it was so sharp - and every single line had quite distinct information in it. It looked nothing like the fuzzy, mushy SD that arrived in viewers' living rooms in the days of analogue TV transmission, and equally nothing like the scrappy, pixelated, artefacted stuff that we've become accustomed to since Digital TV became the norm (in these parts, anyway).
That's part of the reason why DVDs still look so good: there's more bitrate available when playing back a DVD. While it's still compressed in pretty much the same way as Digital Terrestrial or Satellite TV, it isn't starved of bits in the same way. Clever variable bitrate encoding ensures that there are no bottlenecks, and so bad compression artefacts are rarely seen.
Blu-ray, which is true HD, is a big step up from DVD, and can look absolutely superb. You rarely hear people saying, as they watch their Blu-ray edition of Avatar, "if only it could be a little bit sharper". You probably wouldn't hear it much if they were projecting their Blu-ray onto a fifty-foot screen, either.
There's a lot in common between digital video and digital audio. What audio concedes to video in bitrates it claims back in complexity when you consider that a typical digital studio session will have perhaps 48 or even 96 tracks. It will probably be recorded at a higher sample rate than CD (96 or 192 kHz as opposed to 44.1 kHz) and a greater bit depth (24 bits as opposed to 16 bits).
In the early days of digital audio as a domestic format, we had CDs (44.1 kHz, 16-bit) and - for a while, until the format was scuppered by piracy paranoia - Digital Audio Tape (also 44.1 kHz, 16-bit).
The problem with early digital audio recording was that if there was an unexpected peak in the sound, it would use up more than the available 16 bits, and would crash into a hard brick wall. Digital distortion is very unpleasant and needs to be avoided at all costs. So early digital audio recording engineers played safe by treating their recording medium as if it only had 12 bits maximum. This gave them a margin of safety and lowered stress levels in the studio.
But it also lowered quality! The fewer bits, the coarser the sound. It's all very well if it's a fairly consistent, loud recording, because even 12 bits can do a good job, but if there's a quiet passage, like a softly blown flute or a piano chord decaying into silence, then you can end up with only two or three bits available to describe the instantaneous sound levels, and that isn't enough.
When this happens, what you hear is digital noise, as unrelated harmonics are generated by what becomes - whatever the shape of the original waveform - a blocky staircase. It's exactly the same as you see in digital video when there aren't enough bits available to accurately describe the subtle gradients in a blue sky, for example. It's called "quantization distortion" and it's very, very unpleasant.
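If you want to see that staircase for yourself, here's a minimal sketch (my own illustration, not anything from the article) that quantizes the same tone at a loud and a very quiet level, both stored with 16 bits per sample:

```python
# A minimal sketch of quantization distortion: the same 440 Hz tone recorded
# loud and very quiet, both rounded to 16-bit sample values.
import numpy as np

def quantize(x, bits):
    """Round samples in [-1.0, 1.0] to the nearest step of a signed integer scale."""
    steps = 2 ** (bits - 1)            # 32768 steps each side for 16 bits
    return np.round(x * steps) / steps

t = np.linspace(0, 1, 48000, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)

loud = 0.5 * tone                      # a healthy recording level (-6 dBFS)
quiet = 0.001 * tone                   # a chord decaying towards silence (-60 dBFS)

for name, sig in (("loud", loud), ("quiet", quiet)):
    q = quantize(sig, 16)
    noise = q - sig
    snr = 10 * np.log10(np.sum(sig ** 2) / np.sum(noise ** 2))
    print(f"{name}: {len(np.unique(q))} distinct levels used, "
          f"signal-to-quantization-noise ~{snr:.0f} dB")
# The loud tone sits well over 90 dB above its own quantization noise; the quiet
# one collapses to a staircase of a few dozen steps and under 40 dB, which is
# where the gritty, unrelated harmonics described above become audible.
```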
So, much of the time, the 16-bit medium that was CD audio was treated as a 12-bit medium.
What a waste!
But did it really make much of a difference? Yes, it did.
Some time in 1987 I became the proud owner of a Japanese-made DAT recorder. It was an AIWA Excilia. When I say it was Japanese, what I mean is that it was imported directly from Japan, because they weren't available in the UK at the time. It was the heaviest thing I've ever had to lift, and that was before you added the giant transformer needed to step the UK mains voltage of 240 volts down to the 110 volts that the recorder expected to see.
It came with a demo tape of various genres of music (I think the label was DMP), and one of them was a jazz big band session. I'd never heard anything like it.
It started off pretty quiet; so quiet, in fact, that you had to turn the volume up. What followed was about four minutes of very percussive, punchy music, superbly performed: this was a very tight band.
And then, after a kind of "false" ending, it started up again, with the loudest, most piercing, shrill note on a single trumpet that I've ever heard in my life.
Remember that you had to turn the volume up because the start of the piece was so quiet. It was easy to forget that the volume was so high because the background noise was so low. It was silent, in fact. So when the hyper-loud trumpet appeared, it was as if the bell of the instrument was actually inserted into your ear. It really was that loud.
Now, anyone who complains that 16-bit audio isn't enough should listen to this track. There is more than enough dynamic range in 16 bits to deliver almost any audio apart from the big bang itself, but it has to be mastered correctly.
Yes, there is a big difference between 24-bit audio and 16-bit. But almost any 24-bit recording can be reduced to 16 bits if it's done well and sympathetically.
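As a rough rule of thumb (my numbers, not the article's), every bit of depth buys you about 6 dB of dynamic range, which is why playing safe at an effective 12 bits gave away so much:

```python
# Back-of-envelope dynamic range for an ideal quantizer: roughly 6.02 dB per bit.
def dynamic_range_db(bits):
    return 6.02 * bits + 1.76

for bits in (12, 16, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB")
# 12-bit: ~74 dB, 16-bit: ~98 dB, 24-bit: ~146 dB. A well-mastered 16-bit track
# can span from near-silence to that ear-splitting trumpet; 24 bits mainly buys
# safety margin during recording and mixing.
```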
Now, that's audio, and we were talking about dynamic range there. If you want to talk about resolution, then you have to look at sample rates instead. The effect of higher sample rates in audio is fairly predictable: you can reproduce higher notes. How much that matters is a matter of debate, because even with 44.1 kHz sampling you can reproduce almost all of the typical human hearing range. What you can't do is guarantee that you'll be free of nasty aliasing effects, because the filtering you need to stop any frequencies higher than half the sampling rate creeping through is never perfect. When something does creep through, you get audio aliasing - exactly the same as the "waggon wheel" effect when the spokes move past the camera at the same rate as, or faster than, the film's frame rate. If they move faster, they appear to be going backwards, and when that happens with audio, it sounds terrible!
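Here's a small illustration of that audio waggon wheel (my own example frequencies, not anything from the article): a frequency that sneaks past the filter and exceeds half the sample rate folds back down to a lower, unrelated one.

```python
# Where an unfiltered tone of f_hz ends up after sampling at sample_rate_hz.
def aliased_frequency(f_hz, sample_rate_hz):
    nyquist = sample_rate_hz / 2
    folded = f_hz % sample_rate_hz
    return folded if folded <= nyquist else sample_rate_hz - folded

fs = 44100
for f in (10_000, 21_000, 25_000, 30_000):
    print(f"{f} Hz sampled at {fs} Hz is heard as {aliased_frequency(f, fs):.0f} Hz")
# 10 kHz and 21 kHz come through untouched, but 25 kHz and 30 kHz, which should
# be inaudible, fold back to 19.1 kHz and 14.1 kHz: new tones that were never
# in the original sound, which is why aliasing sounds so unpleasant.
```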
That's another discussion though. Let's turn now to video.
Do you remember the amazing TV series "Planet Earth" that first appeared around the end of 2006, if I remember correctly? It was one of the first wildlife documentaries to be shot in HD, and it was hugely successful, not only as a programme in itself, but as a way to demonstrate how much better HD was than SD.
There were long, swooping shots of flamingoes - millions of them - and it looked as though every bird was individually visible. Aerial shots of wildlife in the savannah took on an almost abstract form as the tiny animals below cast long shadows in the sand and left footprint trails that made the most intriguing patterns.
This was - and still is - breathtaking.
And this is the point, surely: if you make a film that's designed to exploit a format, it will push the perceived abilities of that format much further than if you merely make things the same way you always have done, without changing anything.
There is a lot of room left in HD for wonderful, high quality pictures.
So where does that leave 4K? Do we really still need it?
Actually, I would say that we do, more so than ever.
For a start, when you capture footage in 4K, it doesn't have to stay 4K. If you downsample from 4K to HD, the HD will look better. In fact, a small miracle can happen, where 4:2:0 4K video can become 4:4:4 HD video when it's downsampled. Even if this HD video is then reduced to 4:2:0 again, it will look better because all the approximations by the codec will have been based on a higher quality source.
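To see why, here's a simple resolution sketch (assuming the usual 3840 x 2160 UHD and 1920 x 1080 HD frame sizes): in 4:2:0 the two colour planes are stored at half the luma resolution in each direction, which for a 4K frame still leaves them at full HD resolution.

```python
# Plane sizes (width, height) for 4:2:0 versus 4:4:4 frames.
def plane_resolutions(width, height, subsampling):
    if subsampling == "4:4:4":
        chroma = (width, height)            # colour stored at full resolution
    elif subsampling == "4:2:0":
        chroma = (width // 2, height // 2)  # colour halved in both directions
    else:
        raise ValueError("unhandled subsampling scheme")
    return {"luma": (width, height), "chroma": chroma}

uhd_420 = plane_resolutions(3840, 2160, "4:2:0")
hd_444 = plane_resolutions(1920, 1080, "4:4:4")

print(uhd_420["chroma"])  # (1920, 1080): the 4K chroma planes...
print(hd_444["chroma"])   # (1920, 1080): ...already match what HD 4:4:4 needs,
# so downsampling a 4K 4:2:0 frame to HD can keep full-resolution colour.
```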
Remember that there is more to the quality of a digital image than just the number of pixels. The sensor, although an analogue device, is ultimately just one stage in the journey of a scene "out there" to its resting place on our retina (or, more correctly, in our perception).
The lens, lighting and the environment contribute as much, if not more, to the look of an image than the arrangement of pixels on a sensor. The sensor and its associated digital electronics and software have to be good, of course, but everyone who's worked in video for more than ten years must remember that even with SD, if you use a good lens, it looks better than if you use a cheap and nasty one. And you can always improve things with lighting!
And, just occasionally, or perhaps more often than that, it will be wonderful to see video that is even sharper, even more amazing than HD. Perhaps it will be some amazing coral reef, or every single blade of grass in a wildlife documentary. Imagine what it will look like when a production is made specifically to show off the resolution of 4K.
But remember that perceived contrast, saturation and sharpness matter more to viewers than absolute resolution. Just look at that Bentley ad shot on an iPhone!
Perhaps High Dynamic Range video is the next big step.