We’ve been having a lot of fun lately writing about 8K. Most of us at RedShark think, essentially, that it's a good thing. But this is not a binary position. The arguments are highly nuanced.
Right now, when we talk about 8K, it stirs up some strong emotions in our readers, and they’re sometimes far from positive.
Worryingly, some readers have accused us of holding positions that simply aren't ours. Like "you're just talking up 8K to force us into buying more kit that we can't afford". And "you make it sound like 8K is as obvious a thing to the average viewer as the leap from SD to HD".
We don’t believe either of these.
Nor do we think that the whole world - including the vast swathes of it still watching broadcast SD - is suddenly going to upgrade to 8K because it’s so wonderful and suddenly so affordable.
I want to deal with both of these and explain our actual, far more subtle position. I'm very happy to discuss this with people who disagree with us, but I'd rather do it on the basis of facts rather than assumptions.
So why do we like 8K? Well, it's certainly not because 4K is no good. Far from it. 4K is stunning, at least when it isn't compressed almost out of existence, to the point where you'd be better off watching HD from a Blu-ray than horribly compressed 4K.
But 8K is certainly better. You can't argue with that, unless your definition of "better" is fewer pixels and hence less detail.
How obviously better it is depends on the circumstances, of course. On a small screen, at a distance where you'd struggle to spot any improvement of 4K over HD, you're not going to see much benefit, if any, from 8K.
But on a display that’s big enough to make HD look terrible and where you can see the pixels in 4K, then good 8K will be an obvious improvement.
I don't think there's anything controversial about this. But just in case, let me put it in a priori (analytical) terms.
4K has four times the number of pixels of HD. If you put four HD panels in a two-by-two rectangle, you will have the same number of pixels as 4K, and your ad-hoc video wall will be twice the width of the HD screen. Here's the thing: the resolution in terms of pixels per unit of area on the "4K" wall will be the same as on the original, single HD screen. Of course it will: all the panels are identical.
What this means is that to have the same perceived pixel resolution on a 4K screen at the same viewing distance as an HD screen, you need to have a 4K display that’s twice the width.
There’s no argument about this. It’s a mathematical certainty.
Now take sixteen of those original HD screens and put them in a rectangle of the same shape as the original screen: four across and four down, for a total of sixteen. At the same distance from which you were viewing the original HD screen, these sixteen screens - four times the linear dimensions - will give you the same pixel density as the single HD screen. Watching a screen of this size in anything less than 8K from that distance will give you lower quality per unit of area than the solo HD display. (Perhaps we should add that in an ideal world you'd need slightly concave screens, so that you're the same distance from the centre as from the edges.)
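If you'd rather see that arithmetic spelled out, here's a minimal Python sketch of the tiling argument. It assumes the consumer UHD flavours of the formats (1920×1080 for HD, 3840×2160 for "4K", 7680×4320 for "8K"); the DCI variants have slightly different widths, but the reasoning is unchanged.

```python
# Pixel counts for the three formats, relative to a single HD panel.
formats = {
    "HD": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

hd_pixels = 1920 * 1080

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} px "
          f"({pixels // hd_pixels}x HD's pixels, {w // 1920}x HD's width)")

def pixels_per_sq_metre(w_px, h_px, screen_width_m, aspect=16 / 9):
    """Pixel density for a 16:9 screen of the given physical width."""
    area_sq_m = screen_width_m * (screen_width_m / aspect)
    return w_px * h_px / area_sq_m

# Scale the screen width by the same factor as the pixel width and the
# density never changes - exactly the video-wall scenario above.
print(pixels_per_sq_metre(1920, 1080, 1.0))  # single HD screen, 1 m wide
print(pixels_per_sq_metre(3840, 2160, 2.0))  # the 2x2 "4K" wall
print(pixels_per_sq_metre(7680, 4320, 4.0))  # the 4x4 "8K" wall
```

All three density figures come out identical, which is the whole point: more pixels only buy you more detail per unit of area if the screen doesn't grow with them.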
This scenario deals with all the arguments about resolution, pixel density and viewing distance.
You might want to say that this is an artificial, contrived set-up, but it's an important one, because it gives us a baseline that we can all agree on.
Moving beyond this, we get into territory where we talk about how much increasing the number of pixels on the same-size screen improves resolution.
As ever, it depends on how close you're sitting. Let's approach this incrementally and imagine that you're close enough to an HD screen to see individual pixels. Clearly, if you make those pixels smaller with a higher resolution display, you will see an improvement. Nobody has ever done it quite like this, because you can't just buy a monitor with, let's say, 5% more pixels. They don't make monitors like that, so you will have to do this in your head. But it's hard to see how you could disagree.
Now apply this principle across all scales and distances. At the point where you can just see pixels at any screen size, making those pixels smaller and hence more numerous will inevitably make the picture better.
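If you want to put rough numbers on "the point where you can just see pixels", a handy yardstick is pixels per degree of visual angle. The 60 px/degree threshold in the sketch below is our own assumption - a common stand-in for 20/20 acuity - not a figure from any broadcast standard, and the screen size and distances are arbitrary examples.

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Horizontal pixels packed into one degree of visual angle."""
    angle_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / angle_deg

ACUITY = 60  # assumed resolvability threshold, px/degree

for distance_m in (2.0, 1.0):
    for name, h_px in (("HD", 1920), ("4K", 3840), ("8K", 7680)):
        ppd = pixels_per_degree(h_px, screen_width_m=1.4, distance_m=distance_m)
        state = "visible" if ppd < ACUITY else "not visible"
        print(f"{name} on a 1.4 m screen at {distance_m} m: "
              f"{ppd:5.1f} px/deg - pixels {state}")
```

On these assumed numbers, 8K buys you nothing over 4K at 2 m from a 1.4 m-wide screen, but at 1 m the 4K pixels are still just resolvable and 8K makes them vanish - which is exactly the "it depends on the circumstances" point.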
Well, to take this to its limit, you could theoretically sit so far back that the TV is just a single point of light. I make this reductio ad absurdum point because nobody is going to claim that you can see detail at any distance. Results will vary depending on your spatial relationship with the screen.
So far, so much talk about pixels. But pixels aren’t the whole story.
It's common, and perfectly OK, to think of pixels as the "atoms" of a digital video image. Digital video is quantised spatially into pixels, and also into discrete levels of brightness and colour.
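As a tiny illustration of the "discrete levels" half of that - our own sketch, nothing 8K-specific - a continuous brightness value gets snapped to the nearest of the 2^bits available code values, so a 10-bit signal lands closer to the true value than an 8-bit one:

```python
def quantise(value, bits):
    """Snap a continuous 0.0-1.0 brightness to the nearest code value."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

true_value = 0.4321
print(quantise(true_value, 8))   # 0.43137... (256 levels)
print(quantise(true_value, 10))  # 0.43206... (1024 levels)
```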
But our brains (and ultimately our minds) don’t recognise pixels as a valid currency. We interpret the tableaux of images that we see. How we do this is complicated. It's probably fair to say that we're nowhere near understanding it - although we're getting closer. But the point is that it would be wrong to say that just because we can't see the additional pixels offered by higher resolution, there are no benefits. At the very least you can say that if more pixels mean more details, then those additional details give more information to our perceptual processes - even if they're subtle and mostly only implied.
So that's it. That's our position on 8K. We like it because it's an expression of our ever-increasing cleverness with imaging technology. We like it because the pictures - given a chance - genuinely and provably look better. And, just to reiterate: nobody's telling you to sell your 4K camera and TV for updated 8K versions. If and when an upgrade does become necessary, it will come at a point when you'll know you want the benefits too.