Increased resolution means hitting focus is more critical than ever. But have advancements in resolution outstripped our ability to determine when an image is in sharp focus?
One of the hot topics at the moment is 4K and whether it is needed at all, especially in light of the further push for 8K images. Inevitably, there is a mathematical debate, too, which generally puts me into circles of confusion!
While I was at the cinema the other day watching the latest installment in the Mission Impossible series, I noticed something that effectively puts all debate about resolution into the realm of pointlessness. It was something that I have been noticing more and more: a multitude of out-of-focus shots. The projector in the cinema was a 4K system, although the film was shot on Alexas and most probably mastered in 2K. The projector itself, though, was perfectly focussed, and when shots were good, they were very good.
However, even on static shots, there were countless times when focus was missed, thereby throwing any potential resolution into the dustbin. Remember that this is a multi-million dollar film with the best equipment and the best camera ops and focus pullers in the industry. If these guys are missing focus by so much and so often, what chance do the rest of us have?
As we go to ever higher resolutions, we need to be aware of our own limitations. Raw focus is not the only factor that affects the effective resolution of the picture. Put simply, as RedShark’s Editor in Chief, David Shapton, has written here before, as soon as the picture moves, the resolution drops due to motion blur.
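As a rough back-of-the-envelope illustration (my own assumed numbers, not anything from David's article), here is how quickly even a leisurely pan smears detail across pixels at a conventional shutter angle:

```python
# Rough sketch: how many pixels of motion blur a pan produces.
# Assumed figures for illustration only: a UHD-wide frame, a pan that
# crosses the full frame width in 5 seconds, and a 180-degree shutter.

frame_width_px = 3840        # UHD frame width
pan_duration_s = 5.0         # time for the pan to cross the full frame
frame_rate = 24              # conventional cinema frame rate
shutter_angle = 180          # degrees; exposure = (angle / 360) / frame rate

exposure_s = (shutter_angle / 360.0) / frame_rate    # 1/48 s at 24fps, 180 degrees
pan_speed_px_per_s = frame_width_px / pan_duration_s
blur_px = pan_speed_px_per_s * exposure_s

print(f"Exposure time: {exposure_s * 1000:.1f} ms")
print(f"Blur per frame: {blur_px:.0f} pixels")
# With these assumptions the smear is roughly 16 pixels wide, far more
# than the single pixel of detail 4K is nominally able to resolve.
```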
We can conclude from this that, in order for these newer high resolutions to be truly effective, we need much higher frame rates and, therefore, faster shutter speeds. The overarching question is whether the general public would accept them for films and drama. After all, Peter Jackson's experiment with 48fps wasn’t exactly a storming success!
Contrast is another aspect of picture quality that is often forgotten about. This is a strange omission from most debates, because we know that, in terms of perceiving detail, the human eye is far more responsive to contrast than to resolution alone. HDR images will be a must in any UHD future.
In other words, for these new UHD formats to really launch into the stratosphere in terms of the general public's acceptance, and to really show off the detail they are capable of, they will need much higher temporal resolution, far, far higher contrast capabilities at the display end of things, and better focussing tools.
With regard to frame rate, I think we need around 200fps. Yes, I know many people will argue that the human eye cannot perceive much more than 60fps. I disagree. With a 180-degree shutter equivalent at 60fps, there is still plenty of resolution-destroying motion blur, and even some judder, at those shutter speeds. So, we need to go higher still, to the point where motion blur on screen is no different from what we see in reality. We could reduce movement, but why would we want to stifle creativity in that way? The new age of UHD formats should give us more creative tools, not fewer.
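To put my 200fps figure into rough numbers, and again these are purely illustrative assumptions, compare the blur left over at 60fps and a 180-degree shutter with that at higher rates, using the same hypothetical pan as above:

```python
# Sketch: residual motion blur for the same assumed pan (a 3840-pixel
# frame crossed in 5 seconds, i.e. 768 px/s) at different frame rates,
# all with a 180-degree shutter.

pan_speed_px_per_s = 3840 / 5.0

for frame_rate in (24, 60, 120, 200):
    exposure_s = 0.5 / frame_rate                    # 180-degree shutter
    blur_px = pan_speed_px_per_s * exposure_s
    print(f"{frame_rate:>3} fps: exposure {exposure_s * 1000:5.2f} ms, "
          f"blur {blur_px:4.1f} px")

# At 60fps the smear is still six or seven pixels wide; only around
# 200fps does it get down towards the one-to-two pixel range.
```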
While I have gone off on a bit of a tangent from how this article started, those factors are important. But to go back to where I started: as David Shapton pointed out in his article, it only takes half a pixel of misfocus to drastically reduce the overall resolution of the picture. The fact is that most lenses these days are sharp enough to outstrip our ability to focus them with absolute pinpoint accuracy most of the time.
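To give a feel for what that means for the poor focus puller, here is a sketch using the standard depth-of-field formulas. All of the figures (a roughly 25mm-wide Super 35-style sensor, a 50mm lens at T2.8, a subject at three metres) are my own assumptions for illustration, not anything from David's article:

```python
# Sketch: how thin depth of field becomes if "in focus" has to mean
# sharp to a single 4K photosite rather than to the traditional circle
# of confusion. Assumed figures for illustration only: a sensor about
# 25 mm wide, a 50 mm lens at T2.8, and a subject at 3 m.

def depth_of_field(focal_mm, f_number, subject_mm, coc_mm):
    """Near and far limits of acceptable focus, via the standard
    hyperfocal-distance formulas."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm   # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = (subject_mm * (h - focal_mm) / (h - subject_mm)
           if subject_mm < h else float("inf"))
    return near, far

sensor_width_mm = 25.0
photosite_mm = sensor_width_mm / 4096     # about 0.006 mm per 4K photosite
traditional_coc_mm = 0.025                # common Super 35 circle of confusion

for label, coc in (("traditional CoC", traditional_coc_mm),
                   ("one 4K photosite", photosite_mm)):
    near, far = depth_of_field(50.0, 2.8, 3000.0, coc)
    print(f"{label:>16}: depth of field is roughly {(far - near) / 10:.0f} cm")

# On these assumptions the usable depth of field drops from around half
# a metre to little more than ten centimetres, which is not much margin
# for a human being pulling focus by eye.
```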
If we are to make these new formats work truly well, we will need all factors combined: HDR, HFR, and a bulletproof method of focussing. Most pro focus pullers would balk at the idea of auto-focus, as would most camera ops in my sphere of work. But, with resolutions available that are clearly challenging the best in the business, we may finally have to say goodbye to manual focussing as we know it. As humans, we simply aren’t good enough any more!
Is overall detail in a picture really what we are after anyway? Do we honestly crave it? After all, many people rejected The Hobbit's 48fps experiment precisely because it looked too real and lacked the dreamy, resolution-reducing motion blur of a 24fps projection. Film itself has always been imperfect, with its organic grain structure and softer look compared with digital. We also know that many cinematographers these days go to great efforts to tone down the sharp look of digital, especially when it comes to filming actresses, for whom the development of UHD and 4K is a nightmare made in hell!
So, why the push for 4K and 8K? Much of it comes down to driving new sales rather than any practical need. HDR and newer display technologies such as OLED and laser are going to take a long while before they appear in the average £300-£500 television. Resolution, on the other hand, is something that can be increased relatively easily and included in the lower ranges, and it gives manufacturers some tangible figures to market to people.
A 4K or 8K display will certainly make text look a lot better, assuming a true 4K or UHD image is shown. But do we need it? Surprisingly, probably yes. It doesn’t do any harm to have higher resolution, but it does need some serious focus on combining it with other technologies to make it truly mind-blowing in the public eye.
No doubt, 4K and 8K televisions will become the default in the future; you won’t be able to purchase a television with a lower resolution. But it also means we need to be mindful that 4K and 8K may not give us the panacea of realism and detail we crave, simply because the fineness of the resolution outstrips our ability as cinematographers to focus accurately enough. Next time you are at the cinema, it is something to look out for.