
Have we really stopped thinking about video vs film?

Written by David Shapton | Nov 15, 2024 4:06:27 PM

If you’re a high-end cinematographer of a certain age, the answer is “probably not”. But for film viewers, as opposed to filmmakers, the jury is not only out but probably never coming back.

Only a short generation ago, the question “is video as good as film?” would simply not have made sense, or would at least have been hard-wired to the answer “no”. Film and video have, for most of their existence, evolved along approximately parallel paths that never, until recently, intersected. Another way to put this is that for most of moving image history, you would never mistake video for film or vice versa.

And while film has, of course, evolved over time, it’s arguable that it has had nothing like the level of innovation that has been poured into video technology. (I have to say that I’m very happy for genuine film experts to pile in and disagree with me here…). The differences are not surprising. Film is a medium of permanent record, based on photo-chemical reactions, but video - in the early days - was a live medium, based on photo-electrical responses, and with a fixed number of scan lines.

Until very recently, film had the advantage over video of being anywhere between relatively and extremely high resolution. Even the earliest films had the potential for decent reproduction, but they were mainly hampered by low or uneven frame rates and by jitter in the mechanisms feeding the film into the gate.

Modern digital restoration techniques have extracted remarkable detail and naturalness from ancient film that we always assumed had to look frankly terrible. But film doesn’t have any particular resolution in the sense that video does. It obviously doesn’t have infinite resolution - nothing apart from reality itself does - but the limits of the chemistry have a different sort of effect to the limits of video resolution. It’s arguable that film grain, which you could characterise as randomly shaped, randomly placed “pixels”, can actually increase effective resolution over time, through a process akin to dithering.
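To make that idea concrete, here is a minimal numerical sketch in Python - purely illustrative, and not a model of film chemistry. A smooth gradient is quantised to a handful of levels, once plainly and once with fresh random “grain” added on every frame; averaging the grainy frames lands closer to the original than the single clean-but-coarse version does.

import numpy as np

# A smooth horizontal gradient: the "real" scene we are trying to reproduce.
scene = np.linspace(0.0, 1.0, 256)

LEVELS = 8       # deliberately coarse quantisation, standing in for the medium's limits
N_FRAMES = 64    # successive "frames", each with fresh random grain

def quantise(signal):
    # Round the signal to a fixed number of discrete levels.
    return np.round(signal * (LEVELS - 1)) / (LEVELS - 1)

# Without grain: every frame quantises identically, so averaging gains nothing.
plain = quantise(scene)

# With grain: the random noise differs on every frame, so averaging the frames
# recovers tones that sit between the quantisation steps.
rng = np.random.default_rng(0)
grainy_frames = [
    quantise(scene + rng.uniform(-0.5, 0.5, scene.shape) / (LEVELS - 1))
    for _ in range(N_FRAMES)
]
averaged = np.mean(grainy_frames, axis=0)

print("RMS error, single quantised frame:  ", np.sqrt(np.mean((plain - scene) ** 2)))
print("RMS error, average of grainy frames:", np.sqrt(np.mean((averaged - scene) ** 2)))

The averaged result reports a noticeably lower error, which is the sense in which random grain, accumulated over time, can buy back resolution.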

The resolution will be televised

Without dwelling on the convoluted history of early television (mechanical vs electrical systems, with the former offering a resolution of only 30 lines), video’s evolution over the last few decades has been remarkable; before that, not much happened for around 40 years - with slightly different timescales in the US and Europe, and differences within Europe, too. In the UK, the third national channel, BBC2, launched on the 625-line standard in 1964, and it was BBC2 that brought 625-line colour television in 1967.

This resolution was a huge improvement over the previous 405 lines and remained in use virtually unchanged until the second half of the 2000s, when 16:9 Full HD burst onto our living room screens. Curiously, the BBC succeeded in implementing a kind of 16:9 widescreen on the old 625-line standard in the late ‘90s. It used an anamorphic squeeze to achieve the wider aspect ratio, ironically reducing the images’ horizontal pixel density - and introducing a new realm of confusion around square vs rectangular pixels. But it was worth it. 16:9 TV, even in standard definition, was a genuine, if slight, move in the direction of video gaining parity with film.
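If you’re wondering where the square vs rectangular pixel confusion comes from, here is a small back-of-the-envelope Python sketch. It assumes the familiar 720 x 576 digital PAL raster and ignores the “active picture width” subtleties behind the official broadcast figures, so the numbers are approximate; the point is simply that the same stored pixels have to be displayed wider for 16:9 than for 4:3.

# Pixel aspect ratio of a picture stored in a standard-definition PAL raster.
STORED_WIDTH = 720     # pixels per line in the digital PAL raster
STORED_HEIGHT = 576    # active lines in the 625-line system

def pixel_aspect_ratio(display_aspect):
    # How wide each stored pixel must be displayed, relative to its height.
    storage_aspect = STORED_WIDTH / STORED_HEIGHT
    return display_aspect / storage_aspect

print("4:3  pixel aspect ratio:", round(pixel_aspect_ratio(4 / 3), 3))   # roughly 1.07
print("16:9 pixel aspect ratio:", round(pixel_aspect_ratio(16 / 9), 3))  # roughly 1.42

Anamorphic 16:9 keeps the same 720 pixels per line but asks each one to cover more screen width, which is exactly the drop in pixel density mentioned above.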

For decades, video was hamstrung by “broadcast” standards. There seemed little point in creating video resolutions that broadcast equipment couldn’t handle. All of that changed when computers started to be able to process video, and perhaps the first to notice this opportunity for higher resolutions was RED Digital Cinema. The RED One camera shot in 4K resolution to RED’s own RAW format. The combination of resolution, colour depth and accuracy gave filmmakers their first opportunity to shoot on video with a quality that could pass as film-like.

Breaking away from broadcast standards was revolutionary. Other manufacturers adopted 4K, and despite the first 4K TVs' eye-wateringly high prices, it was only a couple of years before virtually all TVs on sale were 4K.

The missing link

Video-as-digital-film had one missing piece: High Dynamic Range (HDR). The standard for HD video, Rec 709, only had a dynamic range of around seven stops, but HDR needs ten stops and upwards. Most digital cinema cameras can achieve 13 to 15 stops or more. Standard TVs can’t show HDR, but they have slowly improved their dynamic range. Meanwhile, VOD services like Netflix include HDR in their programming. Some modern TV types are exceptionally good with HDR; OLED is probably still the favourite, thanks to its total blacks and rich colours.
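As a rough guide to what those stop counts mean: each stop is a doubling of light, so a dynamic range of n stops corresponds, to a first approximation, to a contrast ratio of about 2^n to 1. The short Python sketch below just turns the figures quoted above into ratios.

# Each "stop" is a doubling of light, so n stops of dynamic range is roughly
# a contrast ratio of 2**n : 1. Stop counts follow the figures quoted in the
# text, with 14 taken as the middle of the 13-15 stop camera range.
for label, stops in [
    ("Rec 709 (SDR)        ", 7),
    ("Entry-level HDR      ", 10),
    ("Digital cinema camera", 14),
]:
    print(f"{label}  ~{stops} stops -> about {2 ** stops:,}:1 contrast")

That jump from around 128:1 to over 16,000:1 is why HDR, rather than resolution alone, was the real missing link.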

I would argue that the only justification for still using film is nostalgia, because virtually every aspect of film can be exceeded in the digital domain. Extremely high resolutions (up to 17K in Blackmagic Design’s flagship cinema camera) have made most digital artefacts invisible, and if a film director does want an overt film look, they can always add it as an effect.

tl;dr

  • Historically, film and video evolved separately, with film being a medium of permanent record and video originally based on live, photo-electrical responses. This led to significant differences in resolution and quality between the two.
  • The introduction of higher resolutions, starting with 625-line television in the 1960s and progressing to 4K digital cinema cameras, allowed video to achieve quality comparable to film, breaking away from traditional broadcast standards.
  • For video to match the capabilities of film, HDR became essential. Modern digital cinema cameras can achieve significantly higher dynamic ranges than standard HD video, which is necessary for high-quality visual experiences.
  • The justification for using film is seen as nostalgic, as virtually all aspects of filmic quality can now be surpassed by digital technology, including extremely high definitions and the ability to simulate film aesthetics digitally.