Replay: No matter how much we try to evaluate cameras in terms of hard numbers, things like dynamic range, resolution and noise floor, people will still ignore them and go with what looks nicest, which isn't necessarily the same thing. What's more, the very idea that any modern camera actually has a fixed look is fairly dubious in itself.
People often say certain cameras have desaturated highlights, or crushed shadows, or dingy midtones, or good (or bad) skin tones, but really they don't. They can be processed to have those things, and yes, different cameras have different practical limits to how much processing can go on. But to glance at a monitor and make a snap judgment about what a camera is capable of is dubious in the extreme, and not just because that monitor is right underneath a nasty metal halide light in an exhibition hall.
Decisions, decisions. The creative input applied during grading, seen here at Canon's demonstration facility in Burbank, California, often has a larger effect than the objectively detectable differences between cameras
When we talk about what a camera's pictures look like, we're really talking about what the pictures look like when put through the manufacturer's Rec. 709 LUT and displayed on a Rec. 709 monitor, or perhaps, these days, what they look like displayed in HDR, on a wide colour gamut display, using the manufacturer's recommended settings. Those are (usually) the pictures potential users will associate with the camera; that's what's shown on a trade show floor. The problem is, it's then easy for everyone to get an idea, even subconsciously, that what they've seen using the default options is somehow authoritative.
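To make that chain concrete, here's a minimal sketch of what "the manufacturer's Rec. 709 LUT" amounts to mechanically: a lookup cube applied to every pixel before the image reaches the display. The cube below is a made-up 33-point contrast curve standing in for a real vendor LUT, and the frame is random data, purely for illustration.

```python
# A minimal sketch of the viewing pipeline described above: log-encoded pixels
# pushed through a LUT before they reach a Rec. 709 monitor. The LUT here is a
# hypothetical 33-point cube built in numpy; a real manufacturer LUT would be
# loaded from a .cube file instead.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

N = 33                                        # grid points per axis
axis = np.linspace(0.0, 1.0, N)

# Stand-in LUT: a gentle contrast curve applied identically to R, G and B.
# In a real workflow, this cube is where the manufacturer's opinion lives.
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
cube = np.clip(grid ** 1.4, 0.0, 1.0)         # shape (N, N, N, 3)

apply_lut = RegularGridInterpolator((axis, axis, axis), cube)

# A tiny "frame" of log-encoded camera data, values in 0..1.
frame = np.random.default_rng(0).random((4, 4, 3))

graded = apply_lut(frame)                     # trilinear lookup per pixel
print(graded.shape)                           # (4, 4, 3) -- ready for a 709 display
```

Nothing in that step is intrinsic to the camera: swap the cube and the "look" changes.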
The grade on Man of Steel attracted attention for its desaturated nature (though some video reviews overemphasised that for comic effect). Everyone has an opinion
What the camera actually shot, if we look at the raw sensor data on a display expecting any common video format, looks very dark, very contrasty and very green. What it recorded looks very flat and grey. What we monitored might be something else again. Even those descriptions rest on the assumption that the monitor is expecting signals formatted in one way, while the data we're feeding it is formatted in another; the camera isn't really generating dark green pictures, or grey pictures, it's generating data that happens to look that way if we put it on a 709 monitor. It's theoretically possible to put a LUT into a monitor to (approximately) normalise dark green raw pictures, if you could somehow get that data on an SDI cable, and for some value of “normalise.” It's a matter of what we expect.
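As a toy illustration of why untouched raw data reads as dark and green, here's a sketch with assumed numbers: a neutral grey patch whose linear sensor response is green-heavy, which only becomes a "normal" picture once someone has chosen white-balance gains and a display encoding. The channel values and the 1/2.4 gamma are stand-ins, not any particular camera's behaviour.

```python
# Toy illustration: linear raw sensor values shown straight on a display that
# expects gamma-encoded video look both dark and green, because no white
# balance or transfer function has been applied yet.
import numpy as np

# Hypothetical linear sensor response to a neutral grey patch under warm light:
# the green channel reads high relative to red and blue (assumed ratios).
raw = np.array([0.09, 0.18, 0.06])            # R, G, B, linear, 0..1

# "Normalising" is a pair of decisions: white-balance gains chosen so the grey
# patch becomes neutral, then a simple 1/2.4 gamma as a stand-in for a proper
# display encode.
wb_gains = raw[1] / raw                       # [2.0, 1.0, 3.0]
balanced = raw * wb_gains                     # [0.18, 0.18, 0.18] -- neutral
encoded = balanced ** (1 / 2.4)               # ~0.49: now sits near mid-grey

print(wb_gains, balanced, encoded)
```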
Naturally, most practical applications for movie cameras don't do strange things like this, but it serves to illustrate a point: what the camera sees is not even really a picture until we've made decisions about noise, sensitivity, and the brightness and dynamic range of the display. Any time we look at the output of a camera – any time – a lot of those decisions have been made by someone with an opinion. So, is it actually valid to wander around, say, NAB, and look at monitors, and form an opinion of our own? Can we assume that there's some sort of overriding truth to what we're seeing; can we assume that's what the pictures really look like, as shot, regardless of what we do later?
This is a frame from Traffic, which embraces blue. Possibly the finer characteristics of the camera and film stock are buried under the adventurous colour treatment
Sort of. If a manufacturer is showing something at a trade show, or demonstrating it to a journalist, we are entitled to assume that it's properly set up and that what we're seeing is a reasonable example of its capabilities. We can probably assume that the configuration will not have been chosen to unduly crush blacks, clip highlights, turn red things purple, and so on. What we'd hope is that the default configuration, the configuration that's most often demonstrated, at least shows us the limits, the maximums, the most we can get away with.
But it isn't always that way, and that sort of thing can lead us to limit our thinking about what's possible. At least one major camera manufacturer has been accused of shipping a default Rec. 709 LUT that makes midtones rather too dark. Firmware upgrades have repeatedly added new brightness and colour encodings to cameras which can make them look very different. This is bad for consistency and makes life complex when we try to set up monitoring, but it also frees us to reconsider what a proper picture looks like, and that's good. Modern cameras don't really have a built-in look, and we shouldn't be limited by the idea that they do.
Well, they do. They have default LUTs. But those LUTs just represent one more opinion.
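For what it's worth, there is a well-known numerical reason a plain Rec. 709 encode can read as dark in the mids, sketched below with the published BT.709 transfer function and an assumed gamma-2.4 display; it isn't a claim about any specific manufacturer's LUT.

```python
# Sketch of why straight Rec. 709 encodes can look dark in the midtones:
# 18% scene grey encodes to roughly 41% signal via the BT.709 OETF, and a
# display with an assumed 2.4 gamma turns that back into roughly 12% light,
# visibly darker than the scene value.
def bt709_oetf(L: float) -> float:
    """ITU-R BT.709 opto-electronic transfer function (scene linear -> signal)."""
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

mid_grey = 0.18
signal = bt709_oetf(mid_grey)       # ~0.409
displayed = signal ** 2.4           # ~0.117 relative luminance on a 2.4 display

print(f"signal {signal:.3f}, displayed {displayed:.3f}")
```

Whether that behaviour is a flaw or a deliberate allowance for dim viewing conditions is, again, a matter of opinion.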
This was shot on an FS7. What we see here is the result of Sony's LC709A LUT, which is reputedly an attempt to ape Alexa. It has not been corrected for your sRGB display
In some ways none of this is really good. It was always nice to use a film stock or camera system and learn how it worked; to become good at it, to know how it reacted to various situations. Even with testing – which many productions don't do well, if at all – it's difficult to build the depth of experience we once enjoyed after using a piece of technology year after year, or even decade after decade. Yes, there are detectable, observable differences between cameras, but that's not what makes the most difference anymore. There is no one, canonical way to view what modern cameras produce. On the upside, that's flexibility. On the downside, it's inconsistency and a doorway to an unlearnable infinity of options. The trick, then, is making the best of that compromise – and trying not to make assumptions about what a particular piece of tech can do.