In a tongue-in-cheek turn, we extol the virtues of a lighting standard that has fallen out of vogue in modern productions.
Lighting technology has changed enormously since the dawn of filmmaking, but the drive for certain characteristics has been constant almost throughout. Early efforts such as the Cooper-Hewitt lamp, a mercury-discharge technology, emitted a distinctly greenish light. These instruments were targets for improvement, a journey we're effectively still on through the development of much better metal halide sources such as HMI, then fluorescent, and now LED. All of them have had problems with colour rendering, flicker and even the inability to restart after being run for a while, as well as their cost, weight and bulk. Other things, such as the ability to produce either soft or hard light, are factors too.
There is, however, another contender, one that I intend to make a case for that's very rarely made.
Our contender is an imperfect technology. We'll deal with the downsides at the end, but overall it looks very good. In particular, its colour rendering is about as near to sunlight as you can get, although its colour temperature is rather lower than sunlight's. The difficult deep-blue R12 measurement, which LEDs often struggle with because the sample is often deeper than their blue diodes' output, is exactly where it should be. There are no problematic holes or spikes in the output spectrum to trip photographers up when shooting difficult, highly saturated subjects. Perhaps most significantly, all the world's lighting filter manufacturers have ranges expressly designed for use with this technology, ensuring a predictable repeatability that often isn't available when trying to gel, say, fluorescents.
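Those filter ranges work because colour-conversion gels are specified in mireds, and mired arithmetic is simple enough to sketch. The function names here are my own, and the temperatures are illustrative, but the maths is the standard mired-shift calculation gel manufacturers quote:

```python
def mired(kelvin):
    """Convert colour temperature in kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

def mired_shift(source_k, target_k):
    """Mired shift a filter must provide to move source_k to target_k.
    Negative values shift towards blue, positive towards orange."""
    return mired(target_k) - mired(source_k)

# Shifting a 3200 K source to 5600 K daylight:
shift = mired_shift(3200, 5600)
print(round(shift, 1))  # -133.9, close to a full CTB's commonly quoted -131 mireds
```

Because the source sits reliably at its nominal colour temperature, a full CTB gel lands it on daylight every time, which is exactly the repeatability the filter ranges depend on.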
It looks good in the lab, too. CRI and TLCI are both excellent, in the very high nineties, and there's practically no warm-up time. Different examples also tend to match one another in colour output in a way that LED, HMI and fluorescent can sometimes struggle to achieve. Something as simple as a difference in run time between one HMI bulb and another can cause mismatches, whereas the technology we're discussing does exactly the same thing from day one to the end of its life, whereupon it's cheap and easy to replace.
This technology is almost completely flicker-free. Very low-powered examples can, in some circumstances, produce flicker that affects high-speed shooting, but in general there's no need for special drive electronics to achieve non-flickering light. It causes no problems at any frame rate, even for cameras from 50Hz localities shooting in 60Hz territories, or vice versa. HMI can sometimes achieve flicker-free operation, though not always, depending on the target frame rate and the capabilities of the ballast; many HMI ballasts have flicker-free switches, and many start making an irritating acoustic noise in flicker-free mode. None of this is needed with the technology we're talking about. There's also immediate restart, hot or cold, without specialist or expensive drive electronics.
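The 50Hz-versus-60Hz point rests on a well-known rule of thumb: a mains-driven source pulses at twice the mains frequency, and a frame rate is safe when each frame spans a whole number of those pulses. A minimal sketch (the function name and tolerance are my own):

```python
def is_flicker_safe(fps, mains_hz, tolerance=1e-9):
    """True if each frame spans a whole number of light pulses.

    A mains-driven source pulses at twice the mains frequency (output
    peaks on both halves of the AC cycle), so flicker averages out
    when pulses-per-frame is an integer."""
    pulses_per_frame = (2 * mains_hz) / fps
    return abs(pulses_per_frame - round(pulses_per_frame)) < tolerance

# 24 fps under 60 Hz mains: 120/24 = 5 pulses per frame -> safe
# 24 fps under 50 Hz mains: 100/24 = 4.17 -> potentially unsafe
for fps in (24, 25, 30, 50, 60):
    print(fps, is_flicker_safe(fps, 50), is_flicker_safe(fps, 60))
```

The reason our contender shrugs this off is thermal inertia: the light-emitting element can't cool down fast enough between half-cycles to modulate its output much, so even nominally "unsafe" combinations rarely show visible flicker except at very low wattages or very high shutter speeds.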
With no need for external ballasts, the technology we're talking about is less bulky and lighter than HMI or LED of similar output, except perhaps at the very smallest scales. It doesn't require cooling fans, except in a few very particular circumstances, and there's no problem with operation in very hot or cold environments, as there sometimes can be with fluorescent. Perhaps most crucially, though, it's available in an enormous range of sizes and types, from under a watt up to tens of kilowatts. Arrays of small, low-powered examples can produce convincing soft light, and the largest versions still produce a point of light concentrated enough to drive PAR or Fresnel fixtures easily.
The only problem with the technology we're discussing is that it isn't terribly efficient. It's far from unusable, being capable of around 25% of the output of an HMI, LED or fluorescent tube of similar power consumption. Almost all of the energy not turned into light is emitted as heat, which can lead to a need for increased ventilation of a location or studio. There is also some shift in colour temperature when dimming, although this is sometimes useful, as attractive, warmish tones can be made available by creative use of this characteristic. Dimming, incidentally, can be achieved with simple, easily-available equipment and is smooth all the way down to black.
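That efficiency penalty is easy to put into numbers. This is a rough sketch using illustrative luminous efficacies of my own choosing (roughly 15 lm/W for this technology versus roughly 90 lm/W for HMI; real fixtures vary considerably):

```python
OUR_TECH_LM_PER_W = 15   # illustrative assumption; real lamps vary
HMI_LM_PER_W = 90        # illustrative assumption; real lamps vary

def equivalent_hmi_watts(lamp_watts):
    """HMI wattage producing roughly the same light as the given lamp."""
    return lamp_watts * OUR_TECH_LM_PER_W / HMI_LM_PER_W

watts = 2000
print(f"{watts} W of our contender is roughly "
      f"{equivalent_hmi_watts(watts):.0f} W worth of HMI output")
```

On these assumed figures, a 2kW unit produces about as much light as a 333W HMI, and nearly all of that 2kW ends up warming the room, which is why the ventilation point above matters.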
Finally, and perhaps most significantly, this technology is extremely cheap. We get all of these things: perfect colour rendering, zero flicker, zero noise, dimming, light weight and small size. And it's probably the cheapest lighting technology available. It is also extremely easy to find, from places as straightforward as the average DIY store. Examples at the half-kilowatt level can be purchased for trivial amounts of money, and their output will match the largest movie lights in colour precision. It is, in short, an absolutely spectacular technology, with only one drawback and an absolute shopping list of highly desirable features.
Now, what was I talking about, again?