We'll never see a 1.21 gigawatt lightbulb, but here's Phil Rhodes to explain why the relationship between batteries and their potential output for lighting isn't always straightforward.
Being able to run most of a film shoot from batteries is great. Even if you eventually plug things into a wall, the ability to walk a light in and set it up in a matter of seconds, without messing about with sockets and cables, is a massive time saver. Now that we’re starting to see more powerful LED lights at trade shows – things like Aputure’s COB-600 and the Hive Super Hornet – it’s clear that battery-powered lights are edging ever closer to displacing HMIs as the key light for at least small scenes. It’s pretty clear where this is going.
Hang on, though. Did we just describe lights in the 600-watt range as battery-powered? Well, sure; the COB-600 has been shown with a (not yet final) power supply sporting four V-mount plates. Even with four, they’ll need to be big, capable batteries, and those are not a low-cost proposition. The truth is that a 600-watt LED is just as difficult to power as a 600-watt (well, 575-watt) HMI. There’s some small difference in the efficiency of the driver electronics, but in the end, nobody routinely expects 575W HMIs to run from camera batteries the way LEDs are expected to.
If you’re familiar with the way that watts, amps and volts are related, the technical realities of all this will be no surprise. Otherwise, read on. All this applies to mains power too (though the risks are, of course, larger), so it’s good to know.
Batteries are usually specified in watt-hours, so it’s tempting to think that a 150 watt-hour battery could run a 150-watt load for an hour. However, things become complicated quickly. Batteries invariably have a maximum load, in watts, which is often numerically close to their capacity in watt-hours, or a little below it. Actually applying a 150W load to a 150Wh battery is a dicey proposition. An Anton Bauer Titon 150 has a capacity of about 150 watt-hours and can sustain a maximum load of ten amps. But hang on – we’ve been talking about watts so far. How does this ten-amp figure fit in?
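As a back-of-the-envelope sketch (the figures are the ones above; real batteries deliver less than their rated capacity under heavy load):

```python
# Naive runtime estimate: capacity in watt-hours divided by load in watts.
# Real-world runtime will be shorter, especially near the battery's limits.
capacity_wh = 150.0  # roughly the Anton Bauer Titon 150's rated capacity
load_w = 150.0       # a hypothetical 150 W load

runtime_hours = capacity_wh / load_w
print(runtime_hours)  # 1.0 - an hour, in theory only
```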
Limits are often specified in amps because fuses and circuit breakers are usually specified in amps, but to make sense of all this, we need to understand the relationship between watts, volts and amps. It’s possible to find a 12-volt headlight bulb from a car which consumes 50 watts. It’s also possible to have a 110 or 240-volt mains lightbulb that consumes 50 watts. Both consume the same overall amount of power and probably output roughly the same amount of light, but the lower-voltage bulb will draw much more current, measured in amps. The total amount of energy used over a given period of time is the same in either case; it’s just delivered differently.
The most common analogy used to explain this is water in a pipe. A given load (our lightbulb) uses a given amount of water (electricity) over a given period of time. Say it uses a gallon an hour. We can force the water through a narrow pipe at high pressure, or we can let it flow more gently through a wider pipe. Either way, we get a gallon an hour. In this analogy, the water pressure is the voltage, in volts, and the rate of flow is the current, in amps. It’s a good way to think about it because to deliver the same amount of water (power, in watts) at a lower pressure (voltage), we need a higher rate of flow (current), and higher current means thicker cables. That’s why cross-country power lines are run at very high voltage – so that comparatively thin cables can be used.
The mathematics involves a simple multiplication. Watts equals volts multiplied by amps or, to put it another way, the amount of water delivered equals pressure times flow rate. Our 150 watt-hour battery can supply a maximum current of 10 amps, and it has a voltage (depending slightly on how full it is) of about 14.4 volts. Watts equals volts multiplied by amps, so 10 × 14.4 = 144, meaning we can draw up to 144 watts from the battery – roughly enough to empty it in an hour.
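That multiplication, and its rearrangement, can be checked in a few lines of Python (the figures are the ones used in this article):

```python
# Watts = volts x amps
battery_voltage = 14.4  # volts, nominal for a lithium-ion camera battery
max_current = 10.0      # amps, the battery's rated maximum load

max_power = battery_voltage * max_current
print(round(max_power))  # 144 watts available from the battery

# Rearranged (amps = watts / volts), the two 50 W bulbs from earlier:
print(round(50 / 12, 2))   # ~4.17 A for the 12 V headlight bulb
print(round(50 / 240, 2))  # ~0.21 A for the mains bulb
```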
That’s why we can’t run a 600-watt light from a single 14.4V battery. We would need to draw around 42 amps of current, since 600 divided by 14.4 is roughly 42. This is also why some very power-hungry cameras use 24V power. The total amount of power required by the camera, in watts, is fixed, but increase the voltage and the current decreases, so the load is easier for the battery to bear.
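The same rearrangement shows why a higher voltage helps; the little function below is just an illustration of the arithmetic, not anything a real power supply does:

```python
def current_needed(watts, volts):
    # Rearranging watts = volts * amps gives amps = watts / volts.
    return watts / volts

print(round(current_needed(600, 14.4), 1))  # 41.7 A - hopeless for one camera battery
print(round(current_needed(600, 24.0), 1))  # 25.0 A - still heavy, but more manageable
```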
That’s also why big loads, such as our 600-watt light, tend to require several batteries connected in series, positive terminal to negative terminal, which increases the voltage. Lithium-ion camera batteries are usually four cells in series to begin with, so if we take four batteries and wire them in series, we end up with sixteen cells in series and a total voltage a bit less than 60V. For a 600-watt load, we would then need around ten amps. The load needs to be designed for this, but it makes life easier for the batteries.
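The series arrangement can be sketched the same way (assuming a nominal 3.6 volts per lithium-ion cell, which is typical but varies with the state of charge):

```python
cell_voltage = 3.6       # nominal volts per lithium-ion cell (an assumption)
cells_per_battery = 4    # a typical camera battery: four cells in series
batteries_in_series = 4  # four such batteries wired in series

total_voltage = cell_voltage * cells_per_battery * batteries_in_series
print(round(total_voltage, 1))            # 57.6 V - a bit less than 60 V
print(round(600 / total_voltage, 1))      # ~10.4 A for a 600 W load
```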
In practice, it’s worth building in some safety margin – perhaps 25% – to avoid blowing fuses and because batteries work more effectively when lightly loaded. 150W is still a big load for most 14.4V camera batteries. Even bigger lights, with even bigger arrays of batteries, might seem a bit of a stretch, but to some extent, it’s always been possible to do that with big inverters that generate mains power from battery power. There are also companies such as Cine Power International which make the big block batteries that sit on dollies and can supply both the amps and the volts to achieve an awful lot of watts.
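Finally, a derating sketch (the 25% margin is the rule of thumb suggested above, not a hard specification):

```python
def derated_load(max_watts, margin=0.25):
    # Knock a safety margin (here 25%) off the battery's maximum load,
    # to avoid blown fuses and hard-worked cells.
    return max_watts * (1 - margin)

print(derated_load(144))  # 108.0 W - a sensible ceiling for our 144 W battery
```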