Phil Rhodes goes poking into the nooks and crannies of the LVCC and comes up with some interesting new tech at NAB 2024.
This year there were, according to the organisation's own numbers, around 1,300 exhibitors at NAB. As such, some of the most interesting, most useful things are inevitably among the least discussed, often tucked away in the corners of a vast event hall or overlooked for their sheer ubiquity.
Consider Matthews Studio Equipment, for instance, which celebrated its fiftieth anniversary this year and exhibited in Central Hall alongside all the other camera, grip and electrical equipment. Matthews' products are so commonly encountered they're practically invisible, even when they're towering a dozen feet overhead. Matthews is very good at putting things on sticks - albeit very clever sticks with high capacity and easy handling which can put things in a wide variety of places - and this matters, especially for productions that don't have the option of renting a construction crane or drilling holes in the vast sound stage a set is built on.
Matthews’ 2024 exhibit saw the debut of its Middlemax Menace Arm (above). Debate over whether the portmanteau “middlemax” is an oxymoron aside, the purpose of the Middlemax is much as any other menace arm - to boom lights far, far out over a scene, creating the effect of overhead rigging without actually having to do any overhead rigging, which can be time-consuming, expensive, and often either impossible or simply not permitted by a sensitive location.
There were already large and small arms, but the new midrange option is designed to offer the maximum possible reach while still fitting into a modestly-sized van. Nearby, Matthews also showed its Grip Rail system, designed to replace the conventional aluminium tubing that’s generally used with Speed Rail clamps. The Grip Rail rods can be extended or retracted, and therefore offer a somewhat more reusable and sustainable solution than endlessly cutting up pieces of extruded tube.
On an entirely different subject, over in South Hall, we find SwitchLight. The term “relighting” has been used perhaps a bit too freely in the context of early-2000s grading systems capable of as many as three soft-edged windows. SwitchLight can analyse one clip - shot with no special measures - and relight its subject according to the fall of illumination in an HDRI environment.
This is reminiscent of the work done on Lightstage, which illuminated its subject - often a person - from a large variety of directions using a carefully-set-up structured light source operating at high frame rate. The result was a normal map of the subject - that is, an image where the red, green and blue values represent which way the surface of the subject is facing at that point in the image. That makes a degree of relighting possible. SwitchLight appears to do something rather similar, but it derives the normal map using artificial intelligence.
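To make the idea concrete, here's a minimal sketch (not SwitchLight's actual method) of how a normal map enables relighting, assuming the common convention that RGB values in [0, 1] encode surface normals remapped from [-1, 1], and using simple Lambertian diffuse shading:

```python
import numpy as np

def relight(normal_map, light_dir, light_color):
    """Diffuse (Lambertian) relighting from a normal map.

    normal_map:  H x W x 3 array, RGB in [0, 1] encoding surface
                 normals as (x, y, z) remapped from [-1, 1].
    light_dir:   3-vector pointing towards the light.
    light_color: RGB triple for the light's colour.
    """
    # Decode RGB back into unit normal vectors.
    n = normal_map * 2.0 - 1.0
    n = n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8)

    # Lambert's cosine law: brightness is the dot product of the
    # surface normal and the (normalised) light direction.
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    shade = np.clip(n @ l, 0.0, 1.0)

    return shade[..., None] * np.asarray(light_color, dtype=float)

# A flat surface facing straight at the camera (+z) is fully lit
# head-on, and falls to black when the light comes from the side.
flat = np.full((2, 2, 3), [0.5, 0.5, 1.0])
head_on = relight(flat, [0, 0, 1], [1.0, 1.0, 1.0])  # ≈ [1, 1, 1] per pixel
side_on = relight(flat, [1, 0, 0], [1.0, 1.0, 1.0])  # ≈ [0, 0, 0] per pixel
```

A full HDRI environment amounts to summing contributions like this over many light directions; the sketch shows only a single light, which is enough to see why knowing the normals is the key to the trick.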
As with many AI applications, its developers appear to have had the vast and incredibly lucrative smartphone market most prominently in mind, but it'll be interesting to see whether this, or something like it, might turn up in grading software at some point. All the usual concerns about the livelihoods of film crew apply here, but it's still impressive. Like a lot of applications of AI, it's not perfect - human subjects can suffer a slightly injection-moulded look. Whether AI can be improved the little bit more that's required for this sort of thing to be perfect is a wider question than we can reasonably address here, but SwitchLight is another example of what's being worked on.
As we’ve discussed a lot recently, there’s a feeling that virtual production is no longer in its most nascent stage. Central Hall is no longer a maze made out of giant video walls (though there are a few) and the discipline, in general, seems increasingly normalised in a way that’s probably quite healthy in the context of the explosive growth of the last few years. One of the least expensive options daring to assume the title has to be Skyglass. It’s a system - well, more a single app, really - which leverages the tightly integrated capabilities of an iPad - its camera, accelerometers and other hardware - to make near-zero-setup VP possible.
The striking image which heads this article was, regrettably, shot off the iPad’s screen, and suffers some quality shortfall on that basis. Still, with a simple bluescreen and a couple of actors in costume, one iPad on a motorised slider was doing a fairly convincing job of transporting us to an alien world. Naturally, we’re not looking for the sort of results that appear in a high-end feature film, here. Your correspondent is at best a rank amateur compositor, but it was clear the bluescreen extraction could have done with just a pixel or two of edge erosion. Still, if we’re in a situation where the bluescreen extraction is the biggest problem with a fully motion-tracked virtual production setup, we truly have reached the promised land.
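For readers who don't spend their days in a compositing suite: "a pixel or two of edge erosion" means shrinking the key's alpha matte slightly so that fringing pixels at the edge of the extraction drop out. A minimal sketch of the idea (a basic morphological erosion, not any particular keyer's implementation) might look like this:

```python
import numpy as np

def erode_matte(alpha, pixels=1):
    """Eat into a matte's edges by the given number of pixels.

    alpha:  2-D array in [0, 1], e.g. as pulled from a bluescreen key.
    pixels: how far to shrink the edge.
    """
    out = alpha.copy()
    for _ in range(pixels):
        # Pad with edge values, then take the minimum over each
        # pixel's 3 x 3 neighbourhood - a morphological erosion.
        p = np.pad(out, 1, mode="edge")
        out = np.min(np.stack([
            p[i:i + out.shape[0], j:j + out.shape[1]]
            for i in range(3) for j in range(3)
        ]), axis=0)
    return out

# A one-row matte with a semi-transparent fringe pixel (0.4)
# between the background (0) and the subject (1)...
matte = np.array([[0.0, 0.4, 1.0, 1.0]])
# ...one pixel of erosion pushes the fringe out of the subject:
# erode_matte(matte) == [[0.0, 0.0, 0.4, 1.0]]
```

The cost, of course, is that the subject loses a sliver of genuine edge detail, which is why compositors apply it a pixel or two at a time.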
There’ll always be hundreds of fascinating exhibits like these at a big trade show and it’s a point of regret that we can’t possibly cover them all. But in the end, that’s why 61,000 people went to NAB this year. And there's always room for one more picture...