Phil Rhodes pokes around the margins of the NAB 2022 show floor and unearths two gems on a single booth: one dealing with camera tracking, the other with grain simulation.
Quite a lot of the most fascinating little boutiques of film and TV technology are companies that employ just a few people, creating a minimum of bureaucratic friction between the idea and the show floor. Such is the case with the small area occupied at NAB Show 2022 by InviziPro, a company offering what looks to be one of the best grain simulation tools around, alongside a system designed to create laser tracking markers that are invisible to the camera.
What do those two things have to do with one another? Nothing, other than that someone perceived they were useful and nobody else got in the way of them becoming a reality.
InviziTrak is a system that takes a lot of small lasers - the same sort of scale as a laser pointer - and uses them to create trackable features on a green screen (there’s no sign of a blue version for blue screens, but blue solid-state lasers are expensive and the green ones might work fine anyway). That’s not a new idea; people have been clamping laser pens in magic arms for years, and the idea of a camera-synchronised marker has been discussed in the past, particularly by Steadicam operators working on 35mm film, where a spot could be made visible only to the video tap camera. In the same vein, the beauty of InviziTrak is camera synchronisation that ensures the markers appear only on alternate frames.
The idea is that the taking camera shoots at double the desired frame rate with a 360-degree shutter, resulting in one take made of odd (or even) frames with no markers visible, and another made of even (or odd) frames with plenty of readily-trackable markers. The resulting takes have normal motion rendering; it’s not quite as convenient as being able to look at a video tap on a film camera, but it’s a very close digital equivalent (and it gets recorded). It’s something that might need a bit of attention from the relevant VFX software to support natively, but with only minor workarounds it would probably function in more or less every package that already exists.
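To make that concrete, a double-rate capture can be split back into a clean plate and a tracking plate simply by deinterleaving frames. The sketch below is purely illustrative and not part of InviziPro’s workflow; it assumes footage already recorded at double the delivery rate, and the file names and rates are made up for the example.

```python
import cv2

# Illustrative deinterleave of a double-rate capture into two takes:
# one without the laser markers (for the finished shot) and one with
# them (for the tracker). File names and frame rates are assumptions.
SOURCE = "double_rate_capture.mov"   # e.g. 48fps, 360-degree shutter
DELIVERY_FPS = 24.0                  # the rate each resulting take plays at

cap = cv2.VideoCapture(SOURCE)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fourcc = cv2.VideoWriter_fourcc(*"mp4v")

clean = cv2.VideoWriter("clean_plate.mp4", fourcc, DELIVERY_FPS, (width, height))
track = cv2.VideoWriter("tracking_plate.mp4", fourcc, DELIVERY_FPS, (width, height))

index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Even-numbered frames were exposed with the lasers off; odd-numbered
    # frames carry the markers (or vice versa, depending on the sync phase).
    (clean if index % 2 == 0 else track).write(frame)
    index += 1

cap.release()
clean.release()
track.release()
```

Because each half of the pair is exposed for the full double-rate frame period, every frame in the resulting 24fps takes carries the motion blur of a 180-degree shutter at the delivery rate, which is why the motion rendering looks normal.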
Every laser emitter in each box of nine is individually articulated, and there are diffraction grating options to allow them to project crosses, grids of dots, and other features; what’s important, in the end, is that a pattern of recognisable, non-repeating features is created on the trackable surface. One of the beauties of this approach is that it could be used to put tracking markers not only on green screens - as is clearly the intent - but on more or less anything, making even distant and inaccessible objects trackable.
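Why does non-repetition matter? A tracker has to work out which of the features it can see is which. The toy example below - plain NumPy, nothing to do with any shipping tracker - shows that a template cut from a repeating pattern matches equally well in dozens of places, while one cut from a non-repeating pattern matches in exactly one.

```python
import numpy as np

def matching_positions(signal, template):
    """Sum-of-squared-difference template search along a 1-D strip of the scene."""
    n, m = len(signal), len(template)
    ssd = np.array([np.sum((signal[i:i + m] - template) ** 2)
                    for i in range(n - m + 1)])
    return np.flatnonzero(ssd < 1e-9)   # positions that match (near) perfectly

rng = np.random.default_rng(1)
repeating = np.tile([0.0, 1.0, 0.0, 0.0], 32)   # a regular grid of identical dots
unique = rng.random(128)                        # a non-repeating pattern

print(matching_positions(repeating, repeating[40:48]))  # many candidates: ambiguous
print(matching_positions(unique, unique[40:48]))        # one candidate: a solid track
```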
One pace away on the same booth is InviziGrain, a very, very careful implementation of grain simulation. It stands out perhaps most starkly in contrast to the rather elementary attempts that have been made by scanning film and superimposing it over a digital image. InviziGrain does base its work, we’re told, on real film scans, but it’s a lot more sophisticated than a simple overlay. It passes the rule-of-thumb test for film grain: make the grain big enough, and it should disturb a sharp edge in the image into a wobbly line, something a simple multiplicative composite of scanned grain can never do. InviziGrain does.
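To illustrate that rule of thumb - and only the rule of thumb, since InviziGrain’s actual method isn’t published - here’s a toy comparison in NumPy and SciPy. Compositing a grain field multiplicatively just modulates brightness and leaves a hard edge perfectly straight, whereas using the same field to jitter sample positions genuinely bends the edge into the wobbly line big film grain produces. All names and parameters are invented for the demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def multiplicative_grain(image, strength=0.1, seed=0):
    """Naive approach: scanned-style grain composited multiplicatively.
    It modulates pixel values but leaves edge positions exactly where they were."""
    rng = np.random.default_rng(seed)
    grain = gaussian_filter(rng.normal(0.0, 1.0, image.shape), sigma=1.5)
    return np.clip(image * (1.0 + strength * grain), 0.0, 1.0)

def displacing_grain(image, strength=1.5, seed=0):
    """Edge-perturbing approach: the grain field jitters the positions the image
    is resampled from, so sufficiently coarse grain visibly wobbles a hard edge."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    dx = gaussian_filter(rng.normal(0.0, 1.0, (h, w)), sigma=1.5) * strength
    dy = gaussian_filter(rng.normal(0.0, 1.0, (h, w)), sigma=1.5) * strength
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    return map_coordinates(image, [yy + dy, xx + dx], order=1, mode='reflect')

# A hard vertical edge: half black, half white.
edge = np.zeros((64, 64))
edge[:, 32:] = 1.0
flat = multiplicative_grain(edge)   # the edge stays a ruler-straight line
wobbly = displacing_grain(edge)     # the edge becomes a wobbly line
```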
Naturally, there’s also some manipulation of colour, contrast and a little blooming, although some of that might be better thought of as simulating the optical components associated with film, as opposed to the film itself. That, though, would be an ecumenical matter, and an effect just as enthusiastically embraced by competing grunge simulators such as Dehancer. Whose approach we like best is the very definition of a matter of opinion, but your narrator’s opinion is firmly inclined in the direction of InviziGrain.
As our coverage of NAB 2022, and the show itself, draws to a close, it’s worth taking a moment to reflect on exactly this sort of discovery. Yes, we’re all aware that Zeiss has launched a 15mm Supreme Prime, a hugely impressive piece of engineering from a big company whose position on the global stage all but guarantees wall-to-wall coverage. Yes, we know about virtual production; it’s big enough news to leave traces in the archaeological record, and that’s not in any sense to disrespect the work and capability involved. But that’s the sort of thing we can find out about using all of our finest remote-attendance technology. Finding the sleeper hits is a serendipitous process, and not something that can be encapsulated in an ICS file.