DIT stop. At anything other than the very top end of dramatic filmmaking, are digital imaging technicians really needed?
A long time ago, if we wanted to know whether the shot was adequately framed, whether it was sharp, and whether the microphone had wandered into frame, we asked the only person who could accurately see the picture: the camera operator, the person with an eye to the viewfinder. When video assist cameras became small and relatively affordable, of course, it became possible (at the cost of dimming the operator's view fractionally) for everyone on set to look at a soft, flickery, monochrome image. That image might contain, if we were lucky, some vaguely visible scratch marks indicating the outline of the recorded frame. Almost immediately, many people, from the director down, retreated from immediate interaction with the cast and crew to one of those stylish pop-up chairs behind a wall of monitors.
Add a few decades' worth of technological advancement and we arrive at a point where people can wander around with tablet computers and smartphones, transcoding, grading and viewing material to their hearts' content in a way that would have required whole buildings full of people for most of the twentieth century. A whole new specialism has arisen, that of the digital imaging technician (DIT), along with a new truckload of equipment (albeit mainly built from commodity information technology products) for the DIT to operate.
The job now referred to by that acronym arises, if we consider recent history, from several directions. The first DITs were probably the people responsible for adjusting the colorimetry settings of cameras like Sony's F900, possibly the first product to see widespread adoption as a digital cinematography device. That work was necessary because the comparatively narrow dynamic range of those cameras, and particularly their eight-bit, heavily compressed recordings, demanded a more sophisticated approach on set than most directors of photography had time to implement on their own.
Another proto-DIT task was something that would now probably be considered the work of a digital loader, back in that narrow period in the mid-2000s when really high-quality recordings required a refrigerator-sized block of hard disks and some sort of (frequently magnetic-tape-based) long-term storage solution. Throughout all this, DITs were frequently responsible for all the things they do now: calibration and configuration of displays, extraction of stills, and even a bit of transcoding here and there.
One rather more controversial job, key to digital cinematography practitioners of all ranks ever since the field became practical, has been managing the concerns of production staff. From the very moment the F900 went into use in feature filmmaking, digital cinematography was capable of vastly greater reliability than 35mm photochemical film, simply because HDCAM tapes could be cloned via SDTI, whereas it isn't possible to make an identical duplicate of an analogue recording.
Fear of the unknown is understandable when nine-figure production budgets rest on the ability of the technology to deliver, although the film industry's discomfort with information technology was probably never rooted in any real technical concern, at least from the risk-of-failure point of view. That rather irrational concern persists to this day, when production technology can be held to a level of reliability vastly in excess of anything film was ever capable of. Nevertheless, part of the job of a DIT remains the management of both fears and expectations.
Concern over the creeping integration of traditionally post-production tasks into the front end of production work is not particularly new. Completist directors of photography might currently involve both a DIT and a digital loader in the camera department, and there's no question that the facilities thus provided are useful. There will always be a need for someone to offload material, and there's absolutely nothing wrong, in principle, with being able to audition grades and thus validate camera and lighting decisions on set. The ability to really see what's going on, on high-grade monitors showing a close approximation of the director of photography's intent, is overwhelmingly a good thing.
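That offloading, incidentally, is conceptually simple: copy the files, then re-read both the original and the copy and compare checksums. Here's a minimal sketch in Python of that verify-after-copy approach; the mount points are hypothetical, and the dedicated offload tools add things this sketch doesn't, such as media hash list reports and multiple simultaneous destinations.

```python
import hashlib
import shutil
from pathlib import Path

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the MD5 digest of a file, read in 1 MB chunks."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verified_offload(card: Path, backup: Path) -> None:
    """Copy every file from the card to the backup volume, then
    re-read both sides and compare checksums before trusting it."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        dst = backup / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if file_hash(src) != file_hash(dst):
            raise IOError(f"Checksum mismatch copying {src}")

# Hypothetical mount points, for illustration only.
verified_offload(Path("/Volumes/A001"), Path("/Volumes/SHUTTLE_1/A001"))
```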
Indeed, all of this is good, within limits. Those limits are defined, though, by the fact that additional facilities inherently imply additional complexity. Filmmaking, at anything other than the very highest of the high end, is almost ludicrously time-pressured. The fact that a cameraman with better monitoring can potentially make better decisions, leading to a better-looking show, must be balanced against the reality of a given number of setups to achieve per day. The approach of those nine-figure productions is not a model for everyone; certainly, the overwhelming majority of filmmaking is not done at a level which allows the director of photography time to make grading decisions on set, or at least to do so with a reasonable degree of consideration.
This is true even before we consider the much wider range of tasks a DIT might now be expected to perform. For instance, it's increasingly common for offline versions of files to be created on set: taking a raw or high-bitrate camera original and converting it to a lower-bandwidth version for the edit, possibly superimposing burn-ins or predefined grades along the way. It's hard to argue that this is an essential part of the on-set workflow; it can be done elsewhere, or later. Indeed, if our goal is to minimise the complexity of the already hopelessly complex atmosphere of a film set, such tasks should be done later.
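To put the scale of that work in perspective, the conversion itself is routine IT. What follows is a sketch of the general idea using ffmpeg, with hypothetical filenames and a hypothetical .cube LUT standing in for the show's grade; the timecode burn-in assumes an ffmpeg build with drawtext support, and a real workflow would wrap this in batch handling and naming conventions.

```python
import subprocess

# Hypothetical clip and LUT names, for illustration only.
source = "A001_C003.mov"
offline = "A001_C003_offline.mov"

cmd = [
    "ffmpeg", "-i", source,
    # Apply the show LUT, then burn a timecode window into the frame.
    # drawtext needs an ffmpeg built with libfreetype; some builds
    # also want an explicit fontfile= option here.
    "-vf", ("lut3d=show_grade.cube,"
            "drawtext=timecode='01\\:00\\:00\\:00':rate=24:"
            "fontsize=36:fontcolor=white:x=40:y=h-80"),
    # ProRes proxy: a low-bandwidth codec most edit systems accept.
    "-c:v", "prores_ks", "-profile:v", "0",
    "-c:a", "copy",  # pass any audio through untouched
    offline,
]
subprocess.run(cmd, check=True)
```

None of which is to say the job is trivial to do well, but it is exactly the sort of work that can happen overnight in a machine room rather than on a cart at video village.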
Objections to the idea of deferring that work revolve around the fact that post-production facilities frequently charge rather eyebrow-raising rates for file conversion work which would, in any other area of IT, be given to an intern on a low-cost workstation. Excessive caution about the real level of complexity involved here is costing production companies money. As to when and where the work is done... well, sure, editorial assistants may misunderstand the processes involved in producing offline files with LUTs and burn-ins, but that's a competence problem which could just as easily affect a DIT. The edit suite remains a vastly better place to do that sort of work.
The complexities introduced by these highly comprehensive on-set workflows were, for a while, a barrier to the adoption of digital cinematography. Even now, it's common to hear film defended on the basis that the extra crew, equipment and post-production services required for digital origination introduce undue cost and complexity. The reality, of course, is that no extra crew, equipment or services are actually required. Some of them may be desirable, but none are formally necessary, and other alterations to the workflow, such as selecting a monitoring LUT in a camera, can reasonably be considered part of the first assistant's job.
These thoughts are likely to provoke howls of protest, both from DITs and from the people who are convinced they're required. Senior camera personnel, perhaps particularly those whose formative experiences predate digital acquisition entirely, can feel dependent on a competent DIT to ensure nobody is alarmed by an odd-looking monitor image. And, at a suitably high level, that may be fine. It's also not my intention to cost anyone any work: most of what DITs do certainly has to be done, just perhaps not on a Magliner in the corner of the soundstage. At anything other than the top end of dramatic filmmaking, though, it's a mistake to assume that a whole extra member of the camera department, with three MacBooks and a brace of OLED displays, is an absolute necessity.
It's not.