With AI available to all kinds of creative professionals, what will it mean for filmmakers now - and in the distant future?
I'm all in favour of spell checkers. As a writer and editor, I'm as prone to typos as anyone, and I do get the occasional spelling wrong. A spell checker is nothing but an extra layer of checking, and I can't see any obvious drawbacks.
Recently, I signed up for Grammarly. It's an AI-based cloud service that does a pretty good job of checking your grammar, almost in real time, and with a surprising degree of sophistication. I often disagree with its recommendations, but that's mainly for stylistic reasons: I sometimes deliberately break the rules it's trying to enforce. It's OK to do that because I'm not writing a will or a procurement contract.
The application often senses that something isn't right in my sentences or paragraphs, but it won't be specific. Instead, it defaults to the safe option of "consider rewriting the sentence". Frequently, it will say, "the sentence is written in the passive voice. Consider rewriting in an active voice".
But very occasionally, it takes my breath away, not only by telling me I've written a paragraph that's too complicated for a "knowledgeable" reader but by rewriting it completely. When that happens, I'm more than happy to take its advice.
As a writer, it's natural for me to take a wary approach to this type of tool. Possibly even an adversarial one. No one likes being told they're wrong, but sometimes the advice is spot on, and it would be petulant to ignore it.
Does it affect my style? I hesitate to say this, but, yes, I think it makes it a bit better. It's all too easy to get stuck in an idea when you're writing and lose sight of the fact that someone else will be reading it. I like to think that this doesn't lead to egregiously bad prose, but it can absolutely be improved. I've found that rejecting Grammarly's suggestions often prompts me to think of a better line anyway.
All of this is to preface a much broader discussion about what role AI might play in film production, and by "production", I mean everything from plot ideation to workflow optimisation.
And wider still is the question of "will it still be art?" if the majority of your work is "helped" by AI. Essentially, whose idea will it be? Who takes the credit?
These are big questions that deserve big answers. There isn't space here to do justice to these ideas, but we can at least put them into some kind of perspective.
Let's dive right in and talk about originality. Will AI dilute our ability to write something that no one has penned before? The answer depends on the specific role of AI in the process. And what this boils down to is: "Is the AI a tool - like a hammer - or is it an artistic collaborator?"
To answer this, we have to look at how humans work together. If I decided to write a book by myself but found that I was discussing my ideas with a friend and that that person was giving me their ideas to use alongside my own, then I think you would have to describe them as an artistic collaborator. If the ideas (and the work) were shared equally, you might want to call them a co-author.
I'm sure that Grammarly isn't going to help me come up with the concepts I need to write a screenplay. Not even close. But it might help me get my work into a better state before I show it to anyone. It will spot mistakes and, sometimes, make it read better. Poetry would confuse it, as would most types of dialogue.
But it would learn. An AI screenwriting assistant might start with all the grammar rules, and then it would learn idiomatic speech. What type of "learning"? AI learns by being shown examples. Millions of them.
It would know how people talk. Eventually, it would learn why people talk - why they say things. This would raise the question of whether it has started to understand people. And understanding people is surely a prerequisite for being creative.
Could an AI that's learned to be creative by reading thousands of creative works (or watching films) be original if its ideas come from observations of other people's works?
I'm tempted to say that it could be. Because that's precisely how we do it.
So, where does all this lead? Software can "learn" almost anything, at any scale and complexity. AI is improving faster than any technology that has ever existed.
A few years ago, I wrote that eventually, you'll be able to feed a script into a computer, and the output will be a fully finished feature film. Of course, we can't do that now, and to most people, the idea will seem daft. And it's true: if we can ever do that, it seems a very long way off.
But look at companies like Nvidia, AI labs like Google's DeepMind, and all the other extraordinary developments that we read about daily. Look at the ambition, the talent (and the money!) behind the metaverse. Look at game engines like Unity and Unreal Engine. Look at how almost every technology is converging on the unified and shared 3D space that will be the direct successor to the internet. Look at the nearly infinite resources of the cloud.
Even taking all of that into account, the idea of having a script generate a finished feature film still seems a long way off.
But, somehow, it also seems several thousand times closer.