
How Premiere Pro and After Effects were used to cut ‘Eno’, the first generative film


Brian Eno is not only a musical titan, he’s also an artist still working right at the cutting edge of music and technology, well into his seventies.

The most expensive app I think I’ve ever bought for my phone was made by Brian Eno. Reflection cost about $40 and sequences generative ambient music components together so that the same piece is never played twice. It steps through the seasons with muted graphics and is a great way to relax, to meditate, or just to zone out for a bit.
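Eno has never published Reflection’s engine, but the core idea of generative music — draw from pools of musical elements at random so that no two renderings come out the same — is simple to sketch. Here is a minimal illustration in Python; every layer and sample name below is a hypothetical stand-in, not something taken from the app.

```python
import random

# Hypothetical pools of musical elements. A real generative system
# like Reflection layers far richer rule sets on top of actual audio.
LAYERS = {
    "pad":   ["warm_drone", "glass_swell", "low_choir"],
    "bells": ["bell_a", "bell_csharp", "bell_f", None],  # None = silence
    "noise": ["tape_hiss", "rain", None],
}

def generate_passage(bars=8, seed=None):
    """Pick one element per layer per bar, so each run differs."""
    rng = random.Random(seed)
    return [
        {layer: rng.choice(options) for layer, options in LAYERS.items()}
        for _ in range(bars)
    ]

if __name__ == "__main__":
    for bar, mix in enumerate(generate_passage(), start=1):
        print(bar, mix)
```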

It’s eight years old now, so it’s fair to say I’ve had a good bit of value out of it. And Eno, never one to let the grass grow under his feet, has followed that work by taking part in ‘Eno’, the eponymous biopic billed as the world’s first generative film.

As the Adobe blog explains, director Gary Hustwit and creative technologist Brendan Dawes developed bespoke generative software that sequences a human-coded mixture of scenes, music and interviews to create a film that is never the same twice.

That presented a unique challenge for editors Maya Tippett and Marley McDonald.

“Because no one has ever done this before, there was a laundry list of technical challenges,” says Tippett. “Intellectually, from an editing standpoint, it was incredibly challenging because my co-editor Marley McDonald and I had to reorient the way we were trained to tell a story, which is linear. We had to figure out how to edit in more of a modular and systematic sense. And since there is so much randomness built into the system, Marley and I had to create many more scenes than a typical documentary, so the software had a variety of material to pull from.”
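Neither Adobe’s post nor the editors spell out how the sequencing software works internally, but the modular approach Tippett describes — many hand-cut scenes grouped into pools, drawn on at random within a fixed overall structure — can be sketched. Everything below (the slot labels, scene names and selection rules) is an illustrative assumption, not a detail of Hustwit and Dawes’ actual system.

```python
import random

# Hypothetical scene pools, keyed by the narrative slot they can fill.
# The real system reportedly encodes human-authored rules about which
# scenes, music and interviews can follow one another.
SCENE_POOLS = {
    "opening":   ["studio_intro", "archive_roxy", "garden_walk"],
    "interview": ["on_generative", "on_ambient", "on_collaboration"],
    "archive":   ["bowie_sessions", "u2_sessions", "windows_chime"],
    "closing":   ["sunset_monologue", "final_performance"],
}

# A fixed narrative skeleton with interchangeable modules: the film's
# overall arc holds, but the scenes filling it differ per screening.
STRUCTURE = ["opening", "interview", "archive", "interview", "closing"]

def generate_cut(seed=None):
    rng = random.Random(seed)
    used = set()
    cut = []
    for slot in STRUCTURE:
        # Avoid repeating a scene within a single screening.
        choices = [s for s in SCENE_POOLS[slot] if s not in used]
        scene = rng.choice(choices)
        used.add(scene)
        cut.append(scene)
    return cut

if __name__ == "__main__":
    for screening in range(3):
        print(f"Screening {screening + 1}:", " -> ".join(generate_cut()))
```

One practical consequence of this shape: passing a seed makes any given “screening” reproducible, which is presumably useful when you need to review or debug a particular sequence.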

Tippett cut the film using Premiere Pro, whose Speech to Text feature was critical to sorting through the 500 hours of archive material, while McDonald made animations using After Effects.

“I’ve been working with Adobe software for over a decade and I’ve cut my last few features with it,” says Tippett. “The AI transcription (Speech to Text) is a game changer for searching for specific lines in interviews or archival footage. I also used Frame.io to interface with directors/producers/clients from rough cuts to the final locked cut. It’s a great tool to give and receive notes on, which is a massive part of being an editor.”

“Creating a film that changes every time it’s played was a hugely challenging task. We had to come up with a new system for editing,” adds McDonald. “For me, the big breakthrough was realizing that we could rely on what Brian had to say about his process of creating generative art. One of the most complex scenes to cut was Brian explaining what ‘generative’ means, but once we were able to unpack that idea, we were able to cultivate a deeper understanding of what we needed to do to create the film.”

The movie premiered at Sundance in 2024 and is still on the festival circuit. However, if you want to experience it yourself, it is in the middle of a live streaming weekend with three of six performances left to go: two today, at 6am and 10pm EST, and one tomorrow at 3pm EST. Streams cost $12 to access. Head here for details. And yes, there’s also a season pass available, so you could watch it six times and see a different movie every time.
