VR is something that divides opinion. But combined with the Unreal game engine, it is transformed into a cutting-edge film sequence visualisation tool, complete with virtual cameras. This is the future, now.
Virtual Reality, in conjunction with realtime rendering in game engines, has become the new must-have accessory of cutting-edge production. Its use has advanced from a tool to help pre-visualise select action or visual effects-intensive sequences to being the medium through which filmmakers collaborate in a virtual environment, mixing live action and digital puppetry with CG backgrounds.
Two films this year exemplify the approach. In the wildly kinetic, ultra-cool John Wick threequel, a climactic set-piece takes place in a room high up in the Continental Hotel where the ceiling, floor and interior are all made of glass.
The cinematography by Dan Laustsen echoes Roger Deakins’ work filming a glass-filled room in Skyfall and recalls the house-of-mirrors shootout in John Wick 2 (itself a homage to Orson Welles’ climactic scene in The Lady from Shanghai).
Just to make it even more difficult, in John Wick Chapter 3 – Parabellum there are giant LED screens playing back vibrant colours inside and outside the glass room.
“[Director Chad Stahelski] wanted this idea from the beginning, and we spent a long time talking about how to achieve it,” Laustsen told IBC. “They built the set, about 800 x 400 ft, in a studio. It was really complicated to light, so we shot tests with the big LED screen on the outside. When you have glass surrounding you through 360 degrees you have to be very careful to avoid lights and other equipment appearing in picture, but we had the experience of handling something similar from John Wick 2.”
What differed from JW2 was the use of VR to prep the scene. This included a fully designed 3D version of the glass office that the key filmmakers could view in real-time through VR goggles long before the physical set was constructed.
According to concept illustrator Alex Nice, this virtual version even included proxy fighters, and the ability to mock-fight in VR. This was all built and played back in Epic Games’ Unreal Engine 4.
Being able to virtually walk around a complex set benefited not just the production design but also the stunt-fight co-ordinators and Laustsen, who was able to pre-visualise how to shoot the scene, such as where to place the camera and how to light it, to a far more accurate and realistic degree than before.
The team even built in a virtual camera which allowed for scene capture, depth of field and lens selection within the game engine. What Laustsen saw, and the decisions he made while wearing the VR headset, could be relayed to the crew by displaying the view on a monitor.
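To illustrate what lens selection inside a game engine involves, a virtual camera typically converts a chosen focal length and sensor size into a field of view for the rendered frame. A minimal sketch follows, in plain Python rather than Unreal code; the focal lengths and the Super 35 sensor width are illustrative assumptions, not details of the production’s tool:

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=24.89):
    """Horizontal field of view (degrees) for a given lens and sensor.

    24.89 mm approximates a Super 35 sensor width, used here purely
    as an illustrative default.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Cycling a virtual camera through a set of prime lenses: shorter
# focal lengths give wider fields of view, just as on a real camera.
for lens in (18, 25, 35, 50, 85):
    print(f"{lens}mm -> {horizontal_fov(lens):.1f} deg horizontal FOV")
```

Swapping “lenses” in VR is then just a matter of feeding a different focal length into the same projection maths.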
This technique, however, was an exception for a film which makes a virtue of shooting as much as possible, including star Keanu Reeves’ stunts, in-camera.
Not so The Lion King, which has arguably pushed the boundaries of virtual production further than any project to date. Entirely animated and featuring singing, talking animals, Disney’s feature is designed to look as if it were shot for real as live action.
That meant giving the filmmakers, including director Jon Favreau and cinematographer Caleb Deschanel, as close to an experience of shooting a live action movie as possible for the entirety of the production, but within a VR environment.
Once again, UE4 was the engine of choice into which highly detailed CG landscapes (designed from real Kenyan safari vistas) and pre-built character animation had been ported.
The filmmakers did everything from location-scouting the CG panoramas and blocking scenes with virtual and real actors (whose performances were shot on a stage then translated into animation by facility MPC) to selecting camera angles, shooting multiple takes for coverage and modifying lighting.
Making the shots feel real was all about emulation. The production created physical representations of traditional gear because Favreau believed it would help the film feel like it was photographed, rather than made with a computer. There was an emulated Steadicam rig and an emulated handheld camera rig. There were cranes and dollies. There was even a virtual helicopter, operated by Favreau himself.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” Favreau explains in the film’s production notes. “Even though the sensor is the size of a hockey puck, we built it onto a real dolly and a real dolly track. And we have a real dolly grip pushing it that is then interacting with Caleb [Deschanel], who is working real wheels that encode that data and move the camera in virtual space. There are a lot of little idiosyncrasies that occur that you would never have the wherewithal to include in a digital shot.”
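Favreau’s description of encoded wheels points at the underlying idea: rotations of a physical control are read by an encoder and mapped onto camera motion in the virtual scene. A minimal sketch of that mapping, in plain Python; the counts-per-revolution and gearing values are illustrative assumptions, not the production’s actual figures:

```python
# Assumed values for illustration: a typical quadrature-encoder
# resolution and an arbitrary wheel-to-camera gearing.
COUNTS_PER_REV = 4096       # encoder counts per full wheel turn
DEGREES_PER_REV = 30.0      # camera pan per full wheel turn

def wheel_to_pan(encoder_counts):
    """Map raw encoder counts from a pan wheel to a virtual camera pan angle."""
    return (encoder_counts / COUNTS_PER_REV) * DEGREES_PER_REV

# A quarter turn of the physical wheel pans the virtual camera 7.5 degrees.
print(wheel_to_pan(1024))
```

The same scheme extends to tilt wheels, dolly track position or crane arms: each physical axis becomes an encoder stream driving one axis of the virtual camera.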
The Lion King can claim to be the first time filmmakers walked around as if on a real set – in sync and in real-time using VR – communicating to each other, pointing things out and manipulating things together.
Francesco Giordana, Realtime Software Architect at MPC, calls it “a real milestone to be able to put multiple people into the same space at the same time collaborating this way – where new multi-user workflows meet old school cinematography and filmmaking.”
The significance of this approach is not just the ability to see and manipulate the virtual in realtime, but the marriage of the digital with analogue, conventional film grammar, down to mimicking the tactility of actual camera equipment.
“When we're in VR, it gives you the visceral feeling of being there,” says three-time Oscar winner Rob Legato, the film’s overall VFX Supervisor. “The whole concept of virtual production and virtual cinematography is about imparting your analogue live choices and not the ones that we have to think about for a long period of time. So, you want to have something that you have instant feedback on. If you line up a camera and something moves in the background, you might change your composition based on live input – that you react to immediately without having to think about it.
“The closer the technology gets to imitating real life, the closer it comes to real life. It's kind of magical when you see it go from one state to another and it just leaps off the screen.”
The film’s virtual production extended techniques developed on The Jungle Book. Where Avatar broke ground by giving the filmmakers a window on the VFX world — they could see the CG environment in real time during production as if they were looking at it through the camera’s viewfinder — The Lion King inverts that idea by putting the filmmakers and their gear inside a game engine that renders the world of the film.
Physical devices were custom built, and traditional cinema gear was modified to allow filmmakers to ‘touch’ their equipment – cameras, cranes, dollies – while in VR, letting them use the skills they have built up over decades on live-action sets. They no longer have to point at a computer monitor over an operator’s shoulder.
“That was all hugely valuable to the stunt-vis team,” adds Nice, “because then they can really get a sense of things like the stairs, where the walls were and where things were obscured. It actually becomes a really helpful storytelling and planning tool way ahead of time. Having these guys able to spatially map out the environment that early allowed them to do their magic for the film.”
About half-way through the project, “I heard the cinematographer mention that it would be nice to have a camera in the virtual set to see what the framing would be,” recounts Nice. “So I employed a programmer through an outsourcing website, and over the span of a few months my partner Matan Abel and I developed a tool in Unreal that let you walk around within the scene and then hit a button on the controller, which was basically a camera that let you switch between lenses. It was similar to walking around and seeing it on the monitor, but you were actually in VR holding the camera.”