Virtual production’s fantastic voyage


Steve Jarratt’s 23 August IBC 365 post begins with a concise summation of what virtual production is, for the uninitiated:

“…the combination of physical and digital assets in a real-time filmmaking environment. It ranges from motion capture stages with real-time feedback of digital doubles and virtual worlds, to the latest advances in LED staging, with camera moves synced to computer generated sets.”

And this is closely followed by the reassurance from DNEG’s Executive Producer of Virtual Production Steve Griffith (pictured, above) that,

“… it’s just a visual effects process; another tool in the filmmakers’ arsenal. In-camera compositing is really the best thing to call it.

“So for a DP, we become a lighting tool. For a production designer, we become an extension of the set pieces that they create. And we can, in some ways, compress the visual effects schedule, and provide marketing and editorial with all of that material much further up ahead.”

At the moment it’s about the little wins

To those at the sharp end of movie-making who endure the process, the value of being able to show studio execs something closer to the finished product during production, keeping them satisfied that everything is as it should be, cannot be overstated. That holds even though there are still some trade-offs to be made, especially around ray tracing and live simulations of things like hair, cloth and fluids.

These are GPU-dependent and each new generation of hardware brings the possibility of doing everything on-set and in-camera closer. Then there’s what this means for review and approval, but that’s a more complicated story for another day.

Issues persist around screen resolution, camera positioning, the physical dimensions of the LED stage or volume being used and the complexity of integrating CG environments with physical objects to achieve the desired effect.

“But when the technology isn’t the limitation, then storytellers are going to be able to say: either you’re driving the story, or I’m telling the story, but the medium doesn’t really matter any more. Currently we have this nine-month production schedule, because it has to go linear from assets to rendering and compositing… At some point, that will all be one step. And you just tell your story through a particular camera or cameras.”

Quality and accessibility

For VFX supervisor and second unit director Kevin Baillie (below), improvements in game engine technology along the two, often separate, fronts of quality and accessibility are what is driving virtual production take-up.

“Now, the commercially available tools are so good and so robust that you can have a team of one running a virtual production setup. You can start lean from a team size perspective, and then ramp up as needs be.”

Thanks to the way ray tracing allows shadows and reflections to be viewed and manipulated interactively, the new telling of the ‘Pinocchio’ story was ‘shot’ entirely in Unreal Engine before a camera was even removed from its case.

“Instead of having that video game-looking [imagery], like we were forced into having before this technology, it’s now like what would happen in the real world. And that’s useful from a technical perspective, because if you design a set, and you want the light to interact with it in a certain way, and it looks that way in Unreal, you can say, yeah, it’s going to work more or less that way when I build it for real. So, it’s useful for cinematographers, production designers, and so on.”

The real advantage of this form of filmmaking is that all the creative departments can get involved and see, together, what the end result is going to look like and make adjustments when something’s not right. Eventually, this will begin to completely blur the lines between production, editorial and finishing.

Where does AI fit in?

“First, it’s going to come in terms of things like AI de-noising that helps to make a fast, low quality render look better and help with interaction. But as we continue to delve further into it, these sort of experimental AI things that take a good video game-looking thing and make it look like a photograph, or other things of that nature that transform a rough visual or an input of text into a beautiful image. Those are going to start being integrated into the process as well. And I think that will result in things that we can’t even imagine right now.

“Ultimately, I think that the closer that we can get to shooting actors on a holodeck – for lack of a better word – while retaining flexibility in post… that’s the dream.”

Getting everybody onboard

A shift in production team mindset is going to be needed, to one where decisions and commitments are made in advance and a film is shot in a very premeditated way; otherwise things can get messy and unpleasant. As is often the case, virtual production has been marketed, or at least perceived by some, as a silver bullet that will take all the pain away when creating VFX-heavy content. The process of educating filmmakers about how things actually work is underway, and the technical advances now coming through will only make this easier.

To many, especially those who were involved at the beginning, what is happening in virtual production now feels very much like things did in the late Eighties and early Nineties, when Industrial Light & Magic and The Computer Film Company were literally inventing digital visual effects as they went along, breaking new ground, learning something new, fixing it and moving on until they found the next hurdle to overcome.

“We’re all just still figuring it out and there are so few people who have done it before and who are true experts in the field. We need more access to create the talent pool in order to really fulfil its potential. So I think access and talent pool are things that will probably make a bigger difference than anything else in the future.”

Kevin Baillie
