Director Peter Hyoguchi’s new film Gods Of Mars, currently in production, is the latest to take full advantage of the power and flexibility of virtual production technology. Unreal Engine and virtual set technology have enabled the production team to develop what was intended as a small-scale adventure story into something much more ambitious.
“I was able to expand the story, set pieces and action while keeping the budget low. We see this kind of technology changing the scope of filmmaking and especially film budgeting in ways to create movies better, faster and cheaper.”
Jonathan Schriber, screenwriter
Read the full Variety interview with Peter Hyoguchi below.
What does it mean to fully embrace virtual production – what is different about that than what we’ve seen on The Mandalorian or the photo-real The Lion King?
In the Hollywood landscape, The Mandalorian utilized Unreal Engine to produce “Final Pixel” imagery that was photographed directly in-camera and, in most instances, did not require any additional work before going straight into the edit.
Final Pixel is what the audience sees. The final shot. So, for instance, The Lion King used the Unreal Engine only to plan out its shots, but the Final Pixel shots were all redone using traditional VFX software that takes days to render a single frame of Final Pixel imagery.
Gods of Mars is the first Hollywood production to use real-time rendering of game engine animation/images as finished Final Pixel shots. This is done by our cross-medium creative team, which merges game programmers with traditional matte painters and miniature makers with Hollywood pedigrees, pushing the Unreal Engine to meet quality standards equal to current VFX/animation software.
Why are you at the stage where you can embrace it for Gods Of Mars?
Technically, it wasn’t possible before. Very recently, Epic Games’ Unreal Engine has evolved from a video game creation software platform into a photo-real cinematic tool which is both open source and free. We were able to cross the uncanny valley of reality by creating, scanning and importing live-action practical miniatures into the Unreal Engine.
As a director, working with the Unreal Engine/LED volume is extremely close to the experience of directing actors on location or on set, where we can all see and interact with the environment. Directing a scene with green screens is tedious and awkward because everything has to be imagined, and post-production is expensive, time-consuming and never as good as capturing everything in camera, as in traditional production. I love the freedom and intuitive nature of virtual production.
What does this mean for the future of production? How did it save time and money?
For production, zero company moves provide incredible savings — travel, insurance, permits, housing, etc. No need for building full sets also cuts massive spending from the budget. Not shooting with green screens cuts out massive post-production VFX compositing fees.
All told, this new visual reality approach is running only about 30% of the costs of a typical CGI-heavy production. That is a huge potential saving on a project that normally could cost north of $200 million.
And, how do virtual productions help with COVID guidelines and safety on set?
“Bubbling up” is made possible because an entire movie can be shot on one LED sound stage. Smaller crews also make a COVID-safe environment. Virtual production can cut cast and crew sizes by half so not only is it safer, it’s faster and more efficient than traditional production models.