
Displaying Unreal virtual production smarts on Disney’s The Mandalorian


IBC 365 reports on the groundbreaking virtual production technology used on the Disney+ Star Wars spin-off The Mandalorian. The approach reduced the space required for filming and eliminated the need for location shooting, combining real-time in-camera compositing, giant LED walls displaying dynamic digital sets, a motion-tracking system from Profile Studios, and Epic Games’ Unreal Engine.

Challenging the industry norm for modern sci-fi production, most of the work on The Mandalorian was done in pre-production and on physical sets, albeit much smaller high-tech versions, rather than in post. About 40% of the sets were shot by roving camera crews and then displayed on a 20-foot-high, 270-degree semicircular LED video wall and ceiling enclosing a 75-foot-diameter performance space, called the Volume, where the reflection and refraction of light from the panels bounces off surfaces and behaves as if the scene were being shot for real on location.

When the camera was moved inside that space, the production crew and cast were able to react to and manipulate the digital content in real time, taking advantage of the change of perspective offered by the virtual environment.
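The perspective shift described above can be sketched with simple geometry (a hypothetical illustration only, not the production rendering pipeline): as the tracked camera moves, each virtual scene point must be redrawn at a different spot on the LED wall so that it lines up correctly from the camera's eye, producing true parallax in-camera.

```python
# Hypothetical sketch: why wall imagery must be re-rendered per camera position.
# Model the LED wall as the plane z = 0; virtual scenery lies "behind" it (z > 0)
# and the camera in front of it (z < 0). A virtual point should be drawn where
# the line from the camera eye to that point crosses the wall plane.

def project_to_wall(camera, point):
    """Return (x, y) on the wall plane z = 0 where `point` should be drawn."""
    cx, cy, cz = camera   # tracked camera position, cz < 0
    px, py, pz = point    # virtual scene point, pz > 0
    t = -cz / (pz - cz)   # parameter where the eye->point line hits z = 0
    return (cx + t * (px - cx), cy + t * (py - cy))

# The same virtual point lands on different wall pixels as the camera moves,
# which is the parallax the shooting camera captures.
a = project_to_wall((0.0, 1.5, -4.0), (2.0, 1.5, 8.0))
b = project_to_wall((1.0, 1.5, -4.0), (2.0, 1.5, 8.0))
```

In production this projection is handled by the engine's camera-tracked rendering rather than hand-rolled geometry, but the principle is the same: only the region seen through the camera's frustum needs to be perspective-correct.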

The real-time LED screens were driven by Unreal Engine running on four synchronised PCs. Three Unreal operators simultaneously manipulated the virtual scene, lighting, and effects on the walls. The crew inside the LED volume were also able to control the scene remotely from a tablet, working side by side with the director and DoP.

View the original story on the IBC 365 website for more detail, and see this video interview with VFX supervisor Rob Legato about how NVIDIA GPUs helped shape the real-time production of the new Lion King film, another recent project from Jon Favreau, who is also at the helm of The Mandalorian.
