Virtual Production: How It Will Affect You
The pandemic has supercharged virtual production. But is this new technology for everyone, or just for Hollywood-budget movies?
Virtual production is a catch-all phrase for any production with a virtual or CGI-based element. Until recently, you could have defined it as anything not captured through the lens. But with virtual set technology, even that's no longer true, as live VFX becomes the norm.
But virtual production isn't something to be frightened of. It's actually a rich new avenue for careers in film production. There are now courses at schools like Met Film in London. Just Google "virtual production jobs" and see for yourself what's available.
The Future of Filmmaking
DOP Eben Bolter on the LED volume stage at Rebellion Film Studios in Oxford, UK.
None other than Oscar-winning cinematographer Sir Roger Deakins thinks that in ten to twenty years, most movie-making will consist of CGI-based humans acting within virtual worlds, and you won't be able to tell the difference. Everything else will be made traditionally, with real actors on real sets.
Now, with the help of game engines and their unique way of manipulating polygons, VFX creation of backgrounds is being made before a frame is even shot.
Under the banner of virtual production, you can add volumetric capture, performance capture, and virtual camera inputs such as master wheels, pan-and-tilt heads, and even the feel of a handheld camera.
Thanks to game engines, it's now possible to adjust lighting and switch between different lenses from drop-down menus, which is perfect for cinematographers.
But let's look at virtual sets, or LED volumes. We can see their beginnings in films like Disney's Tomorrowland or Tom Cruise's Oblivion, where back-projection techniques brought high resolution to the backdrops.
DOP Claudio Miranda found that using actual footage rather than green screens allowed a more reflective production design, which suited the actors too, as they could see what they were acting against.
Oblivion used back-projected high-resolution footage shot in Hawaii on RED cameras. For both films, back projection was a sign of what was to come, but it solved only some of the problems of on-set realism. The shots weren't linked to any camera tracking, as they are now, so the backgrounds didn't move as the camera moved, giving them a flat, unreal aesthetic.
On Rogue One: A Star Wars Story, DOP Greig Fraser took effects closer to realism by using LED screens as selective backdrops for scenes like the Millennium Falcon jumping to light speed. He also realized he was using the screens as a light source, so he folded them into his lighting design.
But it was on The Mandalorian that he was able to knit the screens to the camera using Unreal Engine.
Epic's game engine can render footage in near real time, just as it does for video games. Camera tracking then matches that footage to the physical camera, producing correct parallax so that what we see makes visual sense.
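To see why camera tracking matters, here is a minimal Python sketch (not Unreal Engine code; the point values and focal length are illustrative) of parallax under a simple pinhole camera model. When the camera moves, nearby points shift across the screen much more than distant ones, which is exactly what a flat, untracked back projection can't reproduce.

```python
# Illustrative pinhole-camera sketch of parallax (not Unreal Engine code).
# A point's screen position depends on the camera's position, and nearer
# scenery shifts more than distant scenery when the camera moves.

def screen_x(point_x, point_depth, camera_x, focal_length=35.0):
    """Horizontal screen coordinate of a point seen by a camera at camera_x."""
    return focal_length * (point_x - camera_x) / point_depth

# Two scene points at the same starting screen position, different depths.
near, far = (0.0, 2.0), (0.0, 20.0)

# Slide the camera 0.5 units to the right and compare the on-screen shifts.
for (x, z) in (near, far):
    before = screen_x(x, z, camera_x=0.0)
    after = screen_x(x, z, camera_x=0.5)
    print(f"depth {z}: shift = {after - before:.3f}")
```

The near point shifts ten times as far as the point ten times deeper, so a tracked LED volume re-renders the background per camera move to keep this relationship believable.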
The Race for LED Volumes
Under the vast umbrella of Virtual Production, the LED volume set has been the huge success story of the last two years. There’s a land grab to implement such volumes into studios—even camera manufacturers are getting in on the act.
ARRI has built a 708-square-meter volume in the UK with a curved 30 x 5 m video wall. The space is rented out like a traditional studio, with room for hair and makeup, wardrobe, crew dining, and a mezzanine level for viewing the action.
Even traditional studio-hire specialists see an opportunity in LED volumes. AMP Studios in Dallas is planning one as soon as possible and sees it as a company goal.
Garden Studios, a new studio complex in London, added an LED volume studio to its raft of services, calling it ". . . the natural successor to green screen and one of film's greatest technological innovations."
The LED volume side of virtual production is already a billion-dollar business, and projections suggest it will approach $3 billion by 2028.
To fulfill Roger Deakins' vision of the future of filmmaking, other facets of virtual production are advancing fast in sophistication. Volumetric capture isn't the same as an LED volume: it's about creating human avatars in three-dimensional space. It's also found in areas beyond movie making, such as fashion retail and sports demonstrations.
Volumetric shooting uses a huge number of cameras to capture both image and depth information. These cameras need matched lenses and flat, even lighting that doesn't get in the way of the capture.
You then have to edit and, perhaps more importantly, encode the vast amount of data into an adaptive streaming model so everyone can watch it over any kind of internet connection. Technology like LiDAR is now helping with depth mapping.
Dimension Studios, Nikon, and MMRC recently took volumetric capture on the road with their Polymotion Stage in a truck (see the video below). This type of capture concentrates more on recording a 360˚ digital human, a performance capture that you can then do anything with.
Recently, Anayi, a Japanese fashion retailer, used volumetric capture on its website to demonstrate its clothes. From your browser, you pick the clothes you want a model to wear, and the model then walks around to show how the clothes look in motion. You can zoom in at high resolution or pull out for a more general view.
The pandemic has brought this kind of remote fashion buying into focus, and technology has filled the vacuum left by real fitting rooms.
Volumetric capture is also growing fast, managing a 45% growth rate over the last year. There are now eighty studios across forty-five cities, and some also offer an LED volume virtual set service.
Who Gets Final Control of These Virtual Worlds?
When the pandemic first hit, big media studios rushed to virtualize their productions. Game engines let them construct worlds and shoot them virtually, without much location scouting or international flights for their stars: add a flare, change that lens, soften the lighting. Cinematographers feared their skills were being automated, and with good reason.
A production built in a game engine is effectively never picture locked. You can go back and change everything; you can even re-lens scenes if you need to. So, while these new technologies bring excellent benefits, there's a note of caution about who has final control and when a production is actually over.
It makes sense for videographers and content creators to keep an eye on this new world of filmmaking. It's incredibly expensive to enter, but learning Unreal Engine might be a good start.
In this introduction to virtual production, we haven't yet mentioned the Metaverse, which is shaping up to be the next big pastime for us, and maybe the easiest and cheapest way to produce in VR.
Sundance recently had their first documentary made entirely in VR, called We Met in Virtual Reality. See it below.
The Metaverse will be the new Wild West of creativity, and big media corporations will want to align with content owners and creators to build their versions of it. Sony has already announced its intention to work with big soccer clubs to develop worlds for them.
Your second life could be waiting for you.