There is a tectonic shift taking place in the process of making motion pictures, perhaps as significant as the introduction of sound, color, or the transition from celluloid to digital capture. The technology has been advancing at an ever-increasing pace, and now the pandemic is acting as a catalyst to speed up its adoption. What I’m writing excitedly about is the dawn of Virtual Production.
In 2008, just as the technology for digital acquisition was getting to the point where many considered it just barely acceptable as an alternative to shooting on film, television producers were threatened with a Screen Actors Guild strike. SAG had jurisdiction over projects shot on film, but another actors union, AFTRA, covered projects shot on electronic cameras. Not wanting to be hung out to dry with actors walking out on their shows once they got into production, producers signed contracts with AFTRA instead and shot all of that year’s pilots digitally. The strike never happened, but SAG saw the error of its ways and ultimately merged with AFTRA so that the unions could not be played against each other. However, almost overnight, virtually all of the narrative network series made the switch from film to digital.
Once the technology was proven on TV, it gained a growing foothold in feature production, where shooting on film is now the exception rather than the rule. That’s all history, and for better or worse, water under the bridge, but I bring it up because I believe the current pandemic and its social distancing requirements will serve in a similar way to advance the use of virtual production techniques.
I wish I could be more optimistic, but considering the restrictions of Covid-19, I don’t think we are going to get back to “normal” production any time soon, and possibly we never will. However, there will be a new normal, and virtual production will be a big part of that landscape. As the industry searches for safe new methods for returning to production, including techniques that allow for social distancing on set and avoid large crews on location, the technology for virtual production has been advancing at a breakneck pace.
These tools grew out of the gaming industry, which requires accelerated graphics processing not only to create games but to play them in real time, and they can now be applied to motion picture production. The mass-market demand created by gamers allowed for investment in research and resulted in much innovation, which the motion picture industry can now take advantage of.
Graphics processing units (GPUs) and artificial intelligence (AI) have evolved to create new possibilities for real-time rendering at very nearly photo-realistic quality. Ray tracing technology allows computers to emulate the way light works in the real world. Instead of creating pre-designed, or “baked-in,” lighting for CGI scenes, the simulated path of the light is traced (or rather, millions of simulated light paths). Virtual lights can be created, and as the light bounces off objects, it moves and interacts with their properties. For example, if it bounces off a glossy green surface, its hue changes. The resulting image improvement, while it may seem subtle, is what separates fake-looking CGI from that which is truly believable. This technology has been like a junior varsity player waiting on the bench for the chance to play first string, getting better and better and eager to prove itself. That time has now come.
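For readers curious about the mechanics, here is a toy sketch (not production code, and greatly simplified compared to a real renderer) of why bounced light picks up surface color in a ray-traced image: at each bounce, the ray’s color is attenuated by the surface’s reflectance, so white light bounced off a glossy green wall arrives tinted green.

```python
def bounce(light_rgb, surface_albedo_rgb):
    """Tint and attenuate a light ray by the surface it bounces from.

    Each channel of the incoming light is multiplied by the surface's
    reflectance (albedo) in that channel -- the core of why ray-traced
    bounce light carries surface color into the rest of the scene.
    """
    return tuple(l * a for l, a in zip(light_rgb, surface_albedo_rgb))

white_light = (1.0, 1.0, 1.0)   # neutral virtual light source
green_wall = (0.2, 0.8, 0.2)    # glossy surface that reflects mostly green

tinted = bounce(white_light, green_wall)
print(tinted)  # (0.2, 0.8, 0.2) -- the bounced ray is now green-shifted
```

A real path tracer repeats this multiplication across millions of rays and many bounces per ray, which is exactly the workload modern GPUs have been optimized to handle in real time.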
Allow me to paint a picture of what virtual production in this new normal might look like. Very small teams will be sent out to locations all over the world to capture background elements, including motion plates at very high resolutions, along with ambient sound. The picture elements, mixed with CGI, will be used to create computerized 360-degree 3D models of various environments in which foreground action can be set, and the sound may be added into the final mix.
Back in the studio, skeleton crews controlling cameras, jibs, and sound booms, working remotely from monitors in separate rooms close to the set, will capture the actors, perhaps even one at a time when the scene’s coverage permits.
The background will be displayed on a series of stitched high-resolution video panels driven by a computer gaming engine such as Unreal Engine. (The name derives from Epic Games’ first-person shooter Unreal, introduced in the late 1990s, whose engine went on to power a multitude of game platforms.)
For motion picture production, the camera output, including picture along with crucial real-time lens and geospatial metadata, is fed into the computer to seamlessly place the actors into the background environment. When the camera moves, the background moves in sync, and likewise when the lens changes focus or zooms. The portable walls of stitched LEDs can also be moved around and used as a light source so that the lighting perfectly matches the environment, down even to reflections, and the engine can also drive motion picture LED lights for color and intensity match via DMX, all in real time.
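The data flow just described can be sketched in a few lines. Everything below is a hypothetical illustration, not any particular vendor’s API: per-frame camera and lens metadata inform the virtual background, while a sampled environment color is converted to the 8-bit channel values that the DMX lighting protocol uses (real pipelines typically carry this over standards such as FreeD for tracking and Art-Net for DMX).

```python
from dataclasses import dataclass

@dataclass
class CameraFrame:
    """Hypothetical per-frame tracking metadata sent to the render engine."""
    pan_deg: float    # camera pan angle
    tilt_deg: float   # camera tilt angle
    focal_mm: float   # current lens focal length
    focus_m: float    # current focus distance

def rgb_to_dmx(rgb, intensity=1.0):
    """Map normalized RGB (0..1) and an intensity to 8-bit DMX channel values.

    DMX512 carries one byte (0-255) per channel, so each color channel is
    clamped to [0, 1], scaled by intensity, and rounded to an integer.
    """
    return [round(max(0.0, min(1.0, c)) * intensity * 255) for c in rgb]

# One frame of tracking data: the engine would re-render the LED wall
# background from this camera pose, focal length, and focus distance.
frame = CameraFrame(pan_deg=12.5, tilt_deg=-3.0, focal_mm=35.0, focus_m=4.2)

# The same engine state drives the on-set fixtures: a greenish ambient
# color sampled from the virtual environment becomes DMX channel values.
ambient = (0.2, 0.8, 0.2)
print(rgb_to_dmx(ambient))  # [51, 204, 51]
```

The point of the sketch is the synchronization: background parallax, focus, and fixture color are all derived from the same per-frame state, which is what lets reflections and spill light on the actors match the virtual environment.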
These virtual sets are a big improvement over traditional greenscreen, where the backgrounds are usually added in after the fact. However, the LED walls can instantly show a solid green background if that technology better matches the needs of the shot (perhaps the background has yet to be created). With virtual sets, not only can the filmmakers see what they are getting in real time, but the actors can also see the environment to which they are reacting. It will undoubtedly take some adjustment for actors to feel completely comfortable with these techniques, but the ones who cannot adapt may see their careers fade, just as happened to those unable to make the transition to talkies when sound came to motion pictures.
It will definitely not help the work prospects of professional background artists. Once they are captured (perhaps modeled via 3D full-body scanners), they may be manipulated and duplicated at the filmmaker’s will and added into any number of future scenes. Instead of calling central casting, digital artists creating the environments will go to a stock image library of extras and select the number and appropriate period dress that the scenes call for.
This may seem like some wild prognostication, but in reality these kinds of production techniques are already in use. Our own Sam Nicholson, ASC, with his Stargate Studios, is a leading pioneer in this area, as are Lux Machina, Digital Domain, Weta Digital, and the team behind the Disney+ series The Mandalorian, which includes ILM.
Meanwhile, digital artists working in their home studios are honing their skills and learning how to create amazing virtual environments. Unlike some past technology breakthroughs, the folks leading the charge for virtual production seem to be working cooperatively to advance this filmmaking science and are openly sharing much of their knowledge to allow others to enter the field.
Epic Games, the company behind Unreal Engine, recently announced that it has created paid fellowships to mentor creatives in virtual production. According to its website, “The Unreal Fellowship is a 30-day intensive blended learning experience designed to help experienced industry professionals in film, animation, and VFX learn Unreal Engine, develop a strong command of state-of-the-art virtual production tools, and foster the next generation of teams in the emerging field of real-time production.”
The Digital Cinema Society intends to closely follow these emerging production techniques that, together with the limitations of the pandemic, are causing a major metamorphosis to occur in the motion picture industry. In fact, we are in the planning stages of a documentary project to gather the latest updates to this transformative technology. Welcome to the future.