by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the February 2020 Digital Cinema Society eNewsletter)
In a DCS tradition maintained for many years, I travel each February out to the California desert near Palm Springs to attend and report on the HPA Tech Retreat. It is a very high-level gathering, now in its 25th year, that brings together a core group of technology leaders to share challenges, solutions, and innovations in motion picture production and post.
There were a few areas of concentration that I found of particular interest this year. One theme revolved around advancements in virtual production technology. Another report looked at the perceptual differences of 8K content compared to 4K. However, what really got my attention this year involved a full-on motion picture production with a major portion created in near-real time, from acquisition through post, with a screening right there at the retreat.
Let’s start with an area of great interest to me, the broad range of computer-aided filmmaking methods known as Virtual Production. These tools grew out of the gaming industry, which requires accelerated graphics processing not only to create games but to play them in real time, and they can enhance creativity and save time compared to traditional linear pipelines. The motion picture industry has been the beneficiary of the mass-market demand created by gamers, which funded research and resulted in much innovation in processing speed and graphic quality.
Now that this technology is starting to be applied to motion picture production, companies like DigitalFilm Tree, Lightcraft Technology, Lux Machina, and Stargate Studios are using the Unity or Unreal game engines (the latter first developed by Epic Games for its first-person shooter, “Unreal”) for VFX. DigitalFilm Tree has used the technology to create virtual worlds where filmmakers can previsualize their scenes at whatever level of photorealism they need. This can range from simple cartoon avatars in a 2D space to fully rendered 360-degree CGI backgrounds derived from the Production Designer’s schematics, pre-photographed background plates, or even Google Maps. With the addition of LiDAR (light detection and ranging) technology, a virtual camera can be moved around in this 3D space to precisely show various angles of view and depth of field based on the data provided. Filmmakers can block and plan their shots down to the lenses and camera placement they want to use.
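To give a rough sense of the math such a virtual camera relies on, here is a minimal sketch of my own (not any vendor’s code) that derives horizontal angle of view and near/far depth-of-field limits from basic lens metadata, using a simple thin-lens model:

```python
import math

def angle_of_view(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view in degrees for a simple thin-lens model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def depth_of_field(focal_length_mm, f_stop, focus_distance_mm, coc_mm=0.025):
    """Near/far limits of acceptable focus (thin-lens approximation)."""
    hyperfocal = focal_length_mm ** 2 / (f_stop * coc_mm) + focal_length_mm
    near = (hyperfocal * focus_distance_mm) / (hyperfocal + (focus_distance_mm - focal_length_mm))
    if focus_distance_mm >= hyperfocal:
        far = float("inf")
    else:
        far = (hyperfocal * focus_distance_mm) / (hyperfocal - (focus_distance_mm - focal_length_mm))
    return near, far

# Example: a 35mm lens at f/2.8 focused at 3 meters on a Super 35-sized sensor
print(angle_of_view(24.9, 35))        # roughly 39 degrees horizontal
print(depth_of_field(35, 2.8, 3000))  # roughly 2.6m to 3.6m of acceptable focus
```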
Lightcraft Technology, the brainchild of longtime DCS member and MIT graduate Eliot Mack, offers a system that provides real-time camera tracking, keying, and compositing for visual effects. A camera can shoot against greenscreen, for example, while showing the filmmakers the composited image in real time. Geometric orientation and spatial location details, along with lens metadata, are fed into a computer that moves the virtual background in relation to the live action. I’ve used this amazing system myself on a TV pilot, and since I love the freedom of movement that a jib arm provides, it was a no-brainer for me. I was able to zoom, focus, and do any manner of live, physical camera movement, even handheld, and the background followed in lockstep. When the foreground focus changed or the camera zoomed, the depth of field and angle of view in the background images also followed in real time.
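For readers curious how tracking data and lens metadata can drive a virtual background, here is a schematic sketch of my own, not Lightcraft’s actual code, showing how a tracked camera pose and lens/sensor values translate into the view and projection matrices a background renderer needs every frame:

```python
import numpy as np

def projection_matrix(focal_length_mm, sensor_w_mm, sensor_h_mm, near=0.1, far=10000.0):
    """OpenGL-style perspective projection derived from lens and sensor metadata."""
    fx = 2 * focal_length_mm / sensor_w_mm   # cot(horizontal FOV / 2)
    fy = 2 * focal_length_mm / sensor_h_mm   # cot(vertical FOV / 2)
    return np.array([
        [fx, 0.0, 0.0,                             0.0],
        [0.0, fy, 0.0,                             0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0,                           0.0],
    ])

def view_matrix(world_to_camera_rotation, camera_position_xyz):
    """World-to-camera transform built from a tracked camera pose."""
    R = np.asarray(world_to_camera_rotation)
    view = np.eye(4)
    view[:3, :3] = R
    view[:3, 3] = -R @ np.asarray(camera_position_xyz)
    return view

# Each frame: the tracker reports pose, the lens encoder reports focal length,
# and the renderer draws the virtual background with the matching matrices.
```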
Another longtime DCS member presenting at the Tech Retreat was Sam Nicholson, ASC. Sam is the CEO of Stargate Studios and has served as Visual Effects Supervisor on productions ranging from Star Trek: The Motion Picture to The Walking Dead and Ray Donovan. He uses Unreal Engine technology along with camera/lens metadata to drive photorealistic CGI or photographed background plates displayed on high-resolution monitors. The subject is then photographed in real time against that background as an alternative to green/blue screen. The canvas need be no bigger than the display positioned behind the talent, but the background’s perspective can be computationally manipulated to create huge worlds that move and can interact with the camera to create virtual movement.
Lux Machina has also developed in-camera VFX techniques utilizing Unreal Engine to create organic, realistic virtual sets, complete with reflections, using immersive LED or projected environments. Both Lux Machina and Stargate Studios participated in the making of the film that was largely created right at the Tech Retreat (much more on that to come).
I also want to report on a presentation by Warner Bros. VP of Technology Michael Zink on the perceptual differences between 4K- and 8K-originated content. It confirmed what I have long believed: in short, that it is hard to tell the difference. In side-by-side, blind comparisons, over a hundred test subjects watched randomly mixed short clips that were either native 8K or 4K derived from the 8K. They were seated five and nine feet from an 88-inch consumer 8K OLED TV, viewing test material that included Tick, Brave, A Bug’s Life, 8K film scans of Dunkirk, 8K animation, and uncompressed 8K RED digital footage.
According to Zink, “Test results showed that increasing resolution from 4K to 8K, under typical viewing conditions, did not result in significantly improved visual difference. We also learned that perceptual difference is highly content dependent.” That is not to say that there are no benefits to acquiring at a higher resolution than the one you are finishing for; these include stabilization and reframing options. But the viewer can’t really discern the difference between an 8K master and a 4K master, even under ideal viewing conditions.
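A quick back-of-the-envelope calculation helps explain why. The sketch below, my own numbers rather than anything from the study, estimates how many pixels per degree of visual angle an 88-inch 16:9 display delivers at those two viewing distances, using the common rule of thumb that 20/20 vision resolves about one arcminute, or roughly 60 pixels per degree:

```python
import math

def pixels_per_degree(diagonal_in, horiz_pixels, viewing_distance_in, aspect=16/9):
    """Pixels subtended per degree of visual angle at a given viewing distance."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / horiz_pixels
    deg_per_pixel = math.degrees(math.atan(pixel_pitch_in / viewing_distance_in))
    return 1 / deg_per_pixel

for distance_ft in (5, 9):
    d = distance_ft * 12
    print(f"{distance_ft} ft: "
          f"{pixels_per_degree(88, 3840, d):.0f} ppd at 4K vs "
          f"{pixels_per_degree(88, 7680, d):.0f} ppd at 8K")

# A 20/20 viewer resolves roughly 60 pixels per degree (about one arcminute),
# so most of the extra detail 8K adds falls beyond that limit at these distances.
```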
That will not stop the 2020 Tokyo Olympics from being broadcast in 8K (at least in Japan). With the support of the Japanese government, NHK has developed a complete end-to-end 8K broadcast pipeline, including its own 8K camera and satellite systems. It is a novel science experiment, and I give them credit for pushing technological boundaries, but I really hope it does not soon become a mandate to shoot and finish in 8K (Netflix, please don’t get any ideas).
Speaking of experiments, now let me tell you about a very novel one dreamed up by Joachim “JZ” Zell. JZ is VP of Technology for Deluxe/EFILM, Vice-Chair of the Motion Picture Academy’s ACES project, an HPA board member, and chair of this year’s Tech Retreat Supersession. I should also mention that JZ is an organizer of the “HBA,” the Hollywood Beer Alliance, a regularly scheduled but less formal get-together where motion picture production and post professionals meet over lunch and a beer at a local Hollywood pub to socialize and share technology war stories.
The Supersession is a tradition at the HPA Tech Retreat in which a full day of presentations is grouped around a single unifying technology theme. Past Supersessions have covered everything from VR to “snowflake” (no two alike) workflows. This year JZ, wearing his many, many hats, decided to produce a short film, with much of it shot and posted in real time during the Supersession. It was designed to be a very collaborative project, bringing many disciplines and vendors together to work toward a common goal, really not that much different from many modern film and TV productions.
It was shot on cameras including the Sony VENICE, ARRI ALEXA, RED MONSTRO, Panavision DXL, Blackmagic URSA 4.6K G2 and Blackmagic Pocket Cinema Camera with lenses from Zeiss, Sigma, and Panavision, HDR monitoring care of Canon, wireless via Teradek, and lighting from Rosco, Kino Flo, and Dedolight. As mentioned, Virtual Production services were provided by Stargate Studios, Lux Machina, and Epic Games’ Unreal Engine, with Mark Bender Aerials providing a couple of nice drone shots.
Post production brought even more companies into the mix, including Avid, Blackmagic’s Resolve, BeBop, Colorfront, Filmlight, Skywalker Sound, Ownzones, Red Bee, Pomfort, Sohonet, The Foundry’s Nuke, and both Amazon Web Services and Microsoft Cloud. EFILM, JZ’s day job, lent their support, as did SIM Digital, where Supersession co-chair Paul Chapman is VP of Engineering and Technology. However, it was Frame.io that served as the glue holding the whole project together, connecting virtually every participant and company through its platform, which handled everything from RAW files and dailies to conform and edits, with Michael Cioni serving as the “Cloud Master” (did I just make up a new industry job title?).
JZ used his charm and considerable HBA connections to put together a stellar crew, including four ASC cinematographers: Steven Shaw serving as director, Roy Wagner as 1st Unit DP, Peter Moss operating camera, and Sam Nicholson as VFX Supervisor. Barry Goch and Timothy Schultz served as editors, with Steve Morris and Bill Rudolph of Skywalker Sound, Joe DiGenerro of the Academy’s Sci-Tech Council, and Loren Simons of RED among the many other production and post industry experts called into service.
In real-world filmmaking, story should drive the technology, but in this case technology was in the driver’s seat. The goal was a proof of concept: using an ACES color pipeline to bring together many disparate cameras and formats, including 4K, 6K, and 8K, while shifting between SDR and HDR in a cloud workflow, with collaborators simultaneously working in L.A., Northern California, and Palm Desert.
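To illustrate what that unification involves, here is a hypothetical shot record of my own devising (the field names and transform descriptions are shorthand, not any production database) showing the kind of metadata an ACES-based pipeline has to carry so that mixed cameras land in one common grading space and output target:

```python
from dataclasses import dataclass

@dataclass
class Shot:
    camera: str            # source camera
    resolution: str        # "4K", "6K", or "8K", all conformed to one master
    idt: str               # camera-specific Input Device Transform into ACES
    working_space: str     # common grading space shared by every shot
    output_transform: str  # SDR or HDR rendering applied at delivery

shots = [
    Shot("Sony VENICE", "6K", "S-Log3 / S-Gamut3 IDT",          "ACEScct", "Rec.2100 PQ HDR"),
    Shot("ARRI ALEXA",  "4K", "LogC / ARRI Wide Gamut IDT",     "ACEScct", "Rec.2100 PQ HDR"),
    Shot("RED MONSTRO", "8K", "Log3G10 / REDWideGamutRGB IDT",  "ACEScct", "Rec.2100 PQ HDR"),
]

# Because every source is first transformed into the same ACES working space,
# one grade and one output transform serve all three cameras, and an SDR
# version only requires swapping the output transform, not regrading.
```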
Since story was not the focus, JZ, who is German and a beer aficionado, had a little fun with the premise. Instead of a script, during an online virtual production meeting he created an animatic storyboard built around locations that were available to him, including his favorite pub and the ASC Clubhouse. The story grew from there with input from the many collaborators, and the volunteer actors made up their own dialogue.
Thus “Lost Lederhosen” was born, and production started in earnest, first at the Clubhouse, then at Jameson’s Irish Pub, and on to the Lux Machina stage, where a Bavarian forest served as the background for a German beer garden. The premise that evolved had JZ’s German beer garden buddies traveling to America to be reunited for a grand finale that was shot in front of some 700 Tech Retreat attendees, who also served as extras, toasting with beers in hand while singing German drinking songs (you had to be there).
Earlier that same day, Sam Nicholson set up his ThruView system on the presentation stage in front of the conference attendees to create a scene that simulated being on a moving train. It was a three-camera shoot integrating a previously photographed moving background plate, tied together using Unreal Engine, thus eliminating the need for greenscreen. Besides being locked to the Unreal Engine, the RED, ARRI, and Blackmagic cameras were also fed into a Resolve grading system so they could be color corrected, with ACES IDTs and CDLs applied as metadata while they were being recorded. It was quite a complex scene to shoot “live,” but after several takes, mostly for performance, they pulled it off without a hitch. They were literally on and off the stage in a total of only about 20 minutes.
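For those unfamiliar with CDLs, the ASC CDL is a deliberately simple, portable grade: per-channel slope, offset, and power, plus an overall saturation, which is why it can travel with a shot as metadata rather than baked-in pixels. Here is a minimal sketch of that math in Python, my own illustration rather than the Resolve implementation:

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply ASC CDL slope/offset/power per channel, then overall saturation."""
    rgb = np.asarray(rgb, dtype=float)
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])   # Rec.709 luma weights
    return luma[..., None] + saturation * (out - luma[..., None])

# Example: a mild warm-up and slight desaturation on an 18% gray pixel
print(apply_cdl([0.18, 0.18, 0.18],
                slope=[1.05, 1.0, 0.95],
                offset=[0.01, 0.0, -0.01],
                power=[1.0, 1.0, 1.0],
                saturation=0.9))
```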
Of course, this could all have been done in a more traditional fashion, shooting greenscreen and creating the composites later in post. However, it would have taken a lot longer and would not have afforded the Director the visual feedback and confidence to know he had it in the can. The beauty of this kind of system, as explained later by Sam Nicholson, ASC, is that it brings the post production process forward into preproduction and production, vastly shortening the time to completion once the cameras cut. In this case, it was only a few hours until the entire film, including this elaborate VFX scene, was projected for the audience.
Also playing a part in compressing the post process was the cloud collaboration technology managed by Frame.io. Literally as soon as the cameras cut, the shots were uploaded to the cloud via Teradek wireless, where the editor, working back in L.A., and the Post Sound team, working up at Skywalker Ranch, were able to get right to work. Meanwhile, a VFX artist was adding a sign to a building while chatting with JZ on FaceTime and sharing her desktop, all projected on the big screen so the audience could follow along. Final credits and a behind-the-scenes video were being created simultaneously, and the Director and DP were supervising the final color grade in the back of the room even as the editor was polishing the cut.
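As a thought experiment, the core of that “upload as soon as the camera cuts” idea can be sketched as little more than a watch folder. The snippet below is purely illustrative; the transfer call is a stand-in for whatever tool or API a real production (Frame.io’s own tooling, in this case) would actually use, and the folder path is hypothetical:

```python
import time
from pathlib import Path

def upload_to_shared_storage(clip: Path) -> None:
    """Placeholder for the actual transfer to shared cloud storage."""
    print(f"uploading {clip.name} ...")

def watch_card_folder(folder: str, poll_seconds: float = 2.0) -> None:
    """Poll a camera-card folder and push each new clip as soon as it appears."""
    seen = set()
    while True:
        for clip in sorted(Path(folder).glob("*.mov")):
            if clip not in seen:
                upload_to_shared_storage(clip)  # editorial, sound, and VFX see it within moments
                seen.add(clip)
        time.sleep(poll_seconds)

# watch_card_folder("/Volumes/CAMERA_MAG")   # hypothetical mount point
```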
The demonstrated benefit of working in the cloud is that while the content resides in one central location, numerous collaborators can access it at the same time with any number of applications, from editorial and VFX to sound and color. As soon as the cut is locked, it can be turned over to another service that handles versioning and distribution, which in this case was managed by Red Bee Media.
The amazing thing is that it all worked. The movie will not be in contention for any Best Picture awards, but perhaps it should be considered for a Sci-Tech Award. It proved many points, and it was highly educational to follow the process. I give not only JZ but the whole team and the HPA a lot of credit for taking the chance and flying without a net. I can’t wait to see what they dream up for the 2021 Tech Retreat. In the meantime, let’s raise a glass to the HPA.
The finished production is not currently available to view, but to view a behind-the-scenes video visit:
For a more in-depth description of how DigitalFilm Tree is applying Virtual Production technology, see Ramy Katrib’s interview from the 2019 DCS Post Expo: