by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the February 2022 Digital Cinema Society eNewsletter)
It’s Great To Be Back – DCS Explores the 2022 HPA Tech Retreat
In a DCS tradition maintained for many years (with the exception of 2021, when the event was held virtually due to pandemic restrictions), I have traveled each February out to the California desert near Palm Springs to attend and report on the HPA Tech Retreat. It is a very high-level gathering, now in its 27th year, that brings together a core group of technology leaders to share challenges, solutions, and innovations in motion picture production and post. It gives me a forward-looking chance to recognize and analyze future trends that will affect our members and the industry at large.
There were several areas of concentration that I found of particular interest this year, including Virtual Production, the integration of AI and cloud technologies, and the use of motion picture cameras in live production. There was also an unintended theme that kept occurring to me: the increasingly heavy influence of gaming on our visual language, which seems to permeate many other forms of entertainment.
It would be hard to beat the practical demonstration of virtual production and cloud workflows put together for the 2020 Tech Retreat. “Lost Lederhosen,” a full-on, high-production-value motion picture, was finished in near-real time after an elaborate scene set on a moving train was shot right inside the conference room, featuring the entire audience as extras. However, that didn’t stop this year’s organizers from trying to top it, and perhaps they did.
There were not just one but three separate and complete virtual production volumes constructed inside the main conference room, combining technology from most of the major manufacturers in the industry. Cameras were supplied by ARRI, Sony, and RED, with volumes by AOTO, Planar, and Sony fed by both the Unity and Unreal game engines. But the Tech Retreat is about a lot more than just eye candy; the presentations by experts in the field and the panel discussions are the most valuable part. So, allow me to run through some highlights from the four days of content.
The first day’s session was curated by DCS member Mark Chiolis and explored the use of cine cameras in multi-camera live production. It was a deep dive into the challenges and benefits of using larger single-sensor cameras in a world that was once dominated by 2/3” broadcast cameras and lenses. Separate panels discussed the production of concerts, award shows, and the 2022 Super Bowl game telecast, including members of the team responsible for this year’s game coverage, with another panel on the impressive, high-production-value halftime show. Camera and lens manufacturers shared their perspectives, and Bill Bennett, ASC, described his process in producing the last live ASC Awards using ARRI S35 cameras. Overall, it was demonstrated that, although there are some obstacles to overcome, cinema cameras and techniques can be successfully integrated into such productions.
It was a presentation by Mike Davies, Chairman of the Fox Sports Video Group, that first got me thinking about the growing influence of video games on how we visually tell our stories. He is the man responsible for Fox’s coverage of major sporting events, including multiple Super Bowls, World Cups, baseball World Series, NBA championships, and NASCAR. He made the point that covering a sporting event is basically telling a story, much like a motion picture narrative, and the way audiences expect those stories to be visually presented is getting ever more sophisticated.
Davies credits video games with setting a new standard that live sports coverage is trying hard to emulate. He gave football as an example: long-lens cameras high atop the stadium, and perhaps a few down at field level, were once adequate to cover the event, or tell the story if you will. However, an audience accustomed to playing Madden NFL is no longer satisfied being on the sidelines; they want to be in the action.
Audiences are used to a cinematic aesthetic in terms of depth of field and framing, a high bar to meet when shooting a live event, but technology is helping broadcasters get there. Skycams, drones, large-sensor cameras isolating the subject with shallow depth of field, and wireless cameras and microphones on the athletes are just some of the tools being utilized to bring the audience into the action. But the intent to make this coverage look like movies, and to an even greater extent like video games, is what I found revelatory, especially given that day two of the Tech Retreat delved into Virtual Production, a technology literally spawned from the gaming industry.
The broad range of computer-aided filmmaking methods variously known as Extended Reality, Mixed Reality, or Virtual Production has benefitted from the accelerated graphics processing needed not only to create games, but to play them in real time. The mass-market demand created by gamers allowed for investment in research and resulted in much innovation in terms of processing speed and graphics quality. But it is not just the technology that has been borrowed from gaming; the whole visual language used to tell our stories is now heavily influenced by this medium. A representative from one of the three LED wall exhibitors shared with me that the Chinatown background environment used in their demo was lifted directly from a popular video game. It looked GREAT, extremely photorealistic, and it was hard to believe it was computer generated.
The current top box office release is Sony Pictures’ Uncharted, which is based on a series of action-adventure games published by Sony Interactive Entertainment for PlayStation consoles. The movie lifts sequences straight from the games, and Sony has already indicated that the planned Uncharted 2 movie will take on more video game elements in terms of action, adventure, and storytelling. In the run-up to the Academy Awards, when moviegoers would normally be eager to catch up with nominated movies such as The Power of the Dog, CODA, and King Richard, these critically acclaimed titles are experiencing lackluster box office performance. In fact, the Academy declined to include any of this year’s top ten domestic-grossing films among its nominees. More than any other factor, box office success is what will drive future productions. So, for better or worse, in a risk-averse industry, you can expect to see more tentpole, effects-laden offerings modeled on video games (and, of course, sequels based on comic books). This will only increase such visual influences on motion picture aesthetics. This is not an analysis that was part of any Tech Retreat presentation, but just something that occurred to me while exploring the various technologies.
Getting back to what was presented, let’s look at some of the other technologies and issues that were discussed. It seems one of the benefits of working with virtual production volumes is that the LED panels can be used not only for backgrounds, but also as a light source, offering the bonus of realistic location-based reflections. However, as with early LED luminaires offered up for motion picture lighting, there are gaps in the spectrum of the light currently emitted by the volumes’ video walls.
The light produced by the LED panels that make up the volumes may be acceptable for a background image, or when reflected off the helmet of the Mandalorian, but it can leave some skin tones lacking, and matching colors with scenes shot in the real world can be problematic. A presentation by Quasar Science Color Engineer Tim Kang and Netflix Color Scientist and USC Professor Paul Debevec outlined some of these issues. In another presentation, a team including Steven Poster, ASC, and our own Jim DeFilippis, among others, offered a possible solution: adding LED chips in additional colors to the volume’s light engine arrays, just as motion picture LED lighting manufacturers have learned to supplement RGB with extra emitters to fill in their spectrum.
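To build a little intuition for the fix they described, here is a minimal sketch of my own (not from either presentation), assuming invented Gaussian spectral power distributions (SPDs) for each LED channel: a non-negative least-squares fit shows how a five-primary mix fills in a flat, daylight-like target spectrum better than RGB alone.

```python
# Illustrative sketch only: model each LED channel's SPD as a Gaussian and
# solve a non-negative least-squares fit to a flat target spectrum. All peak
# wavelengths, widths, and the target are invented for the example.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.arange(400, 701, 5)  # visible range, 5 nm steps

def spd(peak_nm, width_nm=20):
    """Gaussian stand-in for one LED channel's spectral output."""
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

target = np.ones_like(wavelengths, dtype=float)  # idealized flat spectrum

for name, peaks in [("RGB", [450, 530, 630]),
                    ("RGB + 2 extra primaries", [450, 530, 630, 560, 590])]:
    A = np.column_stack([spd(p) for p in peaks])  # emitter SPDs as columns
    weights, residual = nnls(A, target)           # non-negative drive levels
    print(f"{name}: residual spectral error = {residual:.2f}")
```

The residual drops noticeably with the two extra primaries, which mirrors the reasoning behind adding emitters beyond RGB to both fixtures and, potentially, the walls themselves.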
Meanwhile, ARRI showed their SkyPanel and Orbiter units being driven by a game engine to provide color-correct light in sync with the color and intensity of the volume’s LED walls. Although ARRI doesn’t make displays or game engines, they have established a new business unit called the Global Solutions Group, bringing together their camera, lighting, and lensing expertise to consult on the building of mixed reality environments. A demo volume was shown in partnership with AOTO, Mo-Sys, and B&H.
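The general idea behind such engine-driven lighting can be sketched in a few lines. What follows is a hypothetical illustration of my own, not ARRI’s actual implementation: sample the rendered environment near the subject each frame and translate the average color into fixture drive levels. The frame data here is faked, and the transport to the fixture (DMX, sACN, etc.) is omitted.

```python
# Hypothetical sketch: map the average color of a region of the rendered
# environment to a lighting fixture's 8-bit RGB drive levels each frame.
import numpy as np

def fixture_levels_from_render(frame: np.ndarray, region: tuple) -> tuple:
    """frame: HxWx3 linear RGB in [0,1]; region: (y0, y1, x0, x1) to sample."""
    y0, y1, x0, x1 = region
    avg = frame[y0:y1, x0:x1].mean(axis=(0, 1))      # average linear RGB
    return tuple(int(round(c * 255)) for c in avg)   # 8-bit channel levels

# Fake "render": a warm, sunset-toned frame (values invented for the demo).
frame = np.full((1080, 1920, 3), (0.9, 0.5, 0.2))
print(fixture_levels_from_render(frame, (400, 700, 800, 1200)))  # (230, 128, 51)
```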
One of the presentations dealt with authorship and control of the final image, as well as interoperability between systems. Another dealt with the need for standardized terminology and introduced the VP Glossary, an industry-wide effort to establish a common vocabulary among professionals working in Virtual Production. Created by the Visual Effects Society (VES), the American Society of Cinematographers (ASC), and a host of industry experts, with support from Epic Games and Netflix, it is freely available online here: https://www.vpglossary.com/
A presentation that really caught my attention was titled The Future of Synthetic Beings in Digital Entertainment, by VFX Producer Tom Thudiyanplackal. It explored what is possible with synthetic beings, which can be controlled by mo-cap or deployed as background characters with preprogrammed wireframes. A literal army of extras can be easily created, and their clothing, uniforms, and coloring can be quickly swapped out. What would once have been a multimillion-dollar epic battle scene can now be easily and convincingly staged in front of an LED wall with a small fraction of the resources, then easily edited and reused for the next epic scene. It was also interesting, and a little scary, to note that research has shown highly rendered synthetic beings are more trusted than actual human actors. These so-called “digital humans” may soon be used to pitch products in commercials or even as reporters reading us the news headlines.
AI-Assisted Color Pipelines was the subject of a panel moderated by ICG Local 600’s Michael Chambliss that featured Lawrence Sher, ASC, DIT Dane Brehm, Colorist Mark Todd Osborne, and Dado Valentic of Color Intelligence. They made the point that AI can be used to do some of the more mundane color timing chores, such as matching disparate cameras, leaving the colorist more time to concentrate on the creative work of shaping the final images.
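As a toy illustration of the kind of mechanical matching the panel described (not their actual tooling), the sketch below fits a 3x3 matrix by least squares that maps one camera’s color-chart patches onto another’s; all patch values are invented.

```python
# Toy illustration of automated camera matching: given paired color-chart
# patches from camera A and a reference camera B, find a 3x3 matrix M such
# that cam_a @ M ~= cam_b. Real systems are far more sophisticated, but the
# idea is the same: software handles the match, the colorist handles the grade.
import numpy as np

rng = np.random.default_rng(0)
cam_b = rng.uniform(0.05, 0.95, size=(24, 3))   # 24 fake reference patches
true_M = np.array([[1.10, 0.05, 0.00],
                   [0.02, 0.90, 0.08],
                   [0.00, 0.04, 1.05]])
cam_a = cam_b @ np.linalg.inv(true_M).T         # simulated mismatched camera

M, *_ = np.linalg.lstsq(cam_a, cam_b, rcond=None)  # least-squares fit
matched = cam_a @ M
print("max residual after matching:", np.abs(matched - cam_b).max())
```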
Case studies are always an important part of the Tech Retreat, and this year there were reports not only on a virtual production created by the Entertainment Technology Center at USC, but also on a new StEM (Standard Evaluation Material) created by the ASC, which incorporated virtual production techniques and just about every other new motion picture technology. Most of us will recognize the Italian Wedding scene that was produced for DCI (Digital Cinema Initiatives) back in 2004 (light-years back in “tech time”). It was created under the auspices of the ASC Technology Committee to provide standardized test material for evaluating the performance of digital projectors and other elements of the systems at the dawn of digital cinema.
The futuristic StEM2 has the goal of including technology that was hardly even dreamed of in 2004. Key contributors included Writer/Director Jay Holben; DP Christopher Probst, ASC; VFX Supervisor David Stump, ASC; 2nd Unit Director and DP Steven Shaw, ASC; and Producers and Post Supervisors Wendy Aylsworth and Joachim “JZ” Zell, all of whom, with the exception of Probst, were on the panel. The impressive, action-packed 17-minute film, called The Mission, was specifically designed to incorporate challenging cinematographic material and will soon be openly available to the entire entertainment industry as a common reference for evaluating our imaging chains.
Another case study, given by HPA President and Light Iron CEO Seth Hallen along with Light Iron’s Katie Fellion and Adobe’s Michael Cioni, looked at cloud post-production technology on the feature Biosphere. And the ACES (Academy Color Encoding System) team of Annie Chang, JZ Zell, and Alex Forsythe gave an update on their tireless work establishing the interoperability standard.
Perhaps it is because I spent some time several decades ago making a documentary on the Navajo reservation and have more recently been teaching underserved youth as part of the InnerCity Filmmakers program, but I really appreciated finding out about a project sponsored by Blackmagic Design and AWS. Native American high school and college students from Arizona reservations were given a little bit of training and some of the latest gear, then set loose to make their own film. They made a beautiful little drama, partially in English, that poignantly made the case for preserving their native language and culture. It was inspiring to see what underserved young filmmakers could come up with if given access to the proper tools.
Besides the formal presentations and demos, time was reserved for a trade-show-style environment known as the Innovation Zone. Manufacturers were assigned booths, and it was a great way to get some one-on-one time to really dig in and find out about their new products. Of course, there were also the Breakfast Roundtables, another opportunity to get face to face (unmasked while eating breakfast) with some of the top technology minds in the industry. I was able to get a highly detailed rundown of the extremely versatile Riedel Bolero wireless intercom system from Rick Seegull and am eager to try it on a future production.
Speaking of masking and COVID protocols, I have to give the HPA a lot of credit for creating a COVID-safe environment. They required all attendees to prove that they were fully vaccinated and had tested negative in the hours leading up to their arrival at the Tech Retreat. In addition, self-administered COVID test kits were handed out at check-in so attendees could test and report daily that they remained negative. Such protocols were a bit of an inconvenience, but well worth it for the peace of mind they provided, especially when gathering so many people from all over the world. The HPA proved that such an event can be staged safely, and it can serve as a model for future industry events until this pandemic has completely passed.
Onward to NAB, Cine Gear Expo, in-person DCS events, and looking forward to the HPA Tech Retreat 2023.
(Images of the Tech Retreat courtesy of the HPA and photographed by Rand Larson. Uncharted game and movie collage from ScreenRant.)