Everything Everywhere All at Once + Avatar — James Mathers Reports on The 2023 HPA Tech Retreat

Mar 2, 2023 | Essays


by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the February 2023 Digital Cinema Society eNewsletter)


In a DCS tradition maintained for many years (with the exception of 2021, which was held virtually due to pandemic restrictions), I have traveled each February to the California desert near Palm Springs to attend and report on the HPA Tech Retreat. It is a very high-level gathering, now in its 28th year, that brings together a core group of technology leaders to share challenges, solutions, and innovations in motion picture production and post. It gives me a forward-looking chance to recognize and analyze future trends that will affect our members and the industry at large. When I say in the title of this essay, “Everything Everywhere All at Once,” I’m not referring to the popular movie, but to the demands of modern-day production and post. Service providers are challenged to deliver more content in more formats, at higher quality levels, with faster turnarounds than ever before. Forget the adage, “You can have it fast, you can have it good, or you can have it cheap: pick two.” These days, they want it all: faster, cheaper, and better, and much of the discussion at the Tech Retreat dealt with these challenges and how to meet them.

On the other hand, when I refer to “Avatar,” I’m talking about James Cameron’s latest epic, Avatar: The Way of Water. There was an excellent full-day session digging into the behind-the-scenes story of the production and post of the movie. In addition, there were a lot of great presentations before and after the full-day Supersession on Avatar. The four-day retreat began with “TR-X,” a focused exploration of topics ranging from the integration of AI and Cloud technologies to the ever-increasing demands of live production. Chaired by Mobile TV Group’s Mark Chiolis and Crafty Apes’ Craig German, the lineup included speakers from leading technology providers, academic institutions, and industry organizations. A standout presentation for me was delivered by Dr. Khizer Khaderi of the Stanford University School of Medicine’s Vision Performance Center. He discussed various aspects of human perception, gleaned from his work studying gaming, that come into play when implementing AI and immersive experiences including AR, MxR, and VR. He explored how AI can potentially enhance workflow and provide a personalized experience for interfacing teams and machines.

A number of other presentations throughout the day dealt with the current talent deficit in media technology and what might be done to remedy it. Even as the tech industry labor market tightens, there will still be a need in media and entertainment for job seekers with the skill set required to master technologies such as AI and Virtual Production in order to take on roles in production and post. And it’s not just computer coders; there will also be a great need for those who can apply the tech to modern workflows. For example, Cinematographer Kathryn Brillhart, ASC outlined a plethora of new jobs that have recently been created in the burgeoning field of Virtual Production.

In order to help meet industry demand, the HPA has developed the Young Entertainment Professionals (YEP) program, designed to accelerate participants’ careers via robust networking, mentoring, and industry educational resources. YEP class members are paired with advanced career mentors, and today over 100 have graduated from the program. In fact, the Colorist for Avatar: The Way of Water, who was featured during the Supersession, went through the program several years ago.


I have written previously about my admiration for Avatar: The Way of Water. I believe it represents a monumental achievement in motion picture technology that has set a new benchmark in visual storytelling. Working in 3D and High Dynamic Range at 48 frames per second, these pioneering filmmakers developed whole new techniques, such as Underwater Performance Capture and Real-time Depth Mapping and Compositing, all in a CGI water environment. To see in great detail how they pulled it off was highly educational.

In order to present this content at the Tech Retreat, a massive ballroom at the conference center was transformed into a purpose-built theater, featuring two 4K, 3D, high-frame-rate RGB laser projectors as well as conventional Xenon projectors. This allowed the audience, numbering over 750, to really discern the differences between various techniques. For example, clips were played back-to-back in both 48 and 24 fps, and in 3D and flat, so you really got to see why and how Cameron and his team focused on delivering such an immersive experience.

HPA Supersession Chair Loren Nielsen, who in her day job is an industry technology analyst and VP of Content & Strategy at Xperi Corp, put together an amazing set of panels. The first included my old friend Russell Carpenter, ASC, the movie’s DP (via live remote), and, in the room, 3D Camera Systems Workflow Engineer Robin Charters; Simon Marsh, representing the camera manufacturer Sony; and the previously mentioned graduate of the HPA YEP program, Colorist Tashi Trieu. It was moderated by Joachim “JZ” Zell, an HPA Board Member and Head of HDR Content Workflow at Barco. They laid out the development of the camera systems and workflow.

It is hard to relay all the great details shared in a short article, but here are a few things that stuck out. The movie was actually shot and edited several times. The first pass was with the actors on a motion capture stage, to volumetrically record the physical and facial movements of their performances. Meanwhile, VFX artists were busy creating CGI volumes to place the characters into. Cameron would then go through these with a virtual camera and frame shots that would be edited together into a scene. Think of this as a motion storyboard of the highest order. All the while, the CGI team was busy finessing the environments, with a blueprint to start from that let them concentrate their efforts on the desired angles. Actors, and sometimes their doubles, were then photographed above and below the water in 3D HDR at 48fps, mostly with Sony Venice cameras.

The cameras were set up in the newly developed Rialto system at the request of James Cameron. This involves separating the sensor block from the rest of the camera head in order to create a lightweight rig that Cameron, who likes to serve as camera operator, was able to comfortably handhold. They were able to keep the camera to around 30 lbs, which was quite a feat considering that it was actually two camera heads mounted in a 3D rig with two zoom lenses, plus separate depth mapping cameras. The Fujinon MK series zooms were chosen in spite of the fact that they only cover the S35 portion of the Venice’s full frame sensor. This was not only due to their resolving power, but even more so for their relatively light weight and fast aperture, at only 2.2 lbs and T2.9. Shooting S35 also offered more depth of field, which is not a bad thing when shooting 3D, and Carpenter’s decision to capture at 3200 ISO also helped in this regard, although he said that was largely because he aesthetically preferred the look of the Venice at the higher ISO. Of course, they did a lot of testing, starting way back in 2013.
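The depth-of-field point lends itself to a quick sanity check. The sketch below uses the standard hyperfocal-distance depth-of-field approximation to compare a full-frame and an S35 setup at a roughly matched field of view; the focal lengths, circle-of-confusion values, T-stop, and subject distance are illustrative assumptions of mine, not figures from the production.

```python
# Illustrative depth-of-field comparison: Super 35 vs. full frame at a
# similar field of view. All numbers here are assumptions for the sake
# of the example, not values from the Avatar production.

def depth_of_field(focal_mm, f_number, distance_mm, coc_mm):
    """Approximate total depth of field (mm) via the hyperfocal-distance method."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    if distance_mm >= hyperfocal:
        return float("inf")  # far limit reaches infinity
    near = distance_mm * (hyperfocal - focal_mm) / (hyperfocal + distance_mm - 2 * focal_mm)
    far = distance_mm * (hyperfocal - focal_mm) / (hyperfocal - distance_mm)
    return far - near

subject = 3000.0  # subject at 3 m

# Full frame: 50 mm lens, circle of confusion ~0.030 mm
# Super 35: ~35 mm lens for a similar view, circle of confusion ~0.020 mm
dof_ff = depth_of_field(50.0, 2.9, subject, 0.030)
dof_s35 = depth_of_field(35.0, 2.9, subject, 0.020)

print(f"full frame: {dof_ff / 1000:.2f} m, S35: {dof_s35 / 1000:.2f} m")
```

Run with these assumed numbers, the S35 configuration comes out with noticeably more total depth of field at the same T-stop, which is the trade-off the filmmakers were exploiting.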

The team worked on and off for several years, with principal photography starting in 2017 at L.A.’s Manhattan Beach Studios then moving to Wellington, New Zealand and ending in late 2020. Sets included huge water tanks, one 32 feet deep holding 90,000 gallons, and another with mechanical wave and current generation. They completed most of the principal photography for two features, driven to get material in the can before one of their actors, teenager Jack Champion, who was playing Spider, could grow any older on camera.

Although Russell told me he doesn’t feel this way, I was a bit disappointed that his work did not get more recognition during this year’s awards season. There was a lot of competition this year, but I felt his work at least deserved an Oscar or ASC Best Cinematography nomination. I don’t think other filmmakers appreciate the level of skill and talent necessary to integrate live action into the CGI environment in such a seamless and immersive way.

He briefly shared a couple of his tricks, such as using moving rock ’n’ roll–type lights over the sets to simulate dappled light coming through dense foliage or into the water. Another trick he used to get realistic reflections on the surface of the water surrounding the actors was to place LED walls at the correct angle behind the set to play back volumetric data such as flames, avoiding the nightmare of trying to composite the actors as the water lapped around them.

Another crew member whose outsized contribution was largely overlooked was 3D Camera Systems Workflow Engineer Robin Charters, whose previous credits include similar roles on X-Men: Apocalypse, Life of Pi, and Alita: Battle Angel. The son of renowned ASC Cinematographer Rodney Charters, Robin deserves a lot of credit for the development and management of the extremely complex camera system and production workflow. I don’t believe there is an Academy Award category for his position, but if there were, he should get an Oscar statue. He is truly a DIT Rock Star.

Loren Nielsen and Leon Silverman (a past HPA President) filled out the afternoon with panels detailing the studio’s perspective and the challenges of delivering such a complex project to the screen, with over 1,000 unique delivery versions. These included combinations of 2D, 3D, HDR, 4K, varying light levels, aspect ratios, a high frame rate of 48 frames per second, a range of audio formats, 51 languages supported with subtitles, and 28 languages supported by dubbing. It was an unprecedented and herculean effort carried out by an army of industry service providers. To wrap up the Tech Retreat’s coverage of Avatar on the following day, The Hollywood Reporter’s Carolyn Giardina interviewed Producer Jon Landau, appearing from New Zealand on the large theater screens via a web connection. As James Cameron’s longtime collaborator, he explained how he has worked tirelessly to bring Cameron’s unique creative and technical vision to the screen.

However, the four day HPA Tech Retreat was far from over.

I can’t cover it all in detail here, but some of the other standout presentations included an inspiring keynote address delivered by Jeff Rosica, CEO of Avid, and Mark Schubin’s famous Technology Year in Review with interesting highlights and humorous tidbits from the past year. Other panels delved into the industry’s effort to go green, and the complex, international, and collaborative VFX post pipeline for the series The Rings of Power.

A series of presentations delved into the “MovieLabs 2030 Vision,” a follow-on to a 2019 whitepaper called “The Evolution of Media Creation,” which laid out a bold 10-year vision for the adoption of new technologies to aid in content production, post, and VFX. It is available as a free download (https://movielabs.com/production-specs/2030-vision-papers/) and lays out 10 Principles for a more efficient media pipeline using cloud infrastructure, zero-trust security, and software-defined workflows. These principles act not just as a destination, but also as a roadmap for how to get there, with a primary focus on empowering the creatives.



A presentation that I found particularly interesting as a Cinematographer was delivered by ARRI image scientists Dr. Tamara Seybold and Carola Mayr. They have developed a system called ARRI Textures to give DPs more creative control of their images. These are looks that go far beyond color and can be set for the Alexa 35 during testing in preproduction to bake in certain parameters of grain, contrast, and fine detail for various shooting situations. For example, a Cinematographer might choose lower detail and contrast on an aging actress, but high detail and contrast on a nature scene, or anywhere in between. These are in addition to ARRI’s Reveal Color Science, a set of LUTs also designed for the Alexa 35.

Also of great personal interest to me as a DP was ZEISS’ Snehal Patel’s presentation on how to successfully integrate lens metadata into the visual effects production pipeline. New tools, techniques, and R&D are opening doors that allow creatives to emulate better backgrounds, foregrounds, AR objects, and overlays. The results are more accurate, more realistic, and closer to what audiences expect. He explained that the tools are there; we just all need to learn how to take advantage of them.

And if all those presentations were not enough, the HPA Tech Retreat featured an expo-type area called the “Innovation Zone,” where dozens of companies sponsored booths for one-on-one demos of their products. I got an update on several technologies I have previously covered, including Riedel’s extremely versatile Bolero wireless intercom system, which can support up to 250 beltpacks, each having six discrete channels, which is great for large crews. Besides serving as a camera intercom, Bolero can also function as a walkie-talkie and even supports Bluetooth to interface with your phone or allow communication with remote collaborators via the web.

I also caught up with my friends Steven Poster, ASC, Jim DeFilippis, and Professor Corey Carbonara, PhD, who are busy developing the 6P Color System technology to enable a workflow from camera to screen which preserves ALL visible colors, not just the colors within a standard color space such as P3 or Rec 2020. It is a complex set of technologies too broad to detail here, but more information can be found on their website: https://6pcolor.com

On top of that, I spent quite a bit of time visiting with Adobe, Frame.io, and Teradek to get up to speed on their latest Camera to Cloud technology, which we’ll be using to capture and post our NAB Show 2023 coverage. Teradek’s new Prism Mobile will help us take advantage of bonded cellular signals to reach the cloud and our editor back in Los Angeles, who will access our footage almost instantly via Frame.io and then cut on Adobe Premiere. We’re always eager to try the technologies we report on; in addition to attending amazing industry educational events like the HPA Tech Retreat, it is another way to keep ourselves and our members current on motion picture technology.

