I’ve previously written a little about HDR, and have even (somewhat reluctantly) served on a couple of Industry panels, but that certainly does not make me an expert on the subject. The truth is that as a Cinematographer, you don’t really need to be. HDR is a display format rather than a capture format. To use an expression coined by my friend and fellow Cinematographer Bill Bennett, ASC, “we’ve been shooting HDR for years.” That’s because film, and more recently high-end Digital Cinema cameras, have long been able to capture the dynamic range that HDR requires. The only problem is that display technology has not previously been capable of showing it.
Now, such technology is on the near horizon and the Industry is trying to prepare for another transition. We can sometimes feel overwhelmed by the seemingly constant parade of such impending transitions. The Industry is still adjusting to the move from Celluloid to Digital and SD to HD and now we’re being prodded to go from HD to 4K and beyond, not to mention 3D, High Frame Rate and VR. Some of these “advancements” and technologies are more compelling than others, and it sometimes feels like it is just an exercise in selling new TVs, but I’m here to tell you that HDR is the real deal.
On the average-size home display at a typical viewing distance, it can be hard enough to distinguish HD from SD, harder still to discern 4K from HD, and pretty much impossible to pick out 8K from 4K, but turn off the HDR and you’ll have viewers asking what went wrong with the picture. In my opinion, HDR trumps High Resolution, High Frame Rate, Wide Color Gamut, and 3D as a technology that really improves the viewing experience. HDR is more than just brighter highlights and a greater range between dark and light; it adds depth to the image, making it more dimensional and immersive.
Just as there are huge benefits, there are also huge challenges, and some trepidation within the Industry about meeting them. One issue is preserving creative intent. Although consumers are not yet experiencing much HDR in the home, they soon will be, and the images we capture today will likely be viewed in HDR in the not too distant future. However, on-set HDR monitoring options are not always practical, so how do we evaluate what we’re capturing to know how it will eventually be displayed?
There are only a handful of true professional grade HDR displays from companies such as Sony, Canon, and Dolby. Although they produce exquisite images, they are quite pricey and delicate, seeming more suited to the DI suite than the set. An exception is a new offering from SmallHD. It comes in at a fraction of the cost of the other Pro HDR displays and their demo video shows it still operating after being slammed with a baseball bat, knocked to the ground, and driven over by a truck. While that may be a little extreme, it does bring up the point that field monitors need to be rugged.
You also need a proper viewing environment, which can be tough with the kind of narrative, location-based shooting I usually do, and with HDR a controlled viewing environment is even more critical. The best approach for me is to capture as if I were shooting film: relying on my meter and, after testing through to post and gaining familiarity with the camera, knowing where I want things to fall. Although my methodology on set may not change much as a Cinematographer, I do need to be involved in post in order to protect my creative choices. Maybe I want my tonal range compressed, with crushed blacks or clipped highlights. It could be for creative or practical reasons: perhaps I want to keep the cables on the floor buried in the shadows, or perhaps I want a window to blow out so I don’t see craft service or some other unwanted item in the background.
Sticking to an ACES pipeline and being able to supervise the DI is my best bet to ensure my images are eventually seen the way I want. However, Producers are increasingly reluctant to bring Cinematographers in for the DI, and it is not at all common for them to be compensated for their time. Once you’re on to your next paying gig, it can be hard to make time to step away and color time your last movie.
Many vintage titles are now being restored and given an HDR treatment, and more often than not, the original Filmmakers are no longer available to join in the creative finishing process. Let’s just hope these archivists and post professionals are sensitive to the original filmmakers’ intent and don’t overdo the new abilities (not everything needs to be cranked up to 1,000 nits just to show off that it is HDR).
Another issue is that there is currently no single standard for display or multi-platform distribution. There are actually two main standards competing to specify everything that HDR should be. One is known as HDR10, which was developed by the Ultra HD Blu-ray task force and subsequently adopted by the UHD Alliance, a broad consortium of stakeholders, including TV manufacturers and studios, that certifies premium 4K TVs. It is an open standard, as opposed to the other offering from Dolby, known as Dolby Vision (or its theatrical equivalent, Dolby Cinema).
The Dolby system is a proprietary format that requires all elements in the image chain to use Dolby-licensed technology. Dolby would collect fees from equipment manufacturers who want to be certified, from post and distribution facilities, from theatrical exhibitors, all the way down to the consumer’s TV and/or set-top box. These fees will be hidden from the consumer, who will eventually have no choice but to pay them.
This has been a very successful financial model for Dolby on the audio side for several decades. Hence they have a lot of cash to throw around for things like rebranding the iconic theater at the Hollywood and Highland complex from the Kodak to the Dolby Theatre. They also have the cash to bankroll a push within the Industry to adopt their HDR platform.
Both formats use the same basic technical methodology, embedding metadata that indicates to the Projector or TV how the images should be rendered on the screen. However, they are incompatible, and having multiple formats creates confusion and unnecessary duplication. For example, content providers such as Netflix and Amazon are backing both formats, and Producers, wanting to keep all distribution options open, must often create a deliverable for each. TV manufacturers such as LG and Vizio are also having to make their new TVs compatible with both formats, which obviously increases costs.
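To make that metadata a little more concrete: HDR10 carries static metadata for a whole program, describing the mastering display (the SMPTE ST 2086 color primaries, white point, and luminance range) plus the content’s peak and average light levels (MaxCLL/MaxFALL), while Dolby Vision adds proprietary dynamic, scene-by-scene metadata on top. As an illustrative sketch only, and not either format’s actual implementation, here is roughly what HDR10’s static values look like, formatted the way the open-source x265 encoder accepts them via its `--master-display` option (chromaticities in 0.00002 units, luminance in 0.0001 cd/m² units, per the HEVC SEI syntax):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative container for HDR10 static metadata:
    SMPTE ST 2086 mastering-display values plus MaxCLL/MaxFALL."""
    red: tuple            # CIE 1931 chromaticity (x, y) of the mastering display's red
    green: tuple
    blue: tuple
    white_point: tuple
    max_luminance: float  # mastering display peak, cd/m^2 (nits)
    min_luminance: float  # mastering display black level, cd/m^2
    max_cll: int          # brightest pixel anywhere in the program, nits
    max_fall: int         # highest single-frame average light level, nits

    def master_display_string(self) -> str:
        # Chromaticities are signaled in increments of 0.00002,
        # luminance in increments of 0.0001 cd/m^2 (HEVC SEI convention).
        c = lambda v: round(v / 0.00002)
        gx, gy = self.green
        bx, by = self.blue
        rx, ry = self.red
        wx, wy = self.white_point
        return (f"G({c(gx)},{c(gy)})B({c(bx)},{c(by)})R({c(rx)},{c(ry)})"
                f"WP({c(wx)},{c(wy)})"
                f"L({round(self.max_luminance * 10000)},"
                f"{round(self.min_luminance * 10000)})")

# A P3-D65 mastering display graded to a 1,000-nit peak:
meta = HDR10StaticMetadata(
    red=(0.680, 0.320), green=(0.265, 0.690), blue=(0.150, 0.060),
    white_point=(0.3127, 0.3290),
    max_luminance=1000.0, min_luminance=0.0001,
    max_cll=1000, max_fall=400,
)
print(meta.master_display_string())
# G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)
```

Because these values are fixed for the whole program, a single grade has to serve every scene, which is part of what Dolby’s per-shot dynamic metadata is meant to improve on.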
Standards are crucial to facilitating interoperability between the myriad of components from the camera all the way to the final display, and I’m not suggesting we shouldn’t eventually settle on a single one. However, imposing one too soon can hamper innovation. An open source system invites creative contributions that may improve a still developing technology. With their Industry clout and standards implementation experience, perhaps Dolby is the right team to get the job done, but let’s just all be aware of the process and the cost.
Once HDR gets sorted out, it will be a boon to both creatives and viewers, and it’s coming sooner rather than later. Dolby estimates that there will be over one hundred titles available to stream by the end of the year, with more on the way. In the meantime, if you need a new TV, you might consider a set that can handle both Dolby Vision and HDR10. And if you’re a Cinematographer, it is now more important than ever to be involved in the DI mastering. Get ready: whether you like it or not, content will soon be coming at you in glorious HDR.