by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the May 2018 Digital Cinema Society eNewsletter)
As the droll comedian Steven Wright says, “You can’t have it all…where would you put it?” Yet, we Cinematographers generally do want it all: higher resolutions, higher frame rates, higher dynamic range, and more color space. But the question of storage, (or where to put it?), is only one of myriad considerations. As evolving technology allows us to meet some of these demands, there is always a cost, and at some point we have to ask ourselves: when is enough too much? I would like to consider resolution and ask, at what point does more, and more, and more, stop serving the purpose of improving image quality? And what is currently a good resolution plateau for acquisition and post?
For those of us who have been around awhile, the current discussions about resolution sound much like an echo of the past. The debate over electronic capture and distribution formats has raged for decades. Before they knew there was anything better, consumers were blissfully enjoying Standard Def TV. Home CRT screens didn’t get much bigger than 32”, and no one complained much about picture quality. As flat screen TV sets started to grow in size and drop in price, the need for more resolution became apparent, and once consumers got a taste of HD, they quickly came to expect it.
The Consumer Electronics industry was ecstatic, but not everyone was happy about this evolution. Broadcasters were dragged kicking and screaming into providing HD, forced to rebuild their entire infrastructures. Not only did they need to upgrade to deliver HD, but in a relatively short period of time they were also asked to transform from analog to digital. Another inconvenienced group was the content owners with vast libraries of Standard Def, who were forced to make costly conversions of their assets or consign them to the dust bin. This massive industrywide transformation is actually still in process.
Meanwhile, acquisition needed to be adjusted to feed the updated distribution channels. Although celluloid capture remained in a fairly steady state, the tools for electronic acquisition were adapting to keep pace. Around the turn of the century, (sounds like so long ago when you say it that way), we started shooting HD and by around 2005 it was firmly established. We also transitioned around the same time from recording on video tape to file based capture, and before any of these transformations were complete, 4K came knocking on the door in the form of the first RED Digital Cinema camera.
Around this same time, theatrical exhibition was also undergoing a massive change as it segued from celluloid to digital projection and distribution. While the exhibition transformation is now all but complete, a slower change has taken place regarding film as an acquisition format. While it is safe to say that the vast majority of theatrical features and virtually all of television is now captured digitally, there is still a small, but loyal group of filmmakers who are more than reluctant to give up on film, and although I don’t get a chance to shoot much film these days, I count myself as one of them.
Although 4K distribution channels have been implemented for theatrical and web delivery, many high end productions still finish in 2K. Capture is another story; as of this writing, I would say that 4K is the de facto standard for professional acquisition. Producers who shot themselves in the foot by capturing in Standard Def when HD was on the horizon don’t want to make the same mistake twice. And Netflix, which has become a dominant force in the Industry, mandates, with only a few exceptions, that material carried on its platform be captured in 4K.
The once onerous technical barriers to 4K acquisition, such as the cost of storage, long render times, and monitoring challenges, have been mostly overcome, and 4K is now nearly ubiquitous; even phones shoot 4K. Lest we settle in and get too comfortable, we are starting to hear the rumble of 8K coming down the pike. 8K is no longer a science project; RED, along with Panavision and soon many others, will be offering somewhat practical 8K capture solutions.
There are many benefits to acquiring at a higher resolution than the one you are finishing for. Stabilization and reframing options become available, and downsampling only helps to improve picture quality. A host of new large-sensor cameras, and the lenses to cover them, also offer new visual choices. The Cinematographer can now achieve extremely shallow depth of field, as well as subtle differences in rendering, by taking advantage of different degrees of magnification while maintaining the same relative angle of view compared to S35.
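For readers who like to see the arithmetic behind that large-sensor trade, it can be sketched in a few lines. This is my own illustration, not any camera maker’s specification: the sensor widths below are typical published figures (actual cameras vary), and the function names are hypothetical.

```python
import math

# Assumed horizontal sensor widths in mm (typical values; real cameras vary)
S35_WIDTH = 24.9   # Super 35
FF_WIDTH = 36.0    # full-frame / large format

def equivalent_focal_length(focal_s35_mm):
    """Focal length on a full-frame sensor that gives the same
    horizontal angle of view as focal_s35_mm does on Super 35."""
    return focal_s35_mm * (FF_WIDTH / S35_WIDTH)

def horizontal_aov(focal_mm, sensor_width_mm):
    """Horizontal angle of view in degrees for a given focal length
    and sensor width."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_mm)))

# A 35mm lens on S35 frames about like a 50mm on full frame...
ff_focal = equivalent_focal_length(35)   # roughly 50.6mm
# ...same angle of view, but the longer focal length at the same
# T-stop yields shallower depth of field -- the creative choice
# described above.
```

Run with a 35mm lens on S35, this suggests roughly a 50mm lens matches its horizontal angle of view on a full-frame sensor, while the longer focal length thins out the depth of field at the same aperture.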
Some predict that these advances in resolution will only continue, and at an ever-increasing pace owing to Moore’s Law and the resultant exponential progress being made in computing and electronics. However, while it might soon be possible to start shooting in 16K, 24K, 32K, and beyond, I would argue that we are quickly reaching the point of diminishing returns. Just because we can, doesn’t mean we should.
The limiting factor is no longer bandwidth, storage, or render times, but human visual acuity and the size of the display. Research tells us that a viewer would have to sit within about 3 feet of a 55-inch 4K TV to notice any real improvement over 1080p. We just can’t see much of a difference at normal viewing distances, whether in the cinema or in the home, (at least I can’t). Even at venues such as the Consumer Electronics Show, where the latest and greatest models are demoed, there is only a subtle improvement to be perceived between a good quality HD display and a 4K one. And I simply cannot see a difference between 4K and 8K on a large screen display unless I walk up to within inches of the screen, hardly a comfortable viewing distance.
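The back-of-the-envelope version of that viewing-distance figure can be worked out from the common one-arcminute rule of thumb for visual acuity. The sketch below is my own illustration, assuming a 16:9 panel and that detail finer than one arcminute per pixel goes unseen:

```python
import math

def resolvable_distance_inches(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Farthest viewing distance (inches) at which one pixel of the
    display still subtends at least one arcminute -- beyond this,
    extra pixels are invisible under the acuity assumption."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
    pixel_pitch = width_in / horizontal_px          # size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

# A 55-inch panel, 3840 pixels across (4K UHD)
d = resolvable_distance_inches(55, 3840)   # roughly 43 inches
```

For a 55-inch, 3840-pixel-wide panel this lands at roughly 43 inches, about three and a half feet, which squares with the research cited above; doubling the pixel count to 8K halves that distance again.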
However, no one wants to get stuck with obsolete technology, and as resolutions keep getting pushed higher, everyone tries to future-proof their investment. Consumers purchasing big-ticket items like TVs want a long useful life, so they will tend toward larger and higher resolution displays. Meanwhile, Producers also want to future-proof their investments, which at this point means mastering in 4K. When 8K TVs hit the market, will the cycle begin anew?
One thing I have learned writing about motion picture technology for DCS over the last 15 years is that it is hard to predict future advances. However, I feel we’ve reached a point where we can now capture at 4K, (or sometimes a little above as necessary for special applications such as VFX, IMAX, etc.), and finish in 4K. I think, (or at least hope), that we are nearing a plateau where we can take a short breather from the relentless charge to increase resolution and work instead on improving other elements of visual quality. HDR, for example, seems to me a much more significant means of image enhancement than pushing for higher and higher resolutions. Of course, these technologies are not mutually exclusive, but I’m suggesting we concentrate our efforts toward making better pixels as opposed to just more pixels.
Like a kid in a candy store, as a Cinematographer, I still want it all, but let’s not let the tail wag the dog. Let’s instead let artistic expression drive the tools we select to create our images rather than constantly being forced to deal with a host of technical challenges that reap little if any benefit, only out of fear of obsolescence. Enough is enough.