The Many Uses and Benefits of Cine Lens Metadata and Communication


by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the March 2019 Digital Cinema Society eNewsletter)

We Cinematographers can be a stubborn lot, sometimes set in our ways, doing things in the same manner we have for decades.  Trying to help us all get comfortable with new technology and ways of working is a big part of the mission of the Digital Cinema Society.  One field of technology that offers great potential, but has been slow to gain traction with professional Cinematographers, is the implementation of electronics built into modern lenses.

Digital cameras have essentially become computers with lenses attached.  Electronic communication with those lenses allows the camera to understand, control, and store the many parameters of the lens.  Lenses themselves have also become computerized, and when the two systems start working in tandem across various brands, many more benefits to the filmmaking process can accrue.

Many advancements in this area were spurred early on by the need to design inexpensive camcorders for the mass market.  Manufacturers had to figure out how to cheat the laws of photo-optical physics by getting their cameras to electronically compensate for distortion, artifacts, and general weaknesses of the available lenses.  For example, if the lens suffered from severe ramping, the camera could be programmed to adjust gain to mask the imperfection; if there was unacceptable barrel distortion at the wide end of the zoom, the camera could rectify the image; or if the lens was soft in some situations, digital sharpening could be added.  Today, with digital cinematography marching toward 8K resolution, even very high-end performance lenses can benefit from small but precise in-camera compensations.  For example, controlling lateral chromatic aberration is very important when capturing for high dynamic range and wide color gamut.
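To illustrate the kind of correction involved, here is a minimal sketch of lateral chromatic aberration compensation, assuming a simple model in which the red and blue channels are magnified slightly differently than green and are radially rescaled to match it.  The scale factors and the approach are illustrative assumptions, not any manufacturer's actual algorithm.

```python
# A minimal sketch of lateral chromatic aberration correction, assuming a
# simple linear magnification model: red and blue planes are radially
# rescaled about the optical center to match green. The scale factors are
# hypothetical; a real camera would derive them from lens metadata for the
# current focal length and focus setting.
import numpy as np
from scipy.ndimage import map_coordinates

def rescale_channel(channel: np.ndarray, scale: float) -> np.ndarray:
    """Magnify one color plane about the image center by `scale`."""
    h, w = channel.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    # Sample the source at coordinates pulled toward/away from the center.
    src_y = cy + (ys - cy) / scale
    src_x = cx + (xs - cx) / scale
    return map_coordinates(channel, [src_y, src_x], order=1, mode="nearest")

def correct_lateral_ca(rgb: np.ndarray, r_scale: float, b_scale: float) -> np.ndarray:
    """Align red and blue planes to green given per-channel magnifications."""
    out = rgb.astype(np.float64)
    out[..., 0] = rescale_channel(out[..., 0], 1.0 / r_scale)  # undo red magnification
    out[..., 2] = rescale_channel(out[..., 2], 1.0 / b_scale)  # undo blue magnification
    return out.clip(0.0, 1.0)

# Hypothetical example: red fringes 0.1% larger, blue 0.08% smaller than green.
frame = np.random.rand(1080, 1920, 3)
corrected = correct_lateral_ca(frame, r_scale=1.001, b_scale=0.9992)
```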

While working such issues out in the testbed of consumer camcorders was technologically impressive, it was still far easier to pair a single camera and lens combination than to achieve such abilities with interchangeable lenses and across different camera manufacturers.  The first steps involved getting cameras to talk to various lenses, and this has been going on for quite some time.

Canon introduced the EF mount (an acronym for “Electro-Focus”) in 1987 and has since produced over 100 million of these interchangeable lenses.  Other lens and camera manufacturers have also adopted this standard, including Zeiss, Sigma, Schneider Optics, Panasonic, and Blackmagic Design, to name a few.  (I should point out, however, that this implementation was by reverse engineering, since EF communication is technically Canon proprietary.)  Although EF lenses were originally designed for still photography, they have more recently also been employed for motion.

Auto-focus was originally pretty clunky, and the design of these lenses was not conducive to switching between physical control on the lens barrel and electronic control through the camera.  However, these functions have recently become more subtle and directable.  They even offer the ability to use a touch screen to identify the point of focus, and the addition of artificial intelligence now allows facial recognition to follow a subject as a moving target.  Complicated focus pulls can be programmed and even controlled wirelessly via a tablet with a touch screen interface (especially handy when the camera is mounted on a boom or in a drone).  However, even though the electronics in cameras and lenses have improved over time to better interface and take advantage of a host of new capabilities, professional Cinematographers have been loath to give up control to the camera.

The venerable PL mount (for “positive lock”), first introduced by ARRI in the early 1980s, has become a standard for motion picture cameras, first for film and then for digital cinema cameras and lenses.  However, these did not at first include any kind of electronic link for communication between lens and camera.  With the exception of Panavision, who created their own proprietary “P” mount, just about every manufacturer of high-end motion picture lenses produced them with a PL mount.

One of these companies was Cooke Optics, whose CEO, Les Zellan, has been working tirelessly for over 20 years to bring the benefits of lens metadata to lens mounts such as PL.  He helped to create /i Technology, a semi-open metadata protocol for gathering and sharing lens data.  (The licensing fee to other manufacturers is one British Pound per year.)

Meanwhile, ARRI, the creator of the PL mount, was working on their own system, known as LDS.  One or both of these systems is now included in most popular lenses.  (The follow-up to the PL, ARRI’s new “LPL” mount, designed to cover the new larger sensors, also includes LDS and /i compatibility.)

Information, including basic iris, focus, and focal length, can be digitally recorded for every frame and stored as metadata, accessible via a cable connector near the lens mount and/or contacts in the PL mount that can sync with cameras and other equipment.  Although the tools to collect vast quantities of valuable information have been there for quite some time, and the data can be routinely obtained during acquisition, it is still not widely done.  According to Zellan, “Metadata will be most useful when it has become trivial to collect and, therefore, becomes ubiquitous.”
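The /i and LDS protocols define their own wire formats, which I won't attempt to reproduce here, but to give a sense of what per-frame lens metadata amounts to, here is an illustrative sketch.  All field names are hypothetical.

```python
# An illustrative per-frame lens metadata record. This is not the /i or LDS
# wire format, just the kind of values a camera can attach to every frame.
from dataclasses import dataclass, asdict
import json

@dataclass
class LensFrameMetadata:
    frame: int               # frame number within the clip
    focal_length_mm: float   # current zoom position
    iris_t_stop: float       # T-stop the iris ring is set to
    focus_distance_m: float  # distance the focus ring is set to, in meters

def record_frame(md: LensFrameMetadata, log: list) -> None:
    """Append one frame's lens state to an in-memory log."""
    log.append(asdict(md))

log: list = []
record_frame(LensFrameMetadata(frame=1, focal_length_mm=32.0,
                               iris_t_stop=2.8, focus_distance_m=3.2), log)
print(json.dumps(log, indent=2))
```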

Virtual Cinematography

One great way lens data can be used in service of movie making is in the area of what is sometimes called Virtual Cinematography: the realtime integration of live action elements with computer generated imagery.  As an example, allow me to share details from one of my own productions.  A few years back, I was tasked with coming up with an economical way to shoot a pilot for a live action kids’ show using all virtual sets.  Previously, that would have meant either a completely static camera, locked off with no focus adjustment, or else very elaborate post VFX, which just weren’t in the cards for this production.

I remembered seeing a demo by longtime DCS member, Eliot Mack, of his Lightcraft Technology “Previzion” system.  It was designed for high-end productions to composite digital images live on the set so that filmmakers could visualize their scenes and preview how their live action interfaced with virtual backgrounds in advance of the fully finished VFX.  The system also collected lens data that could be used to help create those final comps in post.

In the process, it was creating pretty good looking HD images, which certainly seemed acceptable for the kind of show I was working on.  So I pitched the Lightcraft system to the Producers, and a test was arranged.  They loved the economical aspects of a system that would allow for very sophisticated visual effects on a budget and schedule that would not otherwise be possible.  However, what I loved about it, and the reason I recommended it, was that it also freed the camera.  By feeding the lens’s basic metadata, combined with geometric orientation and spatial location details, into a computer, the system was able to move the live action images in sync with the virtual background.

The geolocation data was collected by way of a separate sensor (a small black-and-white camera) positioned on top of the principal camera that read a series of markers mounted on the ceiling across the set.  All of this data was then fed to the Lightcraft computer to tell it, frame by frame, how the camera was positioned in space, including tilt, pan, and height, together with lens and camera metadata.
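To make the idea concrete, here is a minimal sketch of how a virtual camera might be driven from this kind of data: the zoom's reported focal length sets the virtual camera's field of view, and the tracker's pan, tilt, and position set its pose frame by frame.  The sensor width and data values below are hypothetical examples, not details of the actual Previzion system.

```python
# A minimal sketch of driving a virtual camera from lens and tracking data.
# The reported focal length determines field of view; the tracked pan/tilt
# and position orient the virtual camera. Sensor width here is a Super 35
# style assumption; the render engine interface is purely illustrative.
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Field of view implied by a focal length on a given sensor width."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def update_virtual_camera(focal_length_mm, pan_deg, tilt_deg, position_m):
    """Return the parameters a render engine would need for this frame."""
    return {
        "fov_deg": horizontal_fov_deg(focal_length_mm, sensor_width_mm=24.9),
        "pan_deg": pan_deg,
        "tilt_deg": tilt_deg,
        "position_m": position_m,  # (x, y, z) from the ceiling-marker tracker
    }

# One frame's worth of data: 35mm zoom position, slight pan, camera on a jib.
print(update_virtual_camera(35.0, pan_deg=12.5, tilt_deg=-3.0,
                            position_m=(1.2, 1.8, 0.4)))
```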

This arrangement alleviated the need to manually record the data for later compositing, employ a motion control system, or use tracking markers placed in the shot, which would have needed to be painstakingly removed in post.  More importantly to me, the ability of the virtual background to change in realtime, in sync with the camera, meant the camera didn’t have to be locked off.

I was able to zoom, focus, and do any manner of live, physical camera movement, even handheld, and the background followed in lockstep.  Since I like to keep the camera mobile and frequently use a jib-arm, this approach was very appealing and worked great for this application.  The pilot was shot, and I feel it probably would have been a huge hit if the distributor, Relativity, had not gone into bankruptcy just as it was ready to hit the market.

At any rate, this would not have been technologically possible without lens metadata.  Luckily, my Fujinon Cabrio 19-90mm zoom features both LDS and /i metadata compatibility.  It was able to transmit focal length, f-stop, and focus in realtime directly to the computer.  It also afforded a broad focal range, and since each lens change would entail significant re-calibration of the system, it was really good to have one lens that could achieve both wide shots and close-ups.

Lens metadata can also be extremely useful on higher end productions.  On the set, a “smart” lens can communicate with other camera equipment in order to automatically and instantly calibrate controls for that lens when you plug it in, thus saving preparation time.  Lens information can also be displayed as overlays on field production monitors from such manufacturers as SmallHD.  These graphical overlays are rendered with virtually zero delay on wireless systems such as the Teradek RT CTRL.1.  This allows a Focus Puller to keep their eyes locked on the same display as the camera’s output while also seeing the distance, iris, and focal length.  Some systems even show realtime depth of field calculations, so an AC knows just how far they can push the limits of focus.
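Those depth of field readouts come from the standard optical formulas, driven by exactly the values a smart lens reports: focal length, iris, and focus distance.  Here is a minimal sketch, treating the reported T-stop as the working aperture and assuming a common Super 35 circle of confusion; monitor makers may use different values.

```python
# A minimal sketch of a realtime depth of field readout using the standard
# hyperfocal-distance formulas. Treating the T-stop as the aperture is a
# common approximation (DoF strictly depends on the geometric f-number).
# The circle of confusion default is a typical Super 35 assumption.
def depth_of_field(focal_mm: float, t_stop: float, focus_m: float,
                   coc_mm: float = 0.025):
    """Return (near_limit_m, far_limit_m) of acceptable focus."""
    s = focus_m * 1000.0                                # focus distance in mm
    h = focal_mm ** 2 / (t_stop * coc_mm) + focal_mm    # hyperfocal distance, mm
    near = h * s / (h + (s - focal_mm))
    far = h * s / (h - (s - focal_mm)) if s < h else float("inf")
    return near / 1000.0, far / 1000.0                  # back to meters

# Example: 75mm at T2.0 focused at 3m -- a tight window for the focus puller.
near, far = depth_of_field(75.0, 2.0, 3.0)
print(f"in focus from {near:.2f} m to {far:.2f} m")
```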

Post VFX

However, post production is where lens metadata can provide even more crucial information for complex VFX work.  It feeds tracking software on shots that would otherwise have to be matched manually, detailing frame by frame how the footage was shot for later reconstruction work.

Carl Zeiss is another company leading the charge and has developed lens metadata to provide even more benefits.  In addition to the information provided by /i Technology, the ZEISS eXtended Data protocol also includes frame-by-frame information on lens vignetting and distortion.  This data can be extremely helpful for visual effects image continuity, where elements of distortion, which give a lens its character, can be removed, manipulated, and selectively reapplied in the final composited shot to better blend foreground and background.  These additions are integrated into Zeiss’s CP.3 XD line of primes and their new high-end Supreme Primes (13 high-speed lenses, from 15mm to 200mm, most with an aperture of T1.5, and all with full frame coverage).
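As a rough sketch of how such per-frame distortion data gets used in a VFX pipeline: the plate is undistorted, the CG is composited over it, and the lens’s distortion is then reapplied so the synthetic elements pick up its character.  The eXtended Data encoding itself is not reproduced here; this sketch assumes generic radial polynomial coefficients per frame.

```python
# A minimal sketch of undistort/redistort in a comp pipeline, assuming a
# generic radial model with per-frame coefficients (k1, k2). This is not
# the actual ZEISS eXtended Data format, just the general technique.
import numpy as np

def undistort_points(pts: np.ndarray, k1: float, k2: float) -> np.ndarray:
    """Remove radial distortion from normalized image points.

    Dividing by the factor evaluated at the distorted radius is a
    first-order inverse, adequate for small coefficients."""
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts / (1.0 + k1 * r2 + k2 * r2 ** 2)

def redistort_points(pts: np.ndarray, k1: float, k2: float) -> np.ndarray:
    """Re-apply the lens's distortion so CG elements match its character."""
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2 + k2 * r2 ** 2)

# Hypothetical coefficients for one frame of a shot.
pts = np.array([[0.4, 0.3], [-0.2, 0.5]])
straight = undistort_points(pts, k1=-0.05, k2=0.01)   # neutral geometry for comp
matched = redistort_points(straight, k1=-0.05, k2=0.01)  # lens character restored
```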

Scott E. Anderson, a Visual Effects Specialist, Producer, and Director with credits on such major releases as The Abyss, Terminator 2, and Babe, and an Academy Award® nominee for Hollow Man and Starship Troopers, explained the importance in a DCS interview conducted as part of our last Post Production Expo.  According to Anderson, a filmmaking team will choose lenses to create a certain look or feel for a project, and it is part of the job of the visual effects artist to match that look.

They need to discern what the characteristics of the lenses are, and how to make the synthetic visual effects elements match the live action of a composite in order to come together as one seamless piece.  Much of this is based on understanding the characteristics of a lens in terms of its distortion and shading qualities, how it handles out-of-focus areas, its color rendition, and any artifacts.

Since this process is so important in order to match the character of a specific lens, VFX artists are pretty much constantly measuring lenses.  This was traditionally done in prep by shooting innumerable charts with every lens to be used, at various focal length, iris, and focus settings.  However, with the Zeiss eXtended Data system, there is no need to take up that valuable time.  The information is chronicled and recorded as lens metadata on a frame-by-frame basis.

Whether it be motion pictures or television, no one is doing higher quality, or a higher volume, of visual effects than a show like HBO’s Game of Thrones.  As I was preparing this article, I was lucky enough to run into my old friend Stephen Beres at an industry function.  Stephen is the Senior VP of Studio & Production Services at HBO, and part of his job is to ensure the best use of technology to efficiently and economically produce high quality content.  He shared that they have started to employ lens metadata on effects-heavy shows like Game of Thrones, and it seems to be helping, so he is encouraging HBO filmmakers to take advantage of these benefits.

Netflix is another entity with an interest in improving the production and post process.  They are working to establish technical standards for content distributed on their platform in areas such as resolution and HDR in order to guarantee a superior level of quality for their subscribers.  Although they have not yet mandated the use of lens metadata on the shows they produce, they have been very proactive in helping to establish other industry standards, and it would not surprise me to see such a mandate for visual-effects-laden shows in the future.

So What’s The Sticking Point?

Manufacturers are cooperating to maintain open standards, and VFX Artists, Producers, and outlets like HBO and Netflix all want to see lens metadata fully implemented.  The tools to collect vast quantities of valuable information have been available for quite some time, and the data could be routinely obtained during acquisition, but it is currently happening on very few productions.  Meanwhile, with the trend toward larger sensor sizes and the resulting shallower depth of field, we Cinematographers need all the help we can get to keep our images in focus.  So what’s the holdup?  Why are these benefits of technology not being realized?

I believe it may be partially due to the fact that we Cinematographers can be a stubborn lot.  Admittedly, we are sometimes more resistant to change than we should be, but there is more to it.  In the case of focus tools, it is likely that not many Cinematographers have recently tried the latest focus assist systems.  There are so many new options now, from full Auto Focus for certain challenging shots to using only a Focus Guide, a hybrid approach where control of focus remains in the hands of the Cinematographer.

Keeping control, or the fear of losing it, is also a stumbling block toward a broader implementation of lens metadata.  Cinematographers have traditionally had fairly complete discretion to select what lenses we would use.  And in the digital age, when so much about an image can be manipulated in post, the choice of lenses is one of the last remaining decisions we can make to control the look of our images.  This is a choice we don’t want to easily defer to others.

When a wide variety of cine lenses all offer some level of focus assistance, and we can choose between lenses with unique characteristics that all deliver the benefits of metadata, our freedom of choice will be maintained.

Thankfully, camera and lens manufacturers have been fairly open and continue to work together toward more interoperable hardware and software standards.  More lenses will continue to be introduced with advanced capabilities, and a nice thing about smart lenses already on the market is that they can easily be kept current with firmware updates, so the tools and technology will only keep improving.  When, as Les Zellan says, the technology becomes “ubiquitous,” we will all enjoy the benefits of the hard work that has gone into developing these smart lens technologies.
