Excuse Me, I Think Your Camera is Ringing — iPhones for Movie Making Part II
by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the October 2020 Digital Cinema Society eNewsletter)
The iPhone is so prevalent, especially in the entertainment industry, that it seems like it has always been a part of our lives. But let’s start out with a little historical perspective. In fact, the Digital Cinema Society, now going into our 18th year, predates the iPhone, and this is not the first time I have devoted a monthly DCS column to its use in motion pictures. It was back in 2014 that Ridley Scott (who created Apple’s famous Orwellian 1984 Super Bowl commercial) was commissioned to come up with another spot for that year’s game featuring only footage shot on the iPhone. Meanwhile, Sean Baker (The Florida Project) was shooting the Sundance contender Tangerine exclusively on an iPhone 5s, and Korean filmmaker Park Chan-wook (The Handmaiden, Snowpiercer) was experimenting with the format on the fantasy-horror short Night Fishing.
So, shooting movies on an iPhone is nothing new, but with the release of the iPhone 12 and all its new features that are particularly useful to filmmakers, I felt I had to reprise and update my exploration of the iPhone for movie making.
Like all consumer electronics technology, our personal “communication devices” (it’s hard to refer to them as mobile phones when they now serve so many functions) are advancing at an ever-accelerating pace. In fact, every item advertised in this 1991 Radio Shack ad has existed for some time, in even higher quality, inside a single smartphone. Of course, the innovation continues with the latest iteration of the iPhone. Having a 4K camcorder with decent sensitivity and dynamic range is just the start. The iPhone is also a wirelessly internet-connected computer that easily slips into your pocket, and now it is going much further by taking advantage of the abilities created by geolocation, accelerometers, and Lidar scanners.
Let’s start with a look at the iPhone’s latest camera (or should I say cameras?). The new lineup includes the iPhone 12, iPhone 12 Mini, iPhone 12 Pro, and iPhone 12 Pro Max. With so many models, it can be a little confusing what differentiates them from each other, but the included camera capability is one important factor. In general, the iPhone 12 and 12 Mini are the two most affordable phones in the lineup, and each has the usual front-facing “selfie-cam” in addition to dual rear cameras. The higher-end Pro models, however, feature an additional third telephoto camera and more storage, up to 512GB, which will come in handy for motion picture use.
The Pro models also feature Dolby Vision HDR and a new Apple ProRAW recording format. Oscar-winning cinematographer Emmanuel Lubezki, ASC made a short movie with the iPhone 12 Pro which was screened as part of Apple’s launch event. You can talk about stats all you want, but to me there is no better test of a capture device’s capabilities than to put it in the hands of a master artist like “Chivo,” as he is affectionately known.
Lubezki is an eight-time Oscar nominee for Best Cinematography and a three-time winner (Gravity, Birdman, and The Revenant). He won his three Oscars in three consecutive years, a feat no one else in the history of the Academy Awards has achieved. But credits alone can’t tell you the quality of the tools the artist is using; you have to see the results for yourself, so links to his beautiful Apple demo and a behind-the-scenes piece follow this essay.
Another feature that really sets the Pro models apart is the Lidar scanning capability for modeling and object detection. Lidar technology, whose development was greatly accelerated by its use in self-driving cars, opens up a whole set of new uses in the areas of VR, AR, and cinematic visual effects. Lidar, an acronym for “Laser Imaging, Detection, And Ranging,” is, according to Wikipedia, a method for measuring distances by illuminating the target with laser light and measuring the reflection with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target. This information, combined with the iPhone’s photographic sensor and camera-to-subject distance data, allows for the creation of a depth map, which is key to manipulating the image in post. For example, it makes it possible to choose the point of focus in an image after the fact, and it opens the door to many more abilities yet to be realized.
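As a back-of-the-envelope illustration (this is a minimal sketch of the time-of-flight idea, not Apple’s actual pipeline, and the scene values are made up), here is how laser return times become a depth map, and how a depth map lets you select a focal plane in post:

```python
C = 299_792_458.0  # speed of light, meters per second

def tof_to_depth(return_time_s):
    """Convert a laser pulse's round-trip return time to distance.
    Lidar depth is time-of-flight: distance = c * t / 2."""
    return C * return_time_s / 2.0

def focus_mask(depth_map, focus_dist_m, tolerance_m=0.25):
    """Flag pixels within `tolerance_m` of a chosen focal plane -- the kind
    of depth selection a post tool could use to pick focus after the fact."""
    return [[abs(d - focus_dist_m) <= tolerance_m for d in row]
            for row in depth_map]

# Toy 2x2 "frame": return times for subjects roughly 1.5 m and 3.0 m away
times = [[1.0e-8, 1.0e-8],
         [2.0e-8, 2.0e-8]]
depths = [[tof_to_depth(t) for t in row] for row in times]
mask = focus_mask(depths, 1.5)   # selects only the near subject
```

In a real device the depth map is registered pixel-for-pixel against the photographic image, which is what makes post-capture refocusing and background separation possible.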
When I’m struggling to understand the more scientific aspects of cinema technology, especially relating to visual effects, I turn to my friend David Stump, ASC. He is not only a Cinematographer and Visual Effects Supervisor, but also an all-around fountain of cinematic knowledge with a long list of notable productions, an Emmy, and an Academy Award for Scientific and Technical Achievement to his credit. David, who worked extensively with Lidar when testing for Lytro, tells me that the iPhone in itself is not currently a tool he could take into the field and use to help create visual effects. However, the fact that this technology is now out of the lab and in the hands of users as a consumer electronics device will help educate a new generation of filmmakers, who will find further creative ways to harness these abilities. Light field technology was behind the now-defunct Lytro Cinema system, whose assets were sold to Google, so it can’t be long before we see similar capabilities incorporated into an Android phone.
Meanwhile, Samsung Electronics has released Untact, an 8K movie shot mainly on its Galaxy S20 and Note 20 smartphones, in order to promote 8K technology and its ecosystem to consumers. The project can be viewed in full resolution only in Samsung’s two 8K cinemas in Seoul, and at lower resolution on Samsung’s YouTube channel. Although I’m concentrating on the iPhone in this essay, other mobile platforms are also developing at an incredible pace.
All the new iPhone models are 5G enabled, and the potential benefits of 5G are significant, although aspirational at this point, since the infrastructure to carry the signal has yet to be widely deployed. In time, it will open up a whole range of cloud-enabled capabilities. According to our resident workflow guru, Michael Cioni, Senior VP of Innovation at Frame.io, “One of the major opportunities 5G brings to our community is millimeter wave, which refers to higher frequencies than we’ve been able to use in consumer telecommunications before (24 and 40 GHz). At these frequencies, speeds can now explode from megabits per second to gigabits per second, theoretically hitting 20 gigabits, which would be hundreds of times the bandwidth we have today. These are short-range waves, so they would be the types of installations on movie studio lots, production offices, shooting stages, production companies, and generally major buildings in major cities. While this won’t cover all the needs for production, a large percentage of production takes place in or around studio lots, and mmW will be a huge asset.”
Cioni adds that “5G puts the emphasis on the need for wireless technology to catch up to filmmaking needs at filmmaking quality. But the transition will be something like cutting film to Avid or going from tape to files; it will be a process in which people slowly open up to relying more and more on 5G. Today, Frame.io is relying on 4G LTE for our Frame.io CLOUD solution that instantly transmits and shares proxy clips from the camera to anyone in your production. But the quality and access will only go up with 5G. Ultimately, as the bandwidth increases and the network reach expands, we will switch from proxies to original camera files. This means we will instantly shoot to the cloud and therefore an archive and backup will exist without ever having to download anything or ship from the set. This is still 5-7 years away, but the building blocks are now all coming into focus.”
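To put those bandwidth numbers in perspective, a quick back-of-the-envelope calculation (the file size, link speeds, and overhead factor below are all hypothetical, chosen only for illustration) shows why moving original camera files over the network becomes plausible as we go from LTE-class speeds to millimeter wave:

```python
def transfer_time_s(file_size_gb, link_speed_gbps, efficiency=0.7):
    """Rough time in seconds to move a file over a network link.
    `efficiency` is an illustrative guess at protocol overhead, not a spec."""
    bits_gb = file_size_gb * 8                # gigabytes -> gigabits
    return bits_gb / (link_speed_gbps * efficiency)

# Hypothetical 100 GB day of camera originals:
lte_time = transfer_time_s(100, 0.05)   # ~50 Mb/s LTE: roughly 6.3 hours
mmw_time = transfer_time_s(100, 10.0)   # 10 Gb/s mmWave: roughly 2 minutes
```

The gap between those two results is the difference between “ship drives from set” and “the archive exists before lunch,” which is exactly the transition Cioni is describing.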
The iPhone has been capable of basic video editing for some time via iMovie, but new capabilities are constantly being added, and other professional NLEs have been adapted for use on mobile devices. Adobe has brought the power of its Creative Cloud to the iPhone with Adobe Premiere Rush. Several updates were announced at the recent Adobe MAX, including all-new graphics and an audio browsing experience, complete with hundreds of royalty-free soundtracks, sound effects, loops, transitions, animated titles, and more. Three sophisticated new transitions, Push, Slide, and Wipe, join Pan and Zoom and Auto Reframe, which are now available on mobile platforms. Many of these features are enabled by AI; Auto Reframe, for example, uses Adobe Sensei to intelligently identify the key point of interest in the frame and then track it throughout the reframed video, keeping the important parts of your shot in frame when switching between aspect ratios.
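The core geometry behind that kind of auto-reframing is simple once the point of interest has been tracked: a crop window of the target aspect ratio follows the subject, clamped to the frame edges. The sketch below is a minimal illustration of that idea, not Adobe’s actual algorithm:

```python
def reframe_crop(poi_x, src_w, src_h, target_aspect):
    """Horizontal crop window (x offset, width) for one frame, centered on a
    tracked point of interest `poi_x` and clamped so it stays inside the
    source frame. Full-height crop; target_aspect is width/height."""
    crop_w = min(src_w, int(src_h * target_aspect))
    x = int(poi_x - crop_w / 2)           # center the window on the subject
    x = max(0, min(x, src_w - crop_w))    # clamp to the frame edges
    return x, crop_w

# A 16:9 1920x1080 source reframed to 9:16 vertical, subject at x = 1500
reframe_crop(1500, 1920, 1080, 9 / 16)   # -> (1196, 607)
```

Running this per frame along the subject’s tracked path produces the moving crop that keeps the action centered in the vertical version of the clip.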
Let’s also not forget some of the great cinematography apps, which are constantly being updated and improved (sometimes to the chagrin of the app developers, who are forced to rewrite their code whenever Apple does a major iOS update). There are cine-calculators like David Eubanks’ pCAM, the Helios Sun Position Calculator, and free storage calculators like AJA DataCalc. DCS Advisory Board member Adam Wilt’s Cine Meter and FieldMonitor apps have recently seen major updates.
Since the last time I wrote about Adam’s apps, Cine Meter II has added a spot meter function and the ability to show green/magenta tint as a Wratten CC value, a plusgreen/minusgreen value, or as ∆uv. FieldMonitor is now compatible with Lumix S-series (full-frame) cameras and Canon EOS mirrorless cameras as well as DSLRs. Focus assist options now include red and cyan digital peaking, and you can show the waveform monitor and other ’scopes with or without the effect of monitoring LUTs. Then there are apps that offer more professional, cinema-style control over iPhone capture, such as FiLMiC Pro and Mavis, for when you are ready to go beyond auto-everything.
Steven Soderbergh’s last two features, Unsane and High Flying Bird, were both shot entirely on the iPhone. Unsane, starring Claire Foy of The Crown, was shot mainly in one location, while High Flying Bird, starring André Holland, features various locations throughout the greater New York area. Soderbergh has been quoted as saying, “I think this is the future. Anybody going to see this movie who has no idea of the backstory to the production will have no idea this was shot on the phone. That’s not part of the conceit.”
Emmanuel Lubezki also had kind words: “When I started shooting movies, you had to rent a very expensive camera, buy film stock, pay for developing, special equipment for editing. Now, you can really go out with one of these devices and make a movie. I think the next great cinematographer or the next great film director is already making movies with one of these devices.”
Personally, I’m not ready to give up my cinema cameras and shoot exclusively with the iPhone, but it sure is handy to pull one out of my pocket on a location scout or rehearsal and get a pretty darn good image with very little effort. Although the included technologies such as 5G and Lidar, and image quality features like Dolby Vision HDR, are still maturing, the future looks bright for iPhone movie making. The iPhone has certainly come a long way since its debut in June 2007, not to mention that you can also use it to make phone calls, which, in itself, is pretty amazing technology.
A short film made by Emmanuel Lubezki on the iPhone 12 Pro:
Behind the scenes: