SIGGRAPH and its MVP Exhibitor: NVIDIA
(Excerpted from the Digital Cinema Society eNewsletter, August 2015)
by James Mathers
Cinematographer and Founder of the Digital Cinema Society
I try never to miss the annual interdisciplinary educational experience of computer graphics technology known as SIGGRAPH (Special Interest Group on GRAPHics and Interactive Techniques). The 42nd conference and exhibition returned this year to Los Angeles, and I was lucky to be among the nearly 15,000 in attendance. It’s a place where scores of breakthrough innovations are announced and demoed, offering hands-on experiences with emerging technologies.
This year saw the debut of “VR Village,” dedicated to real-time immersive Virtual Reality/Augmented Reality content. The distinction between the two is that Virtual Reality, or “VR,” puts you entirely in a computer-generated world, most often via a head-mounted display. Augmented Reality, or “AR,” meanwhile, operates in a more open environment and is only partly immersive, so you can see through and around it. It superimposes computer-generated images over the real world. For example, you may experience it in the form of glasses that let most of the real world in, but then beam some images into your sightline so that they appear to be floating in space.
Full-Dome Cinema was also on display in the VR Village and featured live demonstrations in a 360-degree immersion dome. Wandering through these exhibits, attendees got to ponder future technology for telling stories, engaging audiences, and powering real-world applications in other fields such as health, education, design, and gaming.
One thing I really like about SIGGRAPH is that compared to other exhibitions, it seems a lot more about sharing and celebrating technology rather than selling it. Of course companies need to make potential customers aware of their products, but at SIGGRAPH you’ll only find the very soft-sell, and many of the presentations are put on by volunteers, user groups, and computer science departments from leading universities around the globe.
Exponential progress in computing power is driving an ever-expanding computer graphics industry encompassing everything from video games and movie production to product design, medical diagnosis, and scientific research. At the core of this progress is the GPU (Graphics Processing Unit), first developed in 1999 by NVIDIA, who we are proud to call a DCS supporter. So it’s not surprising that their technology seemed to be behind some of the most interesting exhibits at SIGGRAPH.
Products such as their “Iray” GPU-accelerated rendering engine were at the heart of demos such as “Real or Rendered?,” where attendees got to compare computer-generated images side by side with actual objects photographed with a live camera on high-resolution monitors; more often than not, they couldn’t tell the difference.
Another interesting demo featuring NVIDIA technology is known as the Ford Immersive Vehicle Environment (FIVE). Ford designers use this highly realistic virtual reality system to see and understand complex engineering issues from both common and completely unique perspectives. Ford is now able to evaluate a vehicle in real time, at full scale and in context, and even digitally sculpt models of their products before creating physical prototypes. This allows them to consider aesthetic design, fit and finish, manufacturability, and maintenance requirements in a virtual workbox.
The demo features a head-mounted display that puts a 4K interactive multidimensional image in full view while also tracking eye movement to drive the CGI. In addition, the viewer is handed virtual tools that mimic real ones, such as a flashlight and a pointer, used to highlight whatever areas of the view they are aimed at. The demo starts with you sitting on a chair, with a highly realistic view making you feel that you are inside a new Ford Mustang. You can turn around and look behind you, lean over to peer under the dash, and brighten up any dark corners with the “Flashlight.”
The demo subject is then invited to stand up, “get out of the car,” and walk around the perimeter with a highly realistic view of a car that is not physically there. What is really mind-blowing is when you get up close to the virtual car and then delve deeper, seeing through the surface layers into the guts of the vehicle, even into the engine itself. The link below is to a video that might give you a better idea of what’s going on, except that it seems to have been produced before the latest innovation and doesn’t show the Superman-like X-ray vision feature. https://vimeo.com/131942417
Other NVIDIA tools on display enable VR for cinema and gaming. A great example of what’s possible was demonstrated in their booth where viewers were able to try on the latest Crescent Bay prototype headset from Oculus and step inside “Thief in the Shadows.” This demo, created by Weta Digital and Epic Games, offered a fully immersive experience set in the “Lord of the Rings” universe. Viewers were able to take on the role of a hobbit prowling for treasure in a dragon’s lair. It was a very visceral experience, where you found yourself physically turning around and craning your neck to be able to take everything in.
As an Indie Filmmaker, another exhibit that caught my eye was a new, extremely affordable Motion Capture (Mo-Cap for short) system from a company called Reallusion. Mo-Cap is the process of recording the movement of objects or people and using that information to animate digital character models for computer animation. Their iClone range of products works with other popular image creation software from companies like Autodesk to import, animate, and export content to any game engine or 3D application. Characters created with this software are fully rigged and ready for both animation and lip sync.
What’s really interesting is that iClone can be combined with a Perception Neuron Mo-Cap suit, developed as a Kickstarter campaign by a Chinese company called Noitom. This allows for a very affordable Mo-Cap solution where you can see your motion performance reflected in real-time with your in-game characters, all at a price of only $2,000. (Their system is designed to work with a variety of hardware, but when I asked what was powering the demo, again it was an NVIDIA GPU). Reallusion aims to democratize Mo-Cap production the way companies like Blackmagic Design have expanded the user base of Digital Cinema cameras.
Speaking of Blackmagic, after wandering the floor of SIGGRAPH for many hours, a fish-out-of-water cinematographer at a computer exhibition, I finally felt at home when I got to their booth. Blackmagic is into so many different technologies, and I’m still not sure what they were doing there, but at any rate, I was glad to catch up on their latest products. I got to play around with their new URSA Mini, which really solves a lot of the issues I had when I tested the original URSA. Improvements include a compact and lightweight form factor, a new sensor, and their own fully featured lightweight EVF. It also has a 5-inch fold-out LCD viewfinder along with dual RAW and ProRes recording capability. The new sensor is Super 35, 4.6K with a global shutter, and is said to reach up to 15 stops of dynamic range. It should be shipping any day now, with prices ranging from US$2,995 for the URSA Mini 4K EF to US$5,495 for the URSA Mini 4.6K PL model. Of course, these also include a full version of the latest DaVinci Resolve software.
Lastly, I will report that I was curious to see a very robust Job Fair taking place as part of SIGGRAPH (now that’s something you would never see at Cine Gear Expo or NAB). I was a little jealous of the computer artists, designers, and technicians who were being actively recruited by seemingly every major animation studio, including the likes of Warner Brothers, DreamWorks, and Pixar. I’m probably a little too far down the road myself, but those coming up in the entertainment industry might want to pay attention to where the demand is coming from and adjust their skills to suit. Computer animation seems to be a real growth industry.