VFX Oscar contenders using more Machine Learning and Artificial Intelligence
Visual effects studios Industrial Light & Magic and Digital Domain spent years developing the effects for two of the Oscar-nominated movies for best VFX this year.
They are just two of a growing number of VFX studios and applications adopting AI tools accelerated by NVIDIA RTX GPUs. In fact, two nominees in the visual effects category, The Irishman and Avengers: Endgame, used AI to blur the line between human actors and digital characters.
The Irishman
Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes from different times in his life, from World War II to a nursing home early this century, and every decade in between.
But all three lead actors in the film, Robert De Niro, Al Pacino, and Joe Pesci, are in their 70s. A makeup department couldn't realistically transform the actors back to their 20s and 30s, and director Martin Scorsese was against using the typical motion-capture markers during filming; he didn't want anything getting in the way of the performances. The de-aging would have to be done after the shoot, at workstations in the studios, powered by the new NVIDIA Quadro RTX GPUs.
ILM developed a new three-camera rig to capture the actors' performances on set: the director's camera flanked by two infrared cameras recording 3D geometry and textures. The team also developed software it called ILM Facefinder, which used AI to sift through thousands of frames from the actors' earlier films, finding footage in which their faces were close to the age being depicted.
The tool located frames that matched the camera angle, framing, lighting, and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
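At its core, that kind of reference lookup is a similarity search: describe the shot being rendered with some feature vector, then find the archived frame whose features match best. Here is a minimal sketch of the idea using cosine similarity over toy feature vectors; the descriptors and the `best_reference` function are illustrative placeholders, not ILM's actual tool or features.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def best_reference(query, library):
    """Index of the archived frame whose features best match the query."""
    return max(range(len(library)), key=lambda i: cosine(query, library[i]))

# Toy library: each frame summarized by hypothetical
# (camera-angle, lighting, expression) descriptors.
library = [
    [1.0, 0.2, 0.1],   # frame 0
    [0.1, 0.9, 0.3],   # frame 1
    [0.2, 0.1, 1.0],   # frame 2
]
query = [0.15, 0.85, 0.25]  # descriptors for the shot being rendered
print(best_reference(query, library))  # → 1
```

In a real pipeline the feature extraction (head pose, lighting, expression) is the hard part; the nearest-neighbor lookup itself is conceptually this simple.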
In a recent post on the NVIDIA blog, Pablo Helman, VFX Supervisor on The Irishman at ILM, said, “AI and machine learning are becoming a part of everything we do in VFX. Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”
Avengers: Endgame
There are over 2,500 visual effects shots in Avengers: Endgame. VFX teams at ILM, Weta Digital and many other studios worked tirelessly to bring these effects to the screen. Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of Thanos.

The infamous villain was brought to life with help from a machine learning system called Masquerade, developed to take low-resolution scans of the actor's performance and facial movements. These scans of his expressions were transferred onto the high-resolution mesh of Thanos' face to help generate a realistic, emoting digital human.
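One common way to retarget a sparse capture onto a dense face rig is through a shared blendshape basis: fit shape weights to the low-res markers, then re-express those same weights on the high-res mesh. The sketch below shows that general technique with tiny made-up matrices; `B_low`, `B_high`, and `transfer` are hypothetical names for illustration, not Digital Domain's Masquerade system.

```python
import numpy as np

# Hypothetical blendshape basis: each column is one facial shape,
# expressed at the low-res marker positions (B_low) and at the
# high-res mesh vertices (B_high).
B_low = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.5, 0.5]])        # 3 markers, 2 shapes
B_high = np.array([[2.0, 0.0],
                   [0.0, 2.0],
                   [1.0, 1.0],
                   [0.5, 1.5]])       # 4 mesh vertices, 2 shapes

def transfer(markers):
    """Fit blendshape weights to the low-res capture, then
    re-express the same expression on the high-res mesh."""
    w, *_ = np.linalg.lstsq(B_low, markers, rcond=None)
    return B_high @ w

markers = np.array([0.2, 0.8, 0.5])   # observed low-res displacements
print(transfer(markers))              # → [0.4 1.6 1.  1.3]
```

The production version reportedly layers machine learning on top of this kind of retargeting to recover detail the sparse capture misses, which is where the "realistic, emoting" quality comes from.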
Darren Hendler, head of Digital Humans at Digital Domain, said NVIDIA RTX technology is being used to drive all of their real-time ray-traced digital human projects.
Behind the scenes, Netflix has developed AI-powered recommendation engines using NVIDIA RTX technologies to provide customers with personalized content based on their viewing history. That's certainly useful for someone like me who watches anything and everything on the service. Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths.
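To give a flavor of what "recommendations from viewing history" means, here is a toy item-based collaborative filter: recommend titles watched by users whose histories overlap with yours. This is a deliberately simplified sketch with invented data; Netflix's real system is vastly more sophisticated.

```python
from collections import Counter

# Invented viewing histories for illustration only.
histories = {
    "ana":  {"The Irishman", "Avengers: Endgame"},
    "ben":  {"The Irishman", "Marriage Story"},
    "cara": {"Avengers: Endgame", "Marriage Story"},
}

def recommend(user):
    """Rank unseen titles by how many taste-overlapping users watched them."""
    seen = histories[user]
    scores = Counter()
    for other, titles in histories.items():
        if other != user and seen & titles:   # shared at least one title
            scores.update(titles - seen)      # credit their unseen titles
    return [title for title, _ in scores.most_common()]

print(recommend("ana"))  # → ['Marriage Story']
```

Both of Ana's taste-neighbors watched Marriage Story, so it tops her list; real systems replace this counting with learned embeddings and ranking models, but the shape of the problem is the same.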
Lucasfilm’s ILMxLAB, Magnopus, and Digital Domain will be sending some great speakers to NVIDIA’s GPU Technology Conference in San Jose, March 23-26. (Early pricing ends Feb 13.)