The journey to making the upcoming film Gods of Mars changed course dramatically once NVIDIA RTX Real-Time Rendering and virtual production entered the picture.
Peter Hyoguchi’s movie, currently in production, tells the story of a fighter pilot who leads a team against rebels in a battle on Mars, a planet now filled with cities after decades of terraforming. The film mixes cinematic VFX with live-action elements to bring its intergalactic scenes to the big screen.
The production pivoted once the team saw the power of real-time NVIDIA RTX graphics and Unreal Engine, combined with a large LED virtual-environment background on a soundstage. Lucasfilm crews continue to use this approach on The Mandalorian, which I wrote about in November last year, and Marvel Studios will be using it on Taika Waititi’s Thor: Love and Thunder.
There is a fascinating backstory to Hyoguchi’s passion for film-making. Going back a little further, Peter’s mother worked for Hanna-Barbera as an animator on Scooby-Doo and The Flintstones. “I’d watch the Saturday morning cartoons, and then she’d teach me how to do it,” he says. “She took me through art history, art theory, and well, she was my first teacher.”
Then, when the family moved to Marin County in California, one of his neighbours was a VFX Supervisor at Industrial Light & Magic. So Hyoguchi had a front-row seat observing animatronics and stop-motion; as he says, he “got to meet everyone and learn everything,” just as computer effects began to take over movie-making. His mentor, Chris Evans, was Head of the Matte department at ILM, and teamed up with John Lasseter to make a movie we all remember: Young Sherlock Holmes.
“As a young kid, I was there during production,” adds Peter. “Just as I was maturing as a film-maker, CGI was becoming more available.” Cut ahead to a year ago, and Hyoguchi was planning to make his own space-based sci-fi film with practical models and miniatures. But when he saw a demo of Unreal Engine on a SmartStage, he could finally visualise his production workflow. He applied for a MegaGrant from Epic Games, and he was on his way. Pivoting from physical model assets to digital ones opened many budget-saving options for the seasoned creative team.
Hyoguchi and producer Joan Webb used the Epic MegaGrant to bring together experienced VFX professionals and game developers to create the film. The virtual production started with scanning the miniature models and animating them in Unreal Engine with real-time ray tracing. Hyoguchi sees Unreal Engine as a total pivot point for cinema, and thinks everyone will use this technology for movie effects for the foreseeable future. He pulled together a collection of high-end technology providers, each eager to show off their creative abilities using Unreal Engine. Producing rich, photorealistic worlds in 4K with a combination of NVIDIA Quadro RTX 6000 GPU-powered Lenovo ThinkStation P920 workstations, ASUS ProArt Display PA32UCX-P monitors, Wacom Cintiq Pro 24s, Blackmagic Design cameras and DaVinci Resolve was a coup for the film-maker.
So, right now, they are building and delivering all the assets, sets and environments in preproduction, ready for the live-action character shoot days of Gods of Mars. These are supported by LED walls displaying real-time rendered graphics from Unreal Engine: actors are filmed on set with a virtual background projected behind them.
To keep the set minimal, the team only builds what actors will physically interact with, and then they use the projected environment from Unreal Engine for the rest of the scenes.
One big advantage of working with digital environments and assets is real-time lighting. With Unreal Engine, Hyoguchi can have scenes ray-traced immediately, with lights, shadows and colours in place. Previously, the assets would appear as grey boxes with minimal form and no lighting or texture. Now Hyoguchi can move around the environment and see how everything will look in the scene, saving weeks of pre-planning.
“This project collaboration between Hyoguchi and all of these providers,” says Richard Kerris, GM of Media and Entertainment at NVIDIA, “is all about helping the creatives using the best technology they can find. We all want to make movies in a new and exciting way.” Kerris was close to Peter at ILM in the early days, working across the way at Pixar Animation Studios, and later ran ILM for a few years. “Anytime there are customers wanting to go places creatives haven’t been before, we take a real keen interest and want to be a part of that.”
“Ray tracing is what makes this movie possible,” says Hyoguchi. “With NVIDIA RTX and the ability to do real-time ray tracing, we can make a movie with low cost and fewer people, and yet I still have the flexibility to make more creative choices than I’ve ever had in my life.”