Netflix BLACK MIRROR serves up Metalhead, a black & white horror like no other!
Anyone who has lost themselves in Black Mirror, the dystopian anthology series on Netflix, has perhaps seen the story in Season 4 Episode 5, titled Metalhead. While the monochrome nods to Fellini, the Twilight Zone-inspired shattering horror storyline is pretty gripping. But what really caught my attention was the metallic black dog featured throughout the chase. As soon as I saw the show, I got in touch with Mike Bell, the VFX Supervisor for the episode, based at the DNEG VFX studio in London.
Last month the DNEG TV team took home the BAFTA Craft Award in the ‘Special, Visual and Graphic Effects’ category at a ceremony in London. This is the fifth award that DNEG has won in the last few months, having previously been honoured with an Oscar and a BAFTA for its work on Blade Runner 2049, and two Visual Effects Society awards, for Blade Runner 2049 and Christopher Nolan’s Dunkirk.
DNEG TV’s Black Mirror team created a photo-realistic quadruped robot, built from technology that could conceivably exist in the near future. The team created over 300 shots for the episode, including 200 shots involving bespoke character animation.
First up, I asked Mike Bell what research the team did into the form of the dog: how it walked, ran, balanced and jostled about on the ground. And yes, they used Boston Dynamics as a reference, along with Festo Robotics, who base their designs on real-world animals and have some genuinely interesting reference material available on the web.
The Metalhead dog has a gun-metal skin finish that makes it look battle-ready. It was modelled and rigged in Maya and textured in Mari. The final model comprised 1,500 separate parts, including nuts, bolts, cables and rivets.
“The design had to feel utilitarian so we were careful not to add superfluous detail,” explains Mike Bell. “Every component had a purpose. The rotary motor housings in the shoulders, pneumatic pistons and air hoses driving the legs, shock absorbers damping the feet, internal fans and vents cooling the body compartment.”
The DNEG crew knew they were going to be getting very close to the robot so the texturing and look development stage was very important. They developed levels of damage ranging from surface scratches and dents, all the way to almost complete destruction. “In fact, all the texturing was painted bespoke and nothing procedural was used,” he adds.
As far as animation was concerned, the differences between animating a robot dog and animating an organic character were a real challenge. “More often than not when working with CG characters, our animators will use real world animals as a reference or at least base its movement on something living,” explains Bell.
“This was totally different! We studied hours of videos from Boston Dynamics & Festo Robotics. What was strange was when you looked at the way those robots moved it sometimes looked like bad animation! There were little micro movements that you wouldn’t normally notice. Little pops and twitches as it’s adjusting its balance or going into a different mode. In the end it was those moments that really made it work and made it creepy.”
With such a quick turnaround and quite a high shot count, Mike Bell knew they needed ways to streamline the animation process. The DNEG animation team developed generic walk and run cycles along with a standing up mode and a relaxed dormant state. “We also used an in-house animation path tool for our first pass animation,” Bell explains. “It meant that we could draw a path along the LIDAR scanned terrain and the robot would follow that path from A to B. Its feet would stick to the ground and it would move over lumps and bumps, rocks, etc. This meant we didn’t waste any time on the early animation blocking stage and that we could do multiple iterations very quickly. Our animators could then go in, clean it up and add all the shot to shot bespoke animation layers that really made it pop.”
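DNEG's path tool is in-house and proprietary, but the core idea Bell describes, drawing a path over scanned terrain and having the robot's footfalls stick to the ground, can be sketched roughly. In this hypothetical version, `terrain_height` stands in for a lookup into the LiDAR-scanned mesh, and the drawn path is resampled into evenly spaced, terrain-projected foot targets:

```python
import numpy as np

def terrain_height(x, y):
    """Stand-in for the LiDAR-scanned terrain: ground height at (x, y).
    In production this would be a raycast or lookup into the scanned mesh."""
    return 0.2 * np.sin(x) + 0.1 * np.cos(2 * y)

def path_to_footfalls(waypoints, step_length=0.5):
    """Resample a hand-drawn 2D path into evenly spaced footfall targets,
    projecting each one down onto the terrain so feet stick to the ground."""
    waypoints = np.asarray(waypoints, dtype=float)
    # Cumulative arc length along the drawn path.
    seg = np.linalg.norm(np.diff(waypoints, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    # Evenly spaced samples along that arc length.
    samples = np.arange(0.0, dist[-1], step_length)
    xs = np.interp(samples, dist, waypoints[:, 0])
    ys = np.interp(samples, dist, waypoints[:, 1])
    # Project each sample onto the terrain (lumps, bumps, rocks, etc.).
    zs = np.array([terrain_height(x, y) for x, y in zip(xs, ys)])
    return np.stack([xs, ys, zs], axis=1)

path = [(0, 0), (2, 1), (4, 0), (6, 2)]   # the drawn A-to-B path
footfalls = path_to_footfalls(path)
```

Blocking generated this way gives animators a plausible first pass in seconds, leaving their time free for the bespoke, shot-specific layers on top.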
During the shoot, the DNEG crew were very thorough in getting their on-set reference data, taking many 360-degree HDRIs of every location. “We took new ones every hour as the lighting was changing so much, especially in the scene in the woods,” said Bell. “We also LiDAR scanned every single location, so the combination of those meant that our CG could be lit very accurately.”
From a creative perspective, black and white presented a bit more of a lighting challenge. With no colour cues to help separate the robot from the background, DNEG had to focus on the use of light and shade to frame the action and draw attention to the story beats of a shot. “We tried to make the black and white workflow as seamless as possible for the lighters,” said Bell. “HDRIs were captured in colour as usual but then a monochrome LUT was automatically applied in our lighting software to match the CG to the black and white background plates.”
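The production LUT would bake in the show's specific grade, but at its simplest, collapsing a colour HDRI to monochrome means weighting the RGB channels by their perceived luminance. A minimal sketch using the standard Rec. 709 luma weights (an assumption; the actual LUT used on the show is not specified):

```python
import numpy as np

# Rec. 709 luma weights: a standard way to collapse linear RGB to luminance.
LUMA = np.array([0.2126, 0.7152, 0.0722])

def to_monochrome(rgb_image):
    """Convert a linear RGB image (H x W x 3) to luminance, then tile it
    back to three identical channels so downstream tools still see RGB."""
    y = rgb_image @ LUMA
    return np.repeat(y[..., None], 3, axis=-1)

hdri = np.random.rand(4, 4, 3)  # stand-in for a captured colour HDRI
mono = to_monochrome(hdri)
```

Applying the conversion automatically inside the lighting software, as Bell describes, means lighters never have to think about the colour-to-monochrome step shot by shot.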
One issue that came up was due to the very narrow shutter angle the episode was shot with. There was very little motion blur on the plates, which looked great and really added to the visceral energy of the episode. But rendering the robot without motion blur and compositing it into those plates made the CG look like stop motion! “In the end, we cheated a motion blur into the CG and even though it didn’t match the plate, it was much more pleasing to the eye.”
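The reason a narrow shutter angle kills motion blur is simple arithmetic: the shutter angle sets what fraction of each frame interval the exposure lasts. The episode's actual shutter angle isn't stated, so the narrow value below is purely illustrative:

```python
def shutter_open_time(shutter_angle_deg, fps):
    """Exposure time per frame: (angle / 360) * (1 / fps)."""
    return (shutter_angle_deg / 360.0) / fps

# A standard 180-degree shutter at 24 fps exposes each frame for half
# the frame interval; a hypothetical narrow 45-degree shutter exposes
# for only a quarter of that, so moving objects smear far less and
# every frame reads crisp and staccato.
standard = shutter_open_time(180, 24)  # 0.5 / 24 s per frame
narrow = shutter_open_time(45, 24)     # a quarter of the standard exposure
```

That crispness is exactly what reads as stop motion when sharp-rendered CG sits in a live-action plate, which is why DNEG chose to cheat blur back into the robot.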
“Overall it was a really exciting project to work on,” added Bell. “It’s quite rare to work on such a unique show and get to do such original and unique work! I’m incredibly proud of what we achieved and to be part of such an amazing team of artists and production crew at DNEG!”