By Bryan Bishop
Translating the sci-fi classic Ender’s Game to the big screen required tackling iconic locations and sequences in a way that honored fan expectations while still playing to audiences that were new to the material. To accomplish that goal, writer-director Gavin Hood joined forces with Digital Domain, the visual-effects powerhouse originally founded by James Cameron and Stan Winston.
In the years since its inception, Digital Domain has worked on a wide array of projects, handling everything from Titanic to the remarkable reverse-aging Brad Pitt in The Curious Case of Benjamin Button. But when it came to Ender’s Game, no single location proved a more complex challenge than the famed Battle Room.
Concept art
A zero-g environment in which Ender and his fellow cadets play mock war games, the Battle Room in the novel is nothing more than an enormous black cavern. Hood felt the lack of flair wouldn’t play as well in a film, so he opted for something with, in his words, “more visual pop”: a giant, geodesic glass sphere, with Earth and the stars waiting just outside its protective walls. The benefit for theatrical audiences, Hood says, was obvious. “What would it be like to actually jump out into space, without having to wear a spacesuit? And see, almost, what you would see if you were literally floating in space?”
Given its vast size and breathtaking vistas, the Battle Room turned every shot set inside it into a task for the visual effects team, accounting for around a third of the film’s 941 visual effects shots. Equally important to creating the environment was selling the illusion of weightlessness as Ender (Asa Butterfield) and his schoolmates float, glide, and artfully drift in zero g.
Visual effects supervisor Matthew Butler (Transformers: Dark of the Moon) met with stunt coordinator Garrett Warren early on to determine how they could get as close as possible to capturing the look during principal photography itself. The Battle Room sequences were shot on a massive green-screen stage built in a NASA facility in New Orleans. The actors were hung from wire rigs; by working in concert with the wire operators, they were able to roughly approximate zero-gravity physics. That said, Butler and Warren both knew it wouldn’t be enough on its own.
AUTOMATED SOFTWARE CONFORMED THE MOVEMENT TO ZERO-G PHYSICS
“In zero gravity your center of mass can only either move in a straight line or be still, unless there’s a force acting on it,” Butler explains, whereas on Earth we’re always fighting against gravity, even while hanging from wire rigs. To deal with the problem, a member of the visual effects team would take a given shot and hand-animate keyframes, replicating the movement, body positioning, and performance of the actor with a virtual computer-generated puppet. With that element in hand, they could study the motion and pinpoint where it deviated from authentic zero-g behavior.
Digital Domain could then turn to an automated software-based solution, Butler says. “We built tools that, assuming certain densities of the body, could compute the ever-changing center of mass based on summing up all the different parts of the limbs and their positions in three-dimensional space.” The tools would then correct the motion of the virtual puppet to fall in line with what real-world weightlessness would dictate. The live-action elements were then mapped onto the motion-corrected puppet. “We always tried to get as close as possible, but often we would have to only keep the facial performance of the photography, and then create everything else in the computer,” he says.
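In code terms, the correction Butler describes boils down to two steps: compute the puppet’s mass-weighted center of mass on every frame, then shift the whole body so that path becomes a straight, constant-velocity line. Here is a minimal sketch of that idea in Python with NumPy; the function names, data layout, and assumed segment masses are illustrative, not Digital Domain’s actual tools.

```python
# A minimal sketch of the center-of-mass correction described above.
# The data layout and segment masses are hypothetical assumptions.
import numpy as np

def center_of_mass(joints, masses):
    """Mass-weighted average of body-segment positions for one frame.

    joints: (n_segments, 3) array of segment positions
    masses: (n_segments,) array of assumed segment masses
    """
    return (joints * masses[:, None]).sum(axis=0) / masses.sum()

def correct_to_zero_g(frames, masses):
    """Shift each frame so the COM follows a constant-velocity line.

    frames: (n_frames, n_segments, 3) hand-animated puppet positions.
    In zero g the center of mass may only move in a straight line at
    constant speed (absent external forces), so we replace the animated
    COM path with a least-squares linear fit and offset the body to match.
    """
    com = np.array([center_of_mass(f, masses) for f in frames])
    t = np.arange(len(frames), dtype=float)
    coeffs = np.polyfit(t, com, 1)            # (2, 3): per-axis velocity, offset
    target = np.outer(t, coeffs[0]) + coeffs[1]
    offset = target - com                     # per-frame correction vector
    return frames + offset[:, None, :]        # pose untouched, trajectory fixed
```

In this simple version only the trajectory is corrected and the pose is left alone; as Butler notes, production shots often kept little more than the actor’s facial performance and rebuilt the rest digitally.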
ANIMATED CG FIGURES GOT THE SAME TREATMENT
Scenes often incorporated purely digital characters in the background, and the motion-correction technique proved so effective that Digital Domain even used it on those animated figures. “If you give a very talented artist the task of, ‘Okay, have this guy flail around like a crazy person,’ they’ll do a great job, but they are not constrained by any physics here,” he says. Correcting the center of mass for the animated figures gave them the best of both worlds: the creativity of the animator and the realism provided by the software tools.
Modeling real-world physics is often preferable to animation, Butler explains, because people are so familiar with the nuances of motion. “I believe it’s a survival technique,” he says. “Back in the day, when there’s a lion behind you, you pick up on motions very quickly. You can spot the gait of a person that you know” — he snaps his fingers — “very quickly. We’re just tuned that way.”
“When we see a ball drop and bounce, we know what’s quote-unquote ‘right.’ And if it doesn’t behave the way physics describes, we’ll go, ‘That’s wrong.’” The concept manifests itself in multiple ways throughout Ender’s Game, and the industry at large. When a Formic ship is destroyed in a Simulation Cave sequence, the explosion isn’t animated; it’s simulation software modeling the physics of an explosion. “As a result you see the overwhelming splendor of something that, to you, just looks right — and you go, ‘wow.’ So we always lean on that.”
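The difference between animating and simulating is easy to see in miniature: instead of keyframing where each piece of debris should go, you set initial conditions and let the integrator decide every subsequent position. The toy Python sketch below illustrates that idea under stated assumptions; it bears no resemblance to a production explosion solver.

```python
# A toy illustration of "simulate, don't animate": burst debris follows
# Newtonian motion from initial conditions rather than hand-set keyframes.
# Particle count, velocities, and damping are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, dt, steps = 500, 1.0 / 24.0, 48           # particles, timestep, frames

pos = np.zeros((n, 3))                        # all debris starts at the hull
vel = rng.normal(size=(n, 3)) * 8.0           # random outward burst velocities
damping = 0.1                                 # crude, purely artistic energy loss

frames = []
for _ in range(steps):
    vel *= 1.0 - damping * dt                 # integrate simple damping
    pos = pos + vel * dt                      # Euler step: physics sets the path
    frames.append(pos.copy())                 # positions handed to the renderer
```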
“YOU NEED PEOPLE THAT ARE A COMBINATION OF ART AND SCIENCE.”
But despite the allure of simulations and computer-calculated physics, Butler says it still takes an artistic hand to guide the process. When it came to the movement of the Formic fleet, for example, Hood wanted to mimic the swarm-like murmurations of migrating starlings. “You don’t go out and buy a flocking pattern software that does organic flocking for Formics, you know?” Butler laughs. “You buy some software that would allow you to train it to do that. And so that’s why you need to have that technical capability to be able to steer the application specifically for what you want, but you also need the artistic backbone to be able to make sure that it looks beautiful at the end of the day. Which is why you need people that are a combination of art and science.”
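The starling-like behavior Hood wanted is the classic territory of boids-style flocking, where each agent steers by a few local rules rather than a choreographed path. Below is a minimal Python sketch of those rules; the weights, neighbor radius, and update scheme are illustrative assumptions, not the trained software Butler describes.

```python
# A minimal boids-style flocking sketch in the spirit of starling
# murmurations. Rule weights and radius are hypothetical tuning knobs.
import numpy as np

def flock_step(pos, vel, dt=1.0 / 24.0, radius=5.0,
               w_coh=0.5, w_sep=1.5, w_ali=0.8):
    """One update of Reynolds' three classic rules.

    pos, vel: (n_agents, 3) arrays of positions and velocities.
    """
    new_vel = vel.copy()
    for i in range(len(pos)):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        near = (dist < radius) & (dist > 0.0)
        if not near.any():
            continue
        coh = pos[near].mean(axis=0) - pos[i]     # steer toward neighbors' center
        sep = (pos[i] - pos[near]).sum(axis=0)    # steer away from crowding
        ali = vel[near].mean(axis=0) - vel[i]     # match neighbors' heading
        new_vel[i] += (w_coh * coh + w_sep * sep + w_ali * ali) * dt
    return pos + new_vel * dt, new_vel
```

The artistic hand Butler insists on lives in exactly these knobs: tuning the weights and neighborhoods until the swarm reads as organic rather than mechanical.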
Ender’s Game is now playing in theaters.