Right now, every industry faces discussions about how artificial intelligence might help or hinder work. In the film industry, creators are concerned that their work might be stolen to train AI replacements, that machines might take their future jobs, or even that the entire process of filmmaking could become fully automated, removing the need for everyone from directors and actors to the crews behind the scenes.
But “AI” covers far more ground than ChatGPT and Sora, the kinds of publicly accessible tools that crop up on social media. For visual effects artists, like those at Wētā FX who worked on Kingdom of the Planet of the Apes, machine learning can be just another powerful tool in an artistic arsenal, used to make movies bigger and better-looking than before. Kingdom visual effects supervisor Erik Winquist sat down with Polygon ahead of the movie’s release and discussed the ways AI tools were essential to making the movie, and how the limitations of those tools still make the human element key to the process.
For the making of Kingdom of the Planet of the Apes, Winquist says some of the most important machine-learning tools were called “solvers.”
“A solver, essentially, is just taking a bunch of data — whether that’s the dots on an actor’s face [or] on their mocap suit — and running an algorithm,” Winquist explains. “[It’s] trying to find the least amount of error, essentially trying to match up where those points are in 3D space, to a joint on the actor’s body, their puppet’s body, let’s say. Or in the case of a simulation, a solver is essentially taking where every single point — in the water sim, say — was in the previous frame, looking at its velocity, and saying, ‘Oh, therefore it should be here [in the next frame],’ and applying physics every step of the way.”
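Winquist’s simulation example maps onto a simple numerical loop. Here’s a rough, hypothetical sketch in Python (plain forward Euler integration with nothing but gravity, nowhere near a production water solver) of the “look at each point’s velocity, apply physics every step” idea he describes:

```python
# A toy version of the simulation-style solver Winquist describes: move
# every point forward from where it was in the previous frame using its
# velocity, then apply physics (here, only gravity) at every step.
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # meters per second squared, on the y axis

def step(positions, velocities, dt):
    """Advance one frame of the simulation."""
    positions = positions + velocities * dt   # "therefore it should be here in the next frame"
    velocities = velocities + GRAVITY * dt    # "applying physics every step of the way"
    return positions, velocities

# Three example points with made-up starting velocities.
pos = np.zeros((3, 3))
vel = np.array([[1.0, 3.0, 0.0], [0.5, 2.0, 0.0], [2.0, 4.0, 1.0]])

for frame in range(24):  # one second of motion at 24 frames per second
    pos, vel = step(pos, vel, dt=1 / 24)
```

A real water solver layers pressure, viscosity, and collision forces on top of this, but the frame-by-frame stepping structure is the same.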
For the faces of Kingdom’s many ape characters, Winquist says the solvers might manipulate digital ape models to roughly match the actors’ mouth shapes and lip-sync, giving the faces the rough creases and wrinkles you might expect to form with each word. (Winquist says Wētā originally developed this technology to map Josh Brolin’s Thanos performance onto a digital model in the Avengers movies.) After a solver works its magic, the Wētā artists get to work on the hard part: taking the images the solver started and polishing them until they look perfect. This is, for Winquist, where the real artistry comes in.
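The “least amount of error” side of a solver can be illustrated as a least-squares problem: find the rig parameters that best explain where the tracked dots ended up. Here’s a toy sketch of that idea using a linear blendshape model, with every shape and number invented for illustration (Wētā’s actual face pipeline is far more involved):

```python
# Toy least-squares face solve: given observed marker positions and a
# linear blendshape model (a neutral face plus weighted offsets), find
# the blendshape weights that minimize the matching error.
import numpy as np

rng = np.random.default_rng(0)
n_markers, n_shapes = 12, 4

neutral = rng.normal(size=(n_markers, 3))            # neutral marker positions
deltas = rng.normal(size=(n_shapes, n_markers, 3))   # how each shape moves the markers

# Pretend these weights are the actor's performance on one frame.
true_weights = np.array([0.8, 0.1, 0.0, 0.4])
observed = neutral + np.tensordot(true_weights, deltas, axes=1)

# Flatten into a standard least-squares system A @ w ~= b and solve it.
A = deltas.reshape(n_shapes, -1).T                   # (3 * n_markers, n_shapes)
b = (observed - neutral).ravel()
weights, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.round(weights, 3))  # recovers [0.8, 0.1, 0.0, 0.4]
```

With noisy real-world tracking data the recovery is only approximate, which is exactly why the animators’ polishing pass matters.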
“It meant that our facial animators can use it as a stepping-stone, essentially, or a trampoline,” Winquist explains with a laugh. “So [they can] spend their time really polishing and looking for any places where the solver was doing something on an ape face that didn’t really convey what the actor was doing.”
Instead of having to painstakingly create every single lip-sync and facial tic, the artists focus their time on crafting the depths of emotional nuance that the solver couldn’t handle. This lets them do more careful and detailed work than might have been possible when those earlier stages had to be done by hand.
While most of the AI tools that have garnered concern online are trained by scraping thousands of pieces of art posted to the internet by artists who never gave permission for that use, subsuming their styles and elements to build a vocabulary, Wētā’s tools are trained in-house, solely on the studio’s own work, according to Winquist.
“There’s so much gray area around copyright ownership, and ‘Where do they scrape all this information from?’ Right?” Winquist says. “The places where we’re using machine-learning in our pipeline, the algorithm essentially is being trained on our information. It’s being trained on our actors, it’s being trained on the imagery that we’re feeding it, that we’ve generated. Not imagery from wherever.”
The solver tools, which Winquist and his team have built and refined across each of Wētā’s movies, enable the studio to take on more ambitious projects and scenes than it could have in the past. For instance, the massive set piece at the climax of Kingdom of the Planet of the Apes is a scene Winquist isn’t sure they could have done without the team’s newest generation of water-solver software. Those tools, unsurprisingly, were refined during production on Avatar: The Way of Water. Winquist notes that the work was a step up from the water scenes in War for the Planet of the Apes, the previous film in the franchise.
“We would have struggled,” Winquist says. “I would say if we had not done those previous films, there would have been a big push in R&D to get us up to scratch.”
Winquist goes on to describe the daunting complexity of Kingdom of the Planet of the Apes’ various water scenes: One takes place on a bridge over a raging river, and another involves a massive flood of ocean water. According to Winquist, Way of Water’s water solver was key to getting these scenes off the ground, because it allowed the effects artists to simulate how the water would respond to elements in the scene, like the ape characters and their hairy bodies, without fully rendering those scenes in a computer.
This gave art directors a chance to tweak a scene before rendering began, making quick adjustments possible. Previously, computers might have needed nearly three days to fully render all the CG details in a scene before the effects team could watch the results, tweak the algorithms, and start the process over again. That turnaround made this kind of CG scene virtually impossible to refine the way the Wētā team could on this movie.
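In code terms, the workflow resembles running the same solver at a coarse preview resolution to judge the motion quickly, and only paying for the expensive full-resolution pass once everyone is happy. This sketch is purely illustrative; the function, particle counts, and timings are invented, not Wētā’s pipeline:

```python
# Hypothetical preview-before-render loop: simulate cheaply first, render later.
import numpy as np

def simulate_water(n_particles, n_frames):
    """Stand-in for a water solver: returns particle positions per frame."""
    rng = np.random.default_rng(42)
    pos = rng.uniform(size=(n_particles, 3))
    vel = np.zeros_like(pos)
    frames = []
    for _ in range(n_frames):
        vel[:, 1] -= 9.81 / 24        # gravity applied per frame at 24 fps
        pos = pos + vel / 24          # advance each particle by its velocity
        frames.append(pos)
    return np.stack(frames)

# Fast preview: few particles, quick to compute, enough to judge the motion.
preview = simulate_water(n_particles=10_000, n_frames=48)

# Final pass: orders of magnitude more particles, run only after sign-off.
# final = simulate_water(n_particles=50_000_000, n_frames=48)
```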
But for all the ways AI technology proved essential on Kingdom, Winquist still sees it as nothing more than an empty tool without the artists who guide it and pin down the finished product.
“I don’t want to overshadow the human element,” Winquist says. “I’ve seen approaches where the facility has really gone in deep on the machine-learning side of things, and it feels like that’s where it stopped. I see the result of that, and I just don’t believe that voice coming out of that face or whatever. The thing I find we’re so successful with is that the artists are ultimately driving [the finished scenes], using the machine-learning stuff as a tool, instead of the answer to the problem.”
For Winquist, that human element will always be key to producing something great and artistically interesting.
“Ultimately, [machine-learning tools] can only regurgitate what they’ve been fed,” he says. “I don’t know. Maybe I’m a Luddite, but I just don’t know if there’s ever a point where that stuff can meaningfully make [a movie] truly engaging — a piece of art that somebody is going to want to actually sit down and watch. The thing I always go back to is, Why should I bother to read something that somebody couldn’t be bothered to write? Whether it’s words or whether it’s images.”
Kingdom of the Planet of the Apes debuts in theaters on May 10.