The Los Angeles Times lands on the doorsteps (and the desktops) of the film industry, and one of the paper's ongoing features is Working Hollywood, which delves into the workaday life of the movie community. Here's one of those features, appearing at the Hero Complex because of the film in question: Tim Burton's "Alice in Wonderland," a movie we've covered extensively.
When Steve Chapman was growing up in San Jose, he used to build his own "Star Wars"-inspired toys out of plastic containers. Now he's part of a team that creates digital models for films such as "Alice in Wonderland."
“What we do is cyber-scan the actors, the sets, the cars and whatever else needs to become a computer model for the visual effects,” Chapman said. “Usually, this involves lasers.”
Chapman made the leap from plastic jugs to lasers by way of a degree in industrial design from UCLA. When he graduated in the early 1990s, cyber-scanning was still a twinkle in its inventor’s eye, so he took a job with a Japanese company that did industrial design for everything from car radios to computers.
“Once I got to the point where I felt confident that I could draw any kind of computer part or car part, I began to think about what would be the next level, just to advance myself,” he said. “I came to the conclusion that it was to draw organic things like human beings or animals in the computer.”
So in 1998, Chapman joined Gentle Giant Studios, which has done effects for film franchises including “Star Wars,” “Harry Potter,” “Lord of the Rings” and “Pirates of the Caribbean.”
“I bet you could not name a movie that we weren’t involved with actually,” he joked. “Anything where you’re going to have an actor who has to be also a virtual actor at some point in the movie, they’ll call us up.”
Six Impossible Things Before Breakfast
One scanner does not fit all. “We use about a dozen different scanning systems that will shoot some sort of light at an object and take measurements to make an X, Y, Z version of it,” said Chapman. “Some of them use triangulation. The ones that do the larger things like pirate ships and movie sets will actually shoot out laser pulses and then measure the speed of them by counting how long it takes for them to reflect back. But no one scanner can do everything. So I can’t scan Johnny Depp with a device and then turn around and scan his pirate ship with the same device.”
The computers that process this information can’t be bought off the shelf in the Mac store. “Light is traveling at 186,000 miles per second, so it wasn’t until very recently that computers were able to keep up with that and actually measure something that quickly,” said Chapman. “The computers we use cost in the hundreds of thousands of dollars. Most of them have been developed for things like, for example, if the oil industry needs to measure a refinery to make a CAD [computer-aided design] model to fit pipes in, or actually measure the ground if they’re going to be drilling. So we just co-opted all of these different, weird technologies and applied them to making movies.”
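The arithmetic behind the laser-pulse scanners Chapman describes is straightforward, even if the hardware that performs it fast enough is not. A hypothetical sketch (not Gentle Giant's actual software) of the time-of-flight idea: fire a pulse, time the round trip, and convert that delay into a distance using the known speed of light.

```python
# Illustrative time-of-flight calculation, as described in the interview.
# The scanner itself does this millions of times per second in hardware;
# this is just the underlying math.

C = 299_792_458.0  # speed of light in m/s (roughly 186,000 miles per second)

def distance_from_round_trip(delay_seconds):
    """Distance to the surface, given the pulse's out-and-back travel time."""
    # Halve the total path, because the pulse travels out and then back.
    return C * delay_seconds / 2

# Example: a set piece 30 meters away reflects a pulse in about 200 nanoseconds.
delay = 2 * 30.0 / C
print(round(distance_from_round_trip(delay), 6))  # → 30.0
```

The nanosecond-scale delays involved are why, as Chapman notes, only recently could off-the-shelf-adjacent hardware time the pulses precisely enough to be useful.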
Reeling and Writhing
When scanning actors’ faces for “Alice in Wonderland,” Chapman and his colleague Brian Sunderlin were capturing more than just their physical features. “The key thing we provided for ‘Alice’ was the ability to have the actor become the computer model and actually have the person’s features and performance built into the computer animation,” explained Chapman. “So when the character’s doing something that the physical actor can’t possibly do — such as transmogrifying or morphing into another shape — it’ll still be the actor making that happen underneath all the 3D visual effects. [The key is] facial action coding, a system that was developed by Paul Ekman. He’s the guy who’s the basis for the character in the TV show ‘Lie to Me.’ He has a system for figuring out how humans communicate beyond just with their voices, and how their facial expressions and muscles will react when they’re feeling something or trying to communicate or express something. So in many cases with a movie, we’ll actually do what’s called a coding session, where we go through all of the different possible iterations of expression and record how that actor will do them. There are actually hundreds of them, but we narrowed them down to a few dozen that are critical to most performances. It can get tedious after a while, but the actor will sit there and do certain lines that are meant to emote certain feelings — like joy or contentment or anger — and we’ll record that in 3D.”
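The coding sessions Chapman describes rest on Ekman's Facial Action Coding System (FACS), which scores any expression as a combination of numbered "action units," one per distinct facial muscle movement. A simplified sketch of that idea (the action-unit names are standard FACS; the emotion groupings are illustrative examples, not the studio's actual checklist):

```python
# A few real FACS action units (AUs); the full system defines dozens.
ACTION_UNITS = {
    1: "Inner Brow Raiser",
    4: "Brow Lowerer",
    6: "Cheek Raiser",
    12: "Lip Corner Puller",
    15: "Lip Corner Depressor",
}

# Simplified emotion prototypes for a coding-session checklist: each feeling
# the actor is asked to emote maps to a set of action units to record in 3D.
CODING_SESSION = {
    "joy": {6, 12},         # a Duchenne smile: cheeks raise, lip corners pull up
    "sadness": {1, 4, 15},  # inner brows raise and knit, lip corners drop
}

for emotion, aus in CODING_SESSION.items():
    moves = ", ".join(ACTION_UNITS[au] for au in sorted(aus))
    print(f"{emotion}: {moves}")
```

Narrowing "hundreds" of possible combinations down to a few dozen critical ones, as Chapman describes, keeps the session tractable for the actor while still giving the animators the muscle-level data they need.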
Off With Her Head!
The Queen of Hearts, played by Helena Bonham Carter, has a big head — literally. “They’ve gone really wild with making the entire shape of the actor’s head not human,” said Chapman. “And that’s a cross between using the actor to do the performance and having the computer modify her head into the new shape. But I don’t want to make it sound like a clinical, technical process. There really is a lot of the actor and her art involved.”
The Eighth Square
On “Alice in Wonderland,” Chapman met the inspiration for the homemade toys of his childhood. “A lot of the people I work with were influenced by the ‘Star Wars’ movies and the special effects in those, which is interesting because when we scanned Johnny Depp for ‘Alice in Wonderland,’ it was supervised by Ken Ralston, who was the guy who filmed all those models for George Lucas back in the 1970s,” said Chapman. “You really feel like you’re in the presence of a true pioneer, because a guy like Ken Ralston didn’t have computers and scanners. He just kit-bashed it all together.”
— Cristy Lytal