Digital Crowds
Game-style AI Controls Adapted For Behavioral Animation


RIOT Pictures' visual effects team created this image for Film & Video using its crowd-replication system, dubbed Armada, which is built on a new event-based crowd engine from Softimage. Concept design by Kenneth Nakada and Jason Barlow integrates live action, CG architecture, matte painting, and effects animation.
For films like The Lord of the Rings and Star Wars: Episode II Attack of the Clones, Wellington, NZ-based Weta and ILM, respectively, developed proprietary in-house systems to handle crowd animation. But Weta and ILM are the exceptions, not the rule: not every effects house can afford the huge overhead of maintaining a crowd of computer programmers to support an in-house system. Now commercial tools from Softimage and Alias|Wavefront partner BioGraphic Technologies are becoming available that give animators the ability to program behavioral animation in much the same way that characters in a video game are programmed, tools that may fundamentally change the way animators tackle digital crowd behavior.

When it comes to generating a photoreal digital crowd, the human eye is incredibly picky, and getting realistic crowd behavior can be a real challenge. It doesn't matter whether it's a crowd of people, a flock of birds, a swarm of ants or an assemblage of anthropomorphic aliens: viewers will zero in on the tiniest of errors, any movement that doesn't seem natural, a character that doesn't walk with enough weight or a group whose movements are just a little too uniform.

"The interesting thing that we’ve noticed in crowd animations and simulations is that you can have 300-400 different characters of different sizes and proportions, moving completely different, but if you have one character that is standing there and not moving, the eye immediately goes to it," said Jason Barlow, lead character animator, RIOT Santa Monica. "It’s the strangest thing. And projected on film, at a larger resolution than what we see on our monitor in the office, it’s three times worse."
Of course, if you need a giant battle scene, there’s always the option of hiring hundreds of extras, but that can get expensive, and when prosthetics and costumes are required, it can become almost impossible.

Another technique is to shoot smaller crowds and composite them together, or to augment them with digital extras. For the upcoming Atom Egoyan film Ararat, Toronto-based Mr. X was called on to create an army by replicating live-action extras and adding computer-generated soldiers. "We turned 200 extras in full Turkish regalia into an army of 5,000. It was a very technical setup. We shot the landscape on the large format film stock VistaVision — the same technique used in the beginning of Gladiator — shot the soldiers in quadrants, moved them around and composited them together with our digitally created combatants," explained Mr. X president Dennis Berardi.

Particle systems are another common technique. They are great at flocking-type behaviors, but interaction between different types of creatures with different types of behaviors, and in particular, battle scenes can be a real challenge.
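The flocking behaviors that particle systems handle well are often built on Craig Reynolds' classic "boids" rules. The sketch below is a minimal illustration of that idea, assuming simple 2D point agents; the class and parameter names are invented here, not drawn from any shipping crowd package.

```python
# Minimal boids-style flocking sketch: each particle steers by three local
# rules (cohesion, alignment, separation). Weights are illustrative only.
import math

class Boid:
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

def step(boids, neighbor_radius=5.0, dt=0.1):
    for b in boids:
        neighbors = [o for o in boids if o is not b and
                     math.hypot(o.x - b.x, o.y - b.y) < neighbor_radius]
        if not neighbors:
            continue
        n = len(neighbors)
        # Cohesion: steer toward the local center of mass.
        cx = sum(o.x for o in neighbors) / n - b.x
        cy = sum(o.y for o in neighbors) / n - b.y
        # Alignment: match the neighbors' average velocity.
        ax = sum(o.vx for o in neighbors) / n - b.vx
        ay = sum(o.vy for o in neighbors) / n - b.vy
        # Separation: push away from crowding neighbors.
        sx = sum(b.x - o.x for o in neighbors)
        sy = sum(b.y - o.y for o in neighbors)
        b.vx += 0.01 * cx + 0.05 * ax + 0.05 * sx
        b.vy += 0.01 * cy + 0.05 * ay + 0.05 * sy
    # Integrate positions after all velocity updates.
    for b in boids:
        b.x += b.vx * dt
        b.y += b.vy * dt
```

Because every particle follows the same local rules, the global flocking pattern emerges for free, which is exactly why particles struggle once different creature types need genuinely different, interacting behaviors.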

However, in the video game world, the idea of an army of humans battling an army of aliens isn't all that daunting. What's more, for games, it always has to be done in real time. Hence, there is increasing interest in the use of video game-like artificial intelligence tools to control crowd behavior. These tools enable artists to run simulations in which characters behave autonomously, rather than laboriously keyframing each one. The idea is that each character in a crowd can have its own 'brain' — a set of basic instructions that tells the creature how to behave under different circumstances, drawing from a library of motions for walking, running, fighting and so on, and blending them together.
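The per-character 'brain' described above can be sketched as a prioritized rule list that picks a clip from a motion library based on what the agent senses. Everything here, the rules, thresholds, and clip names, is invented for illustration, not taken from any of the systems discussed in this article.

```python
# Minimal sketch of a rule-based agent 'brain': sense the world, then map
# the situation to a motion clip. Thresholds and clip names are made up.
import math

MOTION_LIBRARY = ["idle", "walk", "run", "fight"]  # stand-ins for real clips

class Agent:
    def __init__(self, x, y, side):
        self.x, self.y, self.side = x, y, side
        self.action = "idle"

    def nearest_enemy_distance(self, agents):
        enemies = [a for a in agents if a.side != self.side]
        if not enemies:
            return float("inf")
        return min(math.hypot(a.x - self.x, a.y - self.y) for a in enemies)

    def think(self, agents):
        # The 'brain': prioritized rules mapping situations to motions.
        dist = self.nearest_enemy_distance(agents)
        if dist < 2.0:
            self.action = "fight"   # enemy in melee range
        elif dist < 10.0:
            self.action = "run"     # charge the nearest enemy
        elif dist < 30.0:
            self.action = "walk"    # advance cautiously
        else:
            self.action = "idle"    # nothing to react to
        return self.action
```

Running `think` for every agent each frame gives each crowd member its own decisions; varying the thresholds per character is one cheap way to break up the uniformity that viewers notice.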

Wellington, NZ-based Weta is largely seen as a pioneer of this approach, having used it to generate crowds for The Lord of the Rings trilogy. To tackle the huge battle scenes the films required, the company began developing its own proprietary system, called Massive, more than four years ago; creator Stephen Regelous calls it "a tool for the creation of artificial ecologies."

Each creature is programmed with a range of behaviors which draw from a huge database of motion capture data. A motion-blending engine within Massive is used to merge motions together.
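At its simplest, motion blending is weighted interpolation between poses. The sketch below assumes each motion clip is just a list of joint angles per frame; a production engine like Massive blends full motion-capture data with far more care, but the underlying idea is the same.

```python
# Minimal motion-blending sketch: a pose is a flat list of joint angles,
# a clip is a list of poses. Representation is illustrative only.

def blend(pose_a, pose_b, weight):
    """Linearly interpolate two poses; weight=0 gives pose_a, 1 gives pose_b."""
    return [(1.0 - weight) * a + weight * b for a, b in zip(pose_a, pose_b)]

def crossfade(clip_a, clip_b, frames):
    """Fade from the last pose of clip_a into the first pose of clip_b."""
    out = []
    for i in range(frames):
        w = i / (frames - 1) if frames > 1 else 1.0
        out.append(blend(clip_a[-1], clip_b[0], w))
    return out
```

A crossfade like this is what lets an agent switch from its 'walk' clip to its 'fight' clip without a visible pop, the kind of discontinuity the eye catches immediately.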

"What we are seeing is pretty realistic looking battle action, which I don’t think we would have achieved if we had taken a particle approach," said Regelous. "So building the system up from the ground is what we needed to be able to do to tackle those problems."

But Jon Labrie, who served as chief technology officer at Weta until recently, cautioned that, "We still have a ways to go. It turns out that it's at least as difficult to control an intelligent autonomous agent as it is to control a real actor. They don't necessarily do what you want them to do. So tools for choreography are just as important as anything else. And to some extent, the tools that you use for choreography are the same tools that you use for particles. So you find yourself falling back on some of the more traditional methodologies.

"Of course what you’re hoping for, and what we’re seeing to a large extent, is emerging behavior. But particle is inexpensive computationally and otherwise. And because particles are very directable, I think they will remain compelling for a long time to come," he added.

A similar system, called Armada, developed in conjunction with Softimage Special Projects, is in use at RIOT Santa Monica. RIOT first began to struggle with the problems of behavioral animation when the company was confronted with a large battle scene for the climax of The One.

"It was going to require a large prison compound set on another planet, which meant a vast scale," said Barlow. "At the same time it also meant an environment, which really couldn’t be realized through an actual set on a stage somewhere. They had originally planned to use a rock quarry up north of Los Angeles, but we immediately saw a lot of problems in terms of how do you control the elements like wind, a lot of dirt, etc, in terms of putting in tracking markers."

At that point, Barlow came across the Softimage RTK (runtime tool kit), which is based on the Intelligent Digital Actor technology from The Motion Factory, which Avid acquired in 2000. Originally developed as an engine for controlling the behavior of characters in video games, the technology was just beginning to be explored for use on feature films.

"Previously in game engines you weren’t able to run a very detailed character. It had to be a very simple polygon moving across the screen," said Barlow. "What we saw from Avid and Softimage really caught my eye. They came in with a laptop and presented this gaming engine, which runs in real time. And I’m like ‘that’s great, but let’s talk about film technology here. If I can have a simulation as a previsualization tool, as well as a tool that will give me a string of numbers or output an animation file, now we’re talking.’"

RIOT then enlisted the aid of Softimage Special Projects to build Armada on top of the RTK engine. Armada has since been used to generate crowds for The One, The Scorpion King and Queen of the Damned, as well as a Benadryl spot. While Armada is a proprietary in-house pipeline at RIOT, Softimage will be unveiling the underlying tool set at Siggraph as part of Softimage XSI version 3.0. The packaging (and the name of the system) had not been determined at press time.

"Behavioral animation is becoming much more important. Hiring thousands and thousands of extras is just no longer economically viable, and it's a lot easier to manage in a computer than it is to manage a bunch of people on a huge set. And compositing bits of crowds that you've filmed is not really practical either anymore," said Michael Smith, technical product manager, XSI.

Smith explained that RIOT had served as a beta site for the system. Piedmont, Quebec-based Hybride Technologies is another beta site, and used the RTK engine to create digital armies for the upcoming miniseries, Napoleon.

Differentiating Armada from the crowd animation tool set that Softimage will release at Siggraph, Smith explained, "What we gave RIOT was the real time library, which is called RTK, and the tools to integrate that library into their pipeline — .xsi file formats, file I/O libraries, some SDK tools, scripts and that kind of thing.

"RIOT was able to go ahead and build that into their particular flavor that they call Armada. We’ve been taking the feedback and building that into the artist’s tool. This will be our first iteration at Siggraph," Smith added.

Source: Film & Video



Copyright © 2004 PBI Media, LLC. All rights reserved.
