When it comes to trying to generate a photoreal digital crowd, the human eye is incredibly picky, and getting realistic crowd behaviors can be a real challenge. And it doesn't really matter if it's a crowd of people, a flock of birds, a swarm of ants or an assemblage of anthropomorphic aliens; viewers will zero in on the tiniest of errors: any movement that doesn't seem natural, a character that doesn't seem to walk with enough weight or a group whose movements are just a little too uniform.
"The interesting thing that we've noticed in crowd animations and simulations is that you can have 300-400 different characters of different sizes and proportions, moving completely differently, but if you have one character that is standing there and not moving, the eye immediately goes to it," said Jason Barlow, lead character animator, RIOT Santa Monica. "It's the strangest thing. And projected on film, at a larger resolution than what we see on our monitor in the office, it's three times worse."
Of course, if you need a giant battle scene, there's always the option of hiring hundreds of extras, but that can get expensive, and when prosthetics and costumes are required, it can become almost impossible.
Another technique is to shoot smaller crowds and composite them together, or augment them with digital extras. For the upcoming Atom Egoyan film, Ararat, Toronto-based Mr. X was called on to create an army by replicating filmed extras and adding computer-generated soldiers. "We turned 200 extras in full Turkish regalia into an army of 5,000. It was a very technical setup. We shot the landscape on the large-format film stock VistaVision, the same technique used in the beginning of Gladiator, shot the soldiers in quadrants, moved them around and composited them together with our digitally created combatants," explained Mr. X president Dennis Berardi.
Particle systems are another common technique. They are great at flocking-type behaviors, but interaction between different types of creatures with different types of behaviors, and battle scenes in particular, can be a real challenge.
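The flocking-type behaviors that particle systems handle well can be sketched with Reynolds-style "boids" rules: each particle steers by separation, alignment and cohesion relative to its neighbors. The following is a minimal illustrative sketch, not any production system's code; the class names, weights and radii are all assumptions chosen for readability.

```python
import math

class Boid:
    """One particle in the flock: a 2D position and velocity."""
    def __init__(self, x, y, vx, vy):
        self.pos = [x, y]
        self.vel = [vx, vy]

def step(boids, sep_r=2.0, neigh_r=8.0, max_speed=1.5):
    """Advance the flock one frame using the three classic boids rules."""
    new_vels = []
    for b in boids:
        sep = [0.0, 0.0]; align = [0.0, 0.0]; coh = [0.0, 0.0]; n = 0
        for o in boids:
            if o is b:
                continue
            dx = o.pos[0] - b.pos[0]; dy = o.pos[1] - b.pos[1]
            d = math.hypot(dx, dy)
            if d < neigh_r:
                n += 1
                align[0] += o.vel[0]; align[1] += o.vel[1]   # match neighbor headings
                coh[0] += o.pos[0]; coh[1] += o.pos[1]       # drift toward neighbor center
                if 0 < d < sep_r:                            # push away from crowding
                    sep[0] -= dx / d; sep[1] -= dy / d
        vx, vy = b.vel
        if n:
            vx += 0.05 * (align[0] / n - vx) + 0.01 * (coh[0] / n - b.pos[0]) + 0.1 * sep[0]
            vy += 0.05 * (align[1] / n - vy) + 0.01 * (coh[1] / n - b.pos[1]) + 0.1 * sep[1]
        speed = math.hypot(vx, vy)
        if speed > max_speed:                                # clamp for stability
            vx, vy = vx / speed * max_speed, vy / speed * max_speed
        new_vels.append((vx, vy))
    for b, (vx, vy) in zip(boids, new_vels):                 # apply all updates at once
        b.vel = [vx, vy]
        b.pos[0] += vx; b.pos[1] += vy
```

The O(n²) neighbor search is fine for a demo; a real crowd pipeline would use spatial hashing, and, as the article notes, rules like these say nothing about how two opposing armies should fight each other.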
However, in the video game world, the idea of an army of humans battling an army of aliens isn't all that daunting. What's more, for games, it always has to be done in real time. Hence, there is increasing interest in the use of video game-like artificial intelligence tools to control crowd behavior. These tools enable artists to run simulations in which characters behave autonomously, rather than being laboriously keyframed. The idea is that each character in a crowd can have its own 'brain': a set of basic instructions that tell the creature how to behave under different circumstances, drawing from a library of motions for walking, running, fighting, etc., and blending them together.
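A per-character 'brain' of this sort can be as simple as a handful of priority rules that pick a motion from a clip library, with a seeded per-agent personality so the crowd doesn't look too uniform. This is a hypothetical sketch, not how Massive or Armada is actually implemented; every name and threshold here is invented for illustration.

```python
import random

# Hypothetical motion library: clip names only. A production system would
# map each name to motion-capture data.
MOTION_LIBRARY = ["idle", "walk", "run", "attack", "flee"]

class Brain:
    """A tiny rule-based brain for one crowd agent (illustrative only)."""
    def __init__(self, seed):
        rng = random.Random(seed)
        # Per-agent personality: varies behavior so agents don't move in lockstep.
        self.aggression = rng.uniform(0.2, 0.9)
        self.state = "idle"

    def decide(self, enemy_dist, ally_count):
        """Pick a motion clip from simple priority rules over what the agent perceives."""
        if enemy_dist < 2.0:
            # In melee range: fight or panic, depending on temperament.
            self.state = "attack" if self.aggression > 0.4 else "flee"
        elif enemy_dist < 15.0:
            # Enemy visible: bold agents (or well-supported ones) close the distance.
            self.state = "run" if (self.aggression > 0.6 or ally_count >= 3) else "walk"
        else:
            self.state = "idle"
        return self.state
```

Each frame, the simulation would call `decide` for every agent and feed the chosen clip into the motion-blending stage; real systems layer far more rules (terrain, formations, fear) on the same principle.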
Wellington, NZ-based Weta is largely seen as a pioneer in this approach to generating crowds for The Lord of the Rings trilogy. To tackle the huge battle scenes required for the films, the company began development more than four years ago of its own proprietary system, Massive, which creator Stephen Regelous calls "a tool for the creation of artificial ecologies."
Each creature is programmed with a range of behaviors which draw from a huge database of motion capture data. A motion-blending engine within Massive is used to merge motions together.
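The merging step can be pictured as a weighted interpolation between poses from two clips, ramping the weight over a transition window. The sketch below uses plain scalar joint angles to stay readable; a real motion-blending engine interpolates quaternions (slerp) and time-warps the clips, and nothing here reflects Massive's actual internals.

```python
def blend_pose(pose_a, pose_b, w):
    """Linearly blend two poses (dicts of joint name -> angle), w in [0, 1].
    w=0 returns pose_a, w=1 returns pose_b."""
    return {j: (1.0 - w) * pose_a[j] + w * pose_b[j] for j in pose_a}

def crossfade(clip_a, clip_b, frames):
    """Fade from clip_a into clip_b over `frames` frames.
    Both clips (lists of poses) must be at least `frames` long."""
    out = []
    for i in range(frames):
        w = (i + 1) / frames          # weight ramps toward the incoming clip
        out.append(blend_pose(clip_a[i], clip_b[i], w))
    return out
```

For example, crossfading a walk clip into a run clip over a few frames keeps the knees and hips moving continuously instead of popping from one clip to the next.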
"What we are seeing is pretty realistic looking battle action, which I don't think we would have achieved if we had taken a particle approach," said Regelous. "So building the system up from the ground is what we needed to be able to do to tackle those problems."
But Jon Labrie, who served as chief technology officer at Weta until recently, cautioned that, "We still have a ways to go. It turns out that it's at least as difficult to control an intelligent autonomous agent as it is to control a real actor. They don't necessarily do what you want them to do. So tools for choreography are just as important as anything else. And to some extent, the tools that you use for choreography are the same tools that you use for particles. So you find yourself falling back on some of the more traditional methodologies.
"Of course what you're hoping for, and what we're seeing to a large extent, is emergent behavior. But particle is inexpensive computationally and otherwise. And because particles are very directable, I think they will remain compelling for a long time to come," he added.
A similar system, called Armada, developed in conjunction with Softimage Special Projects, is in use at RIOT Santa Monica. RIOT first began to grapple with the problems of behavioral animation when the company was confronted with a large battle scene for the climax of The One.
"It was going to require a large prison compound set on another planet, which meant a vast scale," said Barlow. "At the same time it also meant an environment which really couldn't be realized through an actual set on a stage somewhere. They had originally planned to use a rock quarry up north of Los Angeles, but we immediately saw a lot of problems in terms of how do you control the elements like wind, a lot of dirt, etc., in terms of putting in tracking markers."
At that point, Barlow came across Softimage RTK (runtime tool kit), which is based on the Intelligent Digital Actor technology from the Motion Factory, which Avid acquired in 2000. Originally developed as an engine for controlling the behavior of characters in video games, the technology was one the company was beginning to explore for use on feature films.
"Previously in game engines you weren't able to run a very detailed character. It had to be a very simple polygon moving across the screen," said Barlow. "What we saw from Avid and Softimage really caught my eye. They came in with a laptop and presented this gaming engine, which runs in real time. And I'm like 'that's great, but let's talk about film technology here. If I can have a simulation as a previsualization tool, as well as a tool that will give me a string of numbers or output an animation file, now we're talking.'"
RIOT then enlisted the aid of Softimage Special Projects to build Armada on top of the RTK engine. Armada has since been used to generate crowds for The One, The Scorpion King and Queen of the Damned, as well as a Benadryl spot. While Armada is a proprietary in-house pipeline at RIOT, Softimage will be unveiling the underlying tool set at Siggraph as part of Softimage XSI version 3.0. The packaging (and the name of the system) had not been determined at press time.
"Behavioral animation is becoming much more important. Hiring thousands and thousands of extras is just no longer economically viable, and it's a lot easier to manage in a computer than it is to manage a bunch of people on a huge set. And compositing bits of crowds that you've filmed is not really practical either anymore," said Michael Smith, technical product manager, XSI.
Smith explained that RIOT had served as a beta site for the system. Piedmont, Quebec-based Hybride Technologies is another beta site, and used the RTK engine to create digital armies for the upcoming miniseries, Napoleon.
Differentiating Armada from the crowd animation tool set that Softimage will release at Siggraph, Smith explained, "What we gave RIOT was the real-time library, which is called RTK, and the tools to integrate that library into their pipeline: .xsi file formats, file I/O libraries, some SDK tools, scripts and that kind of thing.
"RIOT was able to go ahead and build that into their particular flavor that they call Armada. We've been taking the feedback and building that into the artist's tool. This will be our first iteration at Siggraph," Smith added.