Game Development Reference
This creates two critical concerns for AI programmers. The first concern is
dealing with their game designer. Much like Dr. Frankenstein, a game designer
who demands complete and total control over AI behavior will not gracefully
deal with an AI gifted with all of the controllability of a herd of cats. If the
designer's task is to achieve a very specific entertainment experience, he or she
may not be able to realize it with these methods. Designers with more leeway
bring a more daunting challenge to the AI programmer, and this is the AI
programmer's second concern. When this more flexible designer says, "That concept
is too cool to leave out. Put it in and we'll design around it if we have to," the AI
programmer is committed to making it happen. This second concern cannot be
overemphasized. Any novel application seeking emergent behavior is a high-risk
endeavor. Early prototyping and proof-of-concept work is mandatory. Early
winemakers knew that grape juice usually turned itself into wine if they left it
alone, but they also knew that sometimes it just went bad.
Fortunately, there are some guiding principles worth examining when there is no
recipe. Start with the interactions of simple behaviors, searching for the
potentially narrow zone between no results and an unstable system. As part of the
search, you may need to carefully explore the interactions not only for balance
but also for the right timing.
New systems that resemble existing systems are likely to show similar emergent
behavior. Tanks and birds have substantial differences, but tank platoons and
bird flocks can benefit from very similar code [VanVerth00]. Steering behaviors
for groups of individuals are the poster child for emergent behavior. Besides
keeping a group in formation, steering behaviors also excel at obstacle avoidance.
Variations on this theme rarely destroy the desired emergent behavior. Failures
in behavior are possible, but they tend to be moderately benign and reasonable.
Car drivers caught in exit-only lanes are forced to leave the freeway when they do
not want to. Game AI that makes mistakes that leave the player thinking, "That
could have happened to me ..." is better regarded than AI that makes more
unfathomable errors. Not all failures are benign; agents can get stuck, run in
circles, or even run into walls.
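The group steering behaviors described above can be sketched as a weighted blend of the three classic components (separation, alignment, cohesion). This is a minimal illustration, not an implementation from the text; the class name, function signature, and weight values are all assumptions chosen for clarity, and tuning those weights is exactly the search for the narrow zone between no results and an unstable system:

```python
class Boid:
    """Hypothetical agent with 2D position and velocity."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

def steer(boid, neighbors, w_sep=1.5, w_ali=1.0, w_coh=1.0):
    """Return a (dx, dy) steering vector blending three behaviors.

    The weights are illustrative; in practice they are tuned by
    experiment, since small changes can tip the flock from inert
    to oscillating.
    """
    if not neighbors:
        return (0.0, 0.0)
    n = len(neighbors)
    # Separation: push away from the average neighbor position.
    sep_x = sum(boid.x - nb.x for nb in neighbors) / n
    sep_y = sum(boid.y - nb.y for nb in neighbors) / n
    # Alignment: steer toward the neighbors' average velocity.
    ali_x = sum(nb.vx for nb in neighbors) / n - boid.vx
    ali_y = sum(nb.vy for nb in neighbors) / n - boid.vy
    # Cohesion: drift toward the neighbors' center of mass.
    coh_x = sum(nb.x for nb in neighbors) / n - boid.x
    coh_y = sum(nb.y for nb in neighbors) / n - boid.y
    return (w_sep * sep_x + w_ali * ali_x + w_coh * coh_x,
            w_sep * sep_y + w_ali * ali_y + w_coh * coh_y)
```

Nothing in this sketch mentions flocks explicitly; formation-keeping, and much of the obstacle avoidance, emerges from the interaction of the three simple rules, which is why variations on the theme survive so well.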
The problem of getting good emergent behavior is harder when the issue at hand
does not relate to movement. In computer science, a classic method of attack is to