Can Robots Be Emotional? Meet AIBO

Story posted October 10, 2002

He's made of metal and flashing lights, and he whirrs when he walks, but AIBO, a robotic dog, had the audience oohing and aahing as though he were a real puppy at the Faculty Seminar on Wednesday.

Eric Chown, assistant professor of computer science, had come to talk about emotional computers and artificial intelligence, but first, introductions were in order. Chown told the audience that AIBO was a little angry because he wanted attention. (He wasn't going to get it, since Chown had to talk.) Chown advised, however, that if AIBO happened to walk by a table during the talk, the appropriate response would be to stroke his ear backward, which he likes.

Chown has been studying how emotion affects cognition (and how cognition affects emotion), and how that comes into play when working with artificial intelligence. Chown is among the AI researchers who see human intelligence as a useful framework in which to examine artificial intelligence, so it's not surprising that the issue of emotion has come up.

The popular consciousness pits emotion against intellect. (Witness the character Data from Star Trek: The Next Generation, Chown said. Data is supremely intelligent, but finds emotions a mystery.) Many people, including two leading computer scientists Chown recently heard at a conference, believe that emotions cloud our grasp of reality and hamper decision making, and that robots could never be emotional. Chown has a different point of view.

"Humans rely on information because we don't have sharp claws and big teeth," he said.

Eventually, if computers are to go where humans can't go and do things humans can't, or don't want to, do, they will need to behave realistically. And they will need to react, to some extent, as humans would.

"Emotions are an essential part of what makes us, us," Chown said.

In situations when we might be in danger, we need to be able to make split-second decisions. "One important way in which we use our intelligence is emotions," Chown said. "Rational thought is not an option when encountering a lion."

There are three important questions humans must ask when sizing up a situation: How important is the situation (to develop priorities)? Is it good or bad for me? Can I handle it adequately?

"My thesis is, the emotions system provides fast answers to all three of those," Chown said.

Chown divides the emotions into three basic categories: arousal, pleasure/pain, and clarity/confusion.

Arousal, the level of excitement or agitation one feels, indicates how important a situation is. Pleasure or pain indicates how good or bad a situation is. And the level of clarity or confusion predicts how competent one is likely to be in handling the situation.

The most basic level of emotion is arousal, and even simple organisms have arousal systems. The level of arousal determines how much attention we pay to something. (To get an idea of what arouses people, Chown said, just watch television - bright colors, movement, people, loud noises all result in arousal.)

Once someone is aroused, the sensation of pleasure or pain gives more information and enhances the ability to evaluate the situation. For example, pain usually signifies damage, so feeling pain when you put your hand on the stove will cause you to pull it away.

Clarity and confusion are even more sophisticated than pleasure and pain; they have to do with whether one's internal model matches or contradicts external reality.

To make a computer emotional, Chown said, it needs to be able to categorize input (as arousing, painful, and so on), to index how arousal relates to knowledge (when arousal is high, the knowledge brought to bear is restricted), and to act in ways that maximize pleasure and minimize pain.
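To make the idea concrete, here is a minimal sketch in Python of how such a system might look. The class, function names, and numeric scales are illustrative assumptions, not Chown's or Sony's actual code: it keeps the three dimensions as a single state, appraises each stimulus along them, and restricts the actions it considers when arousal is high.

    # A minimal, hypothetical sketch of the three-dimension model described above.
    # Names and numbers are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class EmotionalState:
        arousal: float = 0.0    # how important is the situation?
        pleasure: float = 0.0   # is it good or bad for me? (negative = pain)
        clarity: float = 0.0    # does my model match reality? (negative = confusion)

    def appraise(state: EmotionalState, stimulus: dict) -> EmotionalState:
        """Categorize a stimulus along the three dimensions (hypothetical scoring)."""
        state.arousal = min(1.0, state.arousal + stimulus.get("salience", 0.0))
        state.pleasure += stimulus.get("valence", 0.0)        # pleasure/pain
        state.clarity += stimulus.get("model_match", 0.0)     # clarity/confusion
        return state

    def choose_action(state: EmotionalState, actions: dict) -> str:
        """Pick the action with the highest expected pleasure; high arousal
        restricts the options considered to fast, reflex-like ones."""
        if state.arousal > 0.8:
            actions = {name: value for name, value in actions.items()
                       if name.startswith("reflex_")}
        return max(actions, key=actions.get)

    # Example: a loud noise is highly salient and mildly unpleasant.
    state = appraise(EmotionalState(), {"salience": 0.9, "valence": -0.2, "model_match": -0.1})
    print(choose_action(state, {"reflex_flee": 0.7, "deliberate_investigate": 0.9}))

Run on that loud, mildly unpleasant stimulus, the sketch picks the reflexive option over the deliberate one: "rational thought is not an option when encountering a lion," in miniature.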

When AIBO hears his name or sees his pink rubber ball, he is aroused.

"He thinks his ball is pretty cool," Chown said.

AIBO experiences pleasure when his ear is rubbed backward or his whiskers are stroked. He experiences pain when his ear is rubbed forward. He can recognize expressions such as "good boy" and "bad dog," which serve to clarify or confuse. (Chown said that this is an approximation, since Sony won't actually tell him whether there is truly a state of clarity or confusion that AIBO can experience. But when Chown and his student collaborators reprogram AIBO, there will be.)
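Read against the sketch above, the stimuli from the demonstration might map onto the three dimensions roughly as follows; the numbers are invented for illustration, since Sony does not publish AIBO's internal values.

    # Hypothetical mapping of the AIBO stimuli mentioned above onto the three
    # dimensions used in the earlier sketch; values are illustrative only.
    AIBO_STIMULI = {
        "ear_rubbed_backward": {"salience": 0.3, "valence": +0.5, "model_match": 0.0},
        "whiskers_stroked":    {"salience": 0.3, "valence": +0.4, "model_match": 0.0},
        "ear_rubbed_forward":  {"salience": 0.4, "valence": -0.5, "model_match": 0.0},
        "hears_good_boy":      {"salience": 0.2, "valence": +0.1, "model_match": +0.5},
        "hears_bad_dog":       {"salience": 0.2, "valence": -0.1, "model_match": -0.5},
        "sees_pink_ball":      {"salience": 0.8, "valence": +0.3, "model_match": 0.0},
    }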

"Does this AIBO have emotions?" Chown asked. "Well, my argument is he's doing the exact same thing that we do."

AIBO interprets stimuli, and those interpretations affect his behavior in ways comparable to how ours affect us. His experiences allow him to further refine his emotional responses, and his behavior is hard to predict, just as it would be with any emotional creature.

Of course, no matter how good AIBO, or any other computer, is at interpreting stimuli, he will only be as good as his sensors. Developing better sensors (such as lasers and cameras) is a huge part of AI research. One of humans' most distinctive abilities is making sense of what we hear and see. "That's the hardest thing that we do," Chown said, "recognizing people and objects."

Regardless of how emotional AIBO is, it was clear he struck an emotional chord with his audience. Many people gathered around to watch him and pet him after the seminar.

"I'm a computer scientist," Chown said, "and I have a hard time doing what I just did - ignoring him."
