DePaul researcher programs emotions into computer

BY HOWARD WOLINSKY
Technology Reporter

Photo of Clark and Sam

When Clark Elliott's computer is down, it's really down.

When it's up, it's happy as a lark.

Elliott, a computer-science researcher at the Institute for Applied Artificial Intelligence at DePaul University, is pioneering ``emotionally intelligent agents,'' user-friendly programs aimed at teaching computers to recognize, understand and respond to human emotions.

What he has in mind is a pal in a Pentium, rather than the murderous HAL computer in the movie ``2001.''

Traditionally, computers have been viewed as cold, dispassionate hunks of hardware.

But in Elliott's world, computer agents express two dozen emotions and hundreds of shadings, a full spectrum of love, hate, fear, anger, hope, pride, shame and gloating. His model also takes into account the intensity of emotions and moods.

Over the last seven years, Elliott has created the Affective Reasoner, a group of artificial-intelligence programs that cannot understand natural language but can reason about human emotions.

Under the Affective Reasoner, computer agents have individual personalities with dispositions that control their world view and a temperament that controls how they express their emotions.
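In code, that split might look something like the sketch below: a disposition holding the goals and principles an agent cares about, and a temperament controlling how strongly it shows what it feels. The class names, fields and values are hypothetical, chosen only to illustrate the idea described in the article; they are not taken from Elliott's system.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the disposition/temperament split described above.
# Names, fields and values are illustrative, not Elliott's actual code.

@dataclass
class Disposition:
    """How the agent interprets the world: goals and principles,
    each weighted by importance (0 to 1)."""
    goals: dict = field(default_factory=dict)        # e.g. {"keep_sandwich": 0.3}
    principles: dict = field(default_factory=dict)   # e.g. {"no_stealing": 0.9}

@dataclass
class Temperament:
    """How the agent expresses the emotions its disposition produces."""
    expressiveness: float = 0.5                      # 0 = deadpan, 1 = demonstrative
    channels: tuple = ("face", "speech", "music")

@dataclass
class Agent:
    name: str
    disposition: Disposition
    temperament: Temperament
    mood: float = 0.0          # slowly drifting background state that biases appraisals

sam = Agent("Sam",
            Disposition(goals={"keep_sandwich": 0.3},
                        principles={"no_stealing": 0.9}),
            Temperament(expressiveness=0.7))
```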

The multimedia agents are ``embodied'' on computer monitors by animated cartoon faces--line drawings resembling smiley buttons.

Elliott tests his agents by conducting conversations with them and even having them talk to each other. The agents speak with mechanized voices through computer speakers and listen through microphones.

``The agents run in real time: If you talk to them, they talk back,'' he said.

The agents respond to emotional language and choose their own faces, inflections and even musical accompaniment. Elliott, a professional musician, says music conveys a great deal about emotion; a strain from the Beatles' ``Yesterday'' can represent sadness, while Beethoven's Fifth Symphony can stand in for anger.
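A rough Python sketch of that kind of output selection appears below. The two musical cues come from the article; the face file names, the intensity threshold and the mapping itself are invented for illustration.

```python
# Illustrative mapping from an appraised emotion and its intensity to
# expressive outputs. The musical cues come from the article; the face
# files, threshold and structure are invented for this sketch.

FACES = {"sadness": "downturned_mouth.png",
         "anger": "furrowed_brow.png",
         "joy": "smiley.png"}

MUSIC = {"sadness": "a strain from the Beatles' 'Yesterday'",
         "anger": "Beethoven's Fifth Symphony"}

def express(emotion: str, intensity: float) -> dict:
    """Pick a cartoon face and, for strong emotions, a musical cue."""
    output = {"face": FACES.get(emotion, "neutral.png")}
    if intensity > 0.6 and emotion in MUSIC:
        output["music"] = MUSIC[emotion]
    return output

print(express("anger", 0.8))
# {'face': 'furrowed_brow.png', 'music': "Beethoven's Fifth Symphony"}
```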

In one demonstration, the agents ``Sam'' and ``Joe'' react to each other. At one point, Joe senses that Sam is pleased by Joe's expression of sadness and says: ``You don't have to gloat about it!''

In a simulated ``bar'' scene, two agents representing Chicago Bulls fans are happy their team has ``killed'' and ``wiped the floor'' with the New York Knicks, while the Knicks fans grow distressed and angry and then decide they want to leave the bar.

Elliott said these demonstrations show how computer agents can be ``virtual actors'' for story-telling for education or entertainment.

Cognitive scientist Andrew Ortony, a leading theorist on emotions and computer modeling at Northwestern University, said Elliott's work is important because computers will become more useful if they can better understand the social and emotional components of the human world in which they operate.

Ortony, on whose models Elliott based his agents, stressed that computers will not themselves be able to experience emotion. ``No biological body, no emotions,'' he said, noting that emotions are a biochemical phenomenon.

Ortony compared emotionally intelligent agents to Mr. Spock, the Vulcan science officer on the sci-fi series ``Star Trek,'' who strives to understand the human emotions he can't experience.

A variety of applications are envisioned for emotional agents.

At the top of the list for researchers is adding emotional agents to tutoring programs, customizing them to students' different learning styles.

Computer scientist James Lester, director of the IntelliMedia Initiative at North Carolina State University, said his research has shown that students form a ``bond'' with their agents.

``The more expressive an agent is, the more that students find them credible and helpful,'' he said.

Researcher Jeff Rickel of the University of Southern California Information Sciences Institute said, ``Motivation is a key ingredient in learning, and emotion plays an important role in motivation. A computer tutor that is sensitive to a student's feelings of pride and frustration and appears to care about the student's progress is more likely to motivate the student.''

Elliott said emotional agents could be used to train health professionals to be more compassionate, to teach salesmen how to interact with different personality types and to make richer characters for use in computer-based games, story-telling and entertainment.

Ortony, a professor of computer science, education and psychology, said agents could be useful in testing psychological theories.

He said that truly empathic ``machine partners'' won't be available ``for 20 years at the least. It won't happen tomorrow.''

He said emotionally literate robots someday might join humans working in hostile surroundings, such as astronauts on Mars. ``It would help a machine to understand what happens to a person in those environments,'' he said.

* * * * * * * * * *

As part of his research, DePaul University computer scientist Clark Elliott has conversations with ``emotionally intelligent'' agents. The agents do not understand natural language but are programmed to understand and respond to emotions. The agent chooses its own cartoon face, response and accompanying music based on the emotion involved and its intensity.

Following is an edited dialogue between the researcher and one of his agents, ``Sam,'' with parenthetical annotations from Elliott.

Clark: I am really furious at Joe.

Sam: Boy, what a blameworthy thing he must have done! What do you call such an action?

(Sam believes fury to be an ANGER-type emotion, composed of a blameworthy action leading to a blocked goal. He tests this hypothesis by asking about an action.)

Clark: Joe was stealing.

Sam: When Joe was stealing, Joe must have caused you distress. What is the name of the thing that you wanted, or wanted to preserve?

(Having confirmed the action component, Sam now investigates the blocked goal.)

Clark: My sandwich.

Sam: How important was your sandwich to you, Clark?

(Sam believes FURY to be a relatively intense version of ANGER and now tries to account for that intensity. The importance of a goal is one of the intensity variables Sam uses in his reasoning.)

Sam: How surprising was it that Joe was stealing and caused you distress about your sandwich?

(Sam reasons that the more surprising an action, the more intense the resulting emotion will be.)

Clark: It was not surprising.

Sam: How bad was it that Joe was stealing?

(Sam reasons that the more blameworthy an action, the more intense the emotional reaction to it.)

Clark: It was not very bad, actually.

Sam: How important is your principle about stealing?

(Importance of a principle is another intensity cue that Sam uses.)

Clark: Quite important.

Sam: Hmm. I would think that you are only somewhat annoyed at Joe. I do not understand that you are furious at Joe. After all, what Joe did was not that bad. Your sandwich is not that important to you. It is not very surprising that Joe was stealing. Perhaps your principle about stealing is very, very important to you. Is this correct?

(Since most intensity variables have low values, Sam expects annoyance rather than fury. One possible explanation for this would be if the principle involved was very important to Clark.)

Clark: Yes.

Sam: I will remember that about you. (Later) . . . Clark, if I was stealing and it affected your sandwich, would you still be my friend?

(Sam has updated his data about Clark and is seeking further information about how their relationship would fare if they were at odds over a strongly held principle.)

--HOWARD WOLINSKY
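The dialogue above turns on a handful of intensity variables: how important the blocked goal was, how surprising and how bad the action was, and how important the violated principle is. The sketch below illustrates that kind of reasoning under invented weights and thresholds; the function names are hypothetical rather than taken from the Affective Reasoner.

```python
# Sketch of the intensity reasoning Sam performs in the dialogue above.
# ANGER-type emotions combine a blameworthy action with a blocked goal;
# intensity variables decide whether the result is annoyance or fury.
# The weights and thresholds below are invented for illustration.

def anger_intensity(goal_importance, surprisingness,
                    action_badness, principle_importance):
    """Combine intensity variables (each 0 to 1) into one score."""
    strongest_stake = max(goal_importance, action_badness, principle_importance)
    return 0.8 * strongest_stake + 0.2 * surprisingness

def label(intensity):
    if intensity < 0.4:
        return "annoyed"
    if intensity < 0.7:
        return "angry"
    return "furious"

# Clark's answers: the sandwich was unimportant, the theft unsurprising and
# not very bad, but the principle about stealing was very important.
score = anger_intensity(goal_importance=0.2, surprisingness=0.1,
                        action_badness=0.2, principle_importance=0.95)
print(label(score))   # furious -- the strongly held principle alone explains it
```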
