Next: AR personality components for Up: Affective Reasoner personality models Previous: The Affective Reasoning Platform

The argument for adding complexity to the interface

Because adding complexity to a system has inherent costs (e.g., bugs, reduced efficiency, reduced comprehensibility, harder upgrades and maintenance), and because added complexity in the user interface is particularly suspect, there must be a clear payoff for these known costs.

The general motivation for this work is that effective teachers have many techniques for engaging students in pedagogical activities, assessing their participation level, varying presentation techniques, presenting information in the form of stories, and so forth. Through the use of socially and emotionally intelligent reasoning components, it may be possible for automated tutoring systems to make use of a subset of these techniques formerly associated only with human teachers. Among other issues, this touches on the following ideas:

  1. Tutoring systems should maximally engage the user. Believable-agents research, in which agent personality and social responsiveness have long been acknowledged goals, has maintained that engaging the user is a big win, and one plausible way to do this is by interacting with users' natural social tendencies. In addition, systems that can understand something about the user's affective state can make better ``listeners'' in the broad sense, a useful tool in engagement. The clear expression of even minimal understanding is in itself an end, even if no action is taken on the basis of this knowledge, and as long as the real limitations of the agent are clear.
  2. The agent should foster enthusiasm in the subject domain.

    In a collaborative environment this would seem difficult to achieve if the cooperating autonomous agent were not itself believably enthusiastic about the subject. Enthusiasm is tied to human emotion, and is best represented through structures that model the emotions that precede it.

  3. Which social-pedagogical teaching techniques can be translated into the automated tutoring paradigm? Successful human teachers use an array of techniques, many applicable across varied domains, and some of these may well translate into the automated tutoring paradigm. Applications that allow us to explore, and measure, this will be useful to the field as a whole.
  4. Motivation is a key ingredient in learning. Emotion plays an important role in motivation. A computer tutor that is sensitive to a student's feelings of, e.g., pride and frustration, and appears to care about the student's progress, is more likely to motivate that student.
  5. Acknowledgment of progress. Agents capable of being enthusiastic about a student's progress in a domain may help give the student the impression that they really do care about how well the student is progressing on the tasks at hand. Simple acknowledgment of domain tasks achieved, and the perceived tutor emotional responses of joy, pride in the student, and so forth, may well create an environment of collaboration that fosters enthusiasm for the subject itself.
  6. Affective User Modeling. That a faithful affective user modeling capability would be useful is almost a truism, and needs little discussion; the form it would take is worth examining. Many common teaching guidelines, such as ``Ascertain where the students are in the domain, and start there,'' have long been considered dependent on solving the very difficult problems of general user modeling. In our work we instead proceed from the premise that progress on the much smaller problem of building a sophisticated, dynamic representation of the user's affective state will allow us to make inferences that obviate many of the larger, domain-knowledge-intensive issues yet to be successfully addressed by AI. Additionally, students may well be motivated to explain how they feel to a tutoring agent that shows some understanding of what they are saying.
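As one minimal sketch of what a ``dynamic representation of the user's affective state'' (item 6) might look like in code: the emotion names, intensity scale, and decay scheme below are illustrative assumptions on our part, not the Affective Reasoner's actual design, which uses a richer OCC-derived category set.

```python
from dataclasses import dataclass, field

# Hypothetical emotion labels for illustration only; the Affective
# Reasoner's real category set is larger and OCC-based.
EMOTIONS = ("joy", "distress", "pride", "frustration", "hope", "fear")

@dataclass
class AffectiveUserModel:
    """A toy dynamic record of a student's inferred emotional state."""
    # Intensity in [0, 1] per emotion, starting at neutral (0.0).
    intensities: dict = field(
        default_factory=lambda: {e: 0.0 for e in EMOTIONS})
    decay: float = 0.9  # assumed per-step decay toward neutrality

    def observe(self, emotion: str, intensity: float) -> None:
        """Blend new evidence (e.g., from dialogue) into the state."""
        self.intensities[emotion] = min(
            1.0, self.intensities[emotion] + intensity)

    def step(self) -> None:
        """Decay all intensities as time passes without new evidence."""
        for e in self.intensities:
            self.intensities[e] *= self.decay

    def dominant(self) -> str:
        """The emotion a tutoring agent would currently respond to."""
        return max(self.intensities, key=self.intensities.get)

# Example: the tutor infers frustration and mild pride, then time passes.
model = AffectiveUserModel()
model.observe("frustration", 0.6)
model.observe("pride", 0.2)
model.step()
print(model.dominant())  # frustration remains dominant after decay
```

Even a representation this small supports the inferences discussed above, such as choosing whether to acknowledge progress or address frustration, without solving general user modeling.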



Clark Elliott
Wed Dec 17 18:41:50 EST 1997