
Extended examples

Setting this into an emotion-theoretic context, let us suppose that Steve has the following three goals:

Goal: (serendipity) Steve wants the user to be familiar with where the
dipstick is and how to remove it.

Simulator messages meet these constraints: (a) Show that the
user is interacting with the system. (b) Show that the dipstick has
been pulled out.

Certainty: 100%, based on direct check of ``hard'' system events (see
discussion of emotion intensity variables, below).

Goal: (preservation) Steve does not want the user to quit the tutoring
session without being familiar with where the dipstick is and how to remove it.

Simulator messages meet these constraints: (a) User is quitting a
tutoring session. (b) Simulator message history shows that the user
has not removed the dipstick.

Certainty: 100%, based on history of actions user has performed
during the tutoring session.
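Both of the checks above read directly from the simulator message history, which is why their certainty defaults to 100%. A minimal sketch of such hard-event checks, assuming a hypothetical message-log format and event names (these are illustrative, not Steve's actual protocol):

```python
# Hedged sketch: evaluating the two goals above against a log of "hard"
# simulator events. The event names ("DIPSTICK_REMOVED", "SESSION_QUIT")
# and the dict-based log format are assumptions made for illustration.

def serendipity_goal_met(log):
    """Goal 1 succeeds if the user (not Steve) has removed the dipstick."""
    return any(m["event"] == "DIPSTICK_REMOVED" and m["actor"] == "user"
               for m in log)

def preservation_goal_violated(log):
    """Goal 2 is violated if the user quits without ever removing the dipstick."""
    quitting = any(m["event"] == "SESSION_QUIT" for m in log)
    return quitting and not serendipity_goal_met(log)

# Because both predicates inspect the message history directly,
# their certainty is fixed at 100% (the default value).
log = [
    {"event": "DIPSTICK_REMOVED", "actor": "user"},
    {"event": "SESSION_QUIT", "actor": "user"},
]
print(serendipity_goal_met(log))        # True: the user pulled the dipstick
print(preservation_goal_violated(log))  # False: goal 1 was met before quitting
```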

Goal: (bi-valenced) The user should check the oil level before starting the compressor.

Simulator messages meet the first constraint, and some
configuration of the remaining constraints: (a) Show that the user has
started the compressor. (b) Message history shows an instance of one of the
following situations: (1) The dipstick is pulled out, and Steve did not do
it. (2) The dipstick is in the user's field of vision when Steve pulls out,
and replaces, the dipstick. (3) It is reasonable to infer that the user was
looking at the dipstick.

Certainty: calculated as a function of context and of perceived
sensing on the part of the user.
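One way to read this certainty calculation is as a graded belief over the evidence categories (b)(1)-(3) above. The sketch below assumes this interpretation; the specific weights are illustrative assumptions, not values from Steve:

```python
# Hedged sketch of the third goal's certainty computation. The three
# boolean inputs mirror situations (b)(1)-(3) in the text; the numeric
# belief levels are invented for illustration.

def checked_oil_certainty(user_pulled_dipstick,
                          dipstick_in_view_during_demo,
                          user_inferred_looking):
    """Return Steve's belief (0.0-1.0) that the user checked the oil level."""
    if user_pulled_dipstick:
        # Strongest evidence: the user manipulated the dipstick directly.
        return 0.9
    if dipstick_in_view_during_demo and user_inferred_looking:
        # Perception-based evidence: Steve demonstrated, and the user
        # was plausibly watching.
        return 0.6
    if dipstick_in_view_during_demo:
        # The dipstick was in view, but attention cannot be inferred.
        return 0.4
    return 0.0
```

Unlike the first two goals, no single value of certainty is forced by the message history; it degrades as the evidence becomes more indirect.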

The first goal above is a serendipity goal, wherein Steve is happy if the situation comes about but not distressed if it does not. The second is a preservation goal, wherein Steve may be distressed if the situation comes about, but is not happy if it does not. In both cases certainty is 100% (the default value), based on a direct examination of the messages reflecting user interactions with the ``hard'' virtual world.

With the third (bi-valenced) goal Steve may be happy if the user is perceived to have checked the dipstick before starting the compressor, but distressed if the user is perceived to have started the compressor without checking the oil level. In this case certainty will depend on the level of belief Steve has about whether or not the user has sensed the oil level on the dipstick.
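The three goal types thus differ only in which outcomes carry valence, with certainty feeding in as an intensity variable. A sketch of this mapping, using the happy/distressed vocabulary of the text (the function name and intensity formula are assumptions for illustration):

```python
# Hedged sketch: mapping a goal type, an outcome, and a certainty value
# to an elicited emotion, per the descriptions in the text. Using raw
# certainty as the intensity is an illustrative simplification.

def elicit(goal_type, situation_obtains, certainty):
    """Return (emotion, intensity) for a goal outcome, or None if neutral."""
    if goal_type == "serendipity":
        # Happy if the situation comes about; indifferent otherwise.
        return ("joy", certainty) if situation_obtains else None
    if goal_type == "preservation":
        # Distressed if the situation comes about; indifferent otherwise.
        return ("distress", certainty) if situation_obtains else None
    if goal_type == "bi-valenced":
        # Valenced either way: joy on success, distress on failure.
        if situation_obtains:
            return ("joy", certainty)
        return ("distress", certainty)
    return None
```

For the first two goals the intensity is always full strength (certainty 1.0); for the bi-valenced goal it scales with Steve's belief about whether the user sensed the oil level.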

Clark Elliott
Mon Mar 10 19:53:21 EST 1997