Jack Breese and Gene Ball
We describe a framework for constructing a model of emotions and personality for a computational agent. The architecture uses dynamic models of emotions and personality, encoded as Bayesian networks, to 1) diagnose the emotions and personality of the user, and 2) generate appropriate behavior by an automated agent. Classes of interaction that are interpreted and/or generated include choice of wording, characteristics of speech (speed and pitch), gesture, and facial expression. In particular, we describe the structure of the dynamic Bayesian networks (DBNs) that form the basis for interpretation and generation, and address assessment and calibration of their static and dynamic components.
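To make the diagnostic direction concrete, the sketch below shows a toy Bayesian network in the spirit of the abstract: a hidden emotional-state variable (here, a single "valence" node) explains observable behavioral cues such as choice of wording and speech speed, and Bayesian inversion recovers a posterior over the hidden state from the cues. The variable names, cue categories, and all probability values are illustrative assumptions, not the paper's actual model, which also includes personality variables and temporal (dynamic) links.

```python
# Toy diagnostic Bayesian network (illustrative only, not the paper's model):
# a hidden emotional-valence variable with two observable cue channels.

# Prior over the hidden valence state (assumed values).
PRIOR = {"negative": 0.3, "neutral": 0.4, "positive": 0.3}

# Conditional probability tables P(cue value | valence) for each
# observable channel; all numbers are hypothetical.
CPT = {
    "wording": {
        "negative": {"terse": 0.6, "neutral": 0.3, "warm": 0.1},
        "neutral":  {"terse": 0.2, "neutral": 0.6, "warm": 0.2},
        "positive": {"terse": 0.1, "neutral": 0.3, "warm": 0.6},
    },
    "speed": {
        "negative": {"fast": 0.5, "normal": 0.4, "slow": 0.1},
        "neutral":  {"fast": 0.2, "normal": 0.6, "slow": 0.2},
        "positive": {"fast": 0.3, "normal": 0.5, "slow": 0.2},
    },
}

def diagnose(observations):
    """Posterior over valence given observed cues.

    The cues are conditionally independent given valence, so the
    update multiplies the prior by each channel's likelihood and
    renormalizes (standard Bayesian-network inference for this
    simple two-layer structure).
    """
    posterior = dict(PRIOR)
    for cue, value in observations.items():
        for state in posterior:
            posterior[state] *= CPT[cue][state][value]
    z = sum(posterior.values())
    return {state: p / z for state, p in posterior.items()}

# Terse wording plus fast speech shifts belief toward negative valence.
posterior = diagnose({"wording": "terse", "speed": "fast"})
```

The generative direction described in the abstract runs the same kind of network the other way: given the agent's intended emotional state, sample or select cue values (wording, speech speed and pitch, gesture, facial expression) from the conditional distributions.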