How Actors Can Animate Game Characters: Integrating Performance Theory in the Emotion Model of a Game Character

Authors

  • Sheldon Schiffer, Georgia State University

DOI:

https://doi.org/10.1609/aiide.v15i1.5252

Abstract

Despite the development of sophisticated emotion models, game character facial animation is still often completed with laborious hand-controlled keyframing, only marginally assisted by automation. Behavior trees and animation state machines are used mostly to manage animation transitions for physical business, such as walking or lifting objects. Attempts at automated facial animation rely on discrete, iconic facial emotion poses, resulting in mechanical “acting.” The techniques of acting instructor and theorist Sanford Meisner reveal a process of role development whose character model resembles components of Appraisal Theory. This similarity motivates an experiment to discover whether an “emotion engine” and workflow method can model the emotions of an autonomous animated character using actor-centric techniques. Success would allow an animation director to animate a character’s face autonomously, with the subtlety of gradient expressions. Using a head-shot video stream of one of two actors performing a structured Meisner-esque improvisation as the primary data source, this research demonstrates the viability of an actor-centric workflow for creating an autonomous facial animation system.
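
To make the abstract’s contrast between discrete iconic poses and gradient expressions concrete, the sketch below is a minimal illustration, not the paper’s implementation: all names (AppraisalEvent, EmotionState, BLENDSHAPE_MAP) are hypothetical, and the appraisal dimensions are deliberately simplified. It shows one way an appraisal-style emotion state could accumulate and decay continuous intensities and blend them into graded blendshape weights instead of snapping the face to a single pose.

```python
# Hypothetical sketch of an appraisal-driven "gradient expression" blend.
# Not drawn from the paper; names and mappings are illustrative assumptions.

from dataclasses import dataclass, field

# Assumed mapping from emotion categories to facial blendshape channels.
BLENDSHAPE_MAP = {
    "joy":      {"mouth_smile": 1.0, "cheek_raise": 0.6},
    "surprise": {"brow_raise": 1.0, "jaw_open": 0.4},
    "anger":    {"brow_lower": 1.0, "lip_press": 0.7},
}


@dataclass
class AppraisalEvent:
    """A stimulus appraised into an emotion category with a graded intensity."""
    emotion: str      # e.g. "joy"
    intensity: float  # 0..1, strength of the appraisal


@dataclass
class EmotionState:
    """Continuous emotion intensities, raised by appraisals, decayed per frame."""
    intensities: dict = field(
        default_factory=lambda: {e: 0.0 for e in BLENDSHAPE_MAP}
    )
    decay: float = 0.95  # per-frame relaxation toward neutral

    def appraise(self, event: AppraisalEvent) -> None:
        current = self.intensities.get(event.emotion, 0.0)
        self.intensities[event.emotion] = min(1.0, current + event.intensity)

    def tick(self) -> None:
        for emotion in self.intensities:
            self.intensities[emotion] *= self.decay

    def blendshape_weights(self) -> dict:
        """Blend all active emotions into graded weights (no discrete pose pick)."""
        weights: dict = {}
        for emotion, level in self.intensities.items():
            for channel, scale in BLENDSHAPE_MAP[emotion].items():
                weights[channel] = max(weights.get(channel, 0.0), level * scale)
        return weights


if __name__ == "__main__":
    state = EmotionState()
    state.appraise(AppraisalEvent("joy", 0.6))       # partner's line lands well
    state.appraise(AppraisalEvent("surprise", 0.3))  # but it was unexpected
    for _ in range(3):                               # a few animation frames pass
        state.tick()
    print(state.blendshape_weights())                # graded, mixed expression
```

Because intensities decay continuously and multiple emotions contribute to the same facial channels, the resulting animation drifts through intermediate expressions rather than switching between canonical poses, which is the behavior the abstract attributes to the proposed emotion engine.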

Published

2019-10-08

How to Cite

Schiffer, S. (2019). How Actors Can Animate Game Characters: Integrating Performance Theory in the Emotion Model of a Game Character. Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, 15(1), 227-229. https://doi.org/10.1609/aiide.v15i1.5252