R. James Firby and Marc G. Slack
We describe an implemented two-layer architecture for real-time task execution in physical agents. The system integrates the RAP system with a set of dynamically configurable reactive processes called skills. Thus, the notion of a primitive action is replaced by the selection of a set of skills that interact directly with each other and with the world to form a reactive agent. Under this approach, the interpretation of sensory information becomes highly context dependent. For example, a short reading on a forward sensor might be a threat to be avoided, or it might be the ticket counter being approached. To address this issue, the RAP system has been augmented to allow RAPs at all levels to modify memory, so that sensory information can be interpreted at the appropriate level of abstraction. We have implemented this system and used it to control a robot navigating the halls at MITRE.
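The core idea above can be sketched in a few lines of code. This is a minimal illustration, not the paper's implementation: all names (`Memory`, `activate_skills`, `interpret_forward_sonar`) and the skill sets are hypothetical, and the real RAP system is far richer. The sketch shows a RAP selecting a skill set rather than a primitive action, and the same short forward-sensor reading being interpreted differently depending on the task context a RAP has written into memory.

```python
class Memory:
    """Shared memory that RAPs at any level may modify (hypothetical)."""
    def __init__(self):
        self.context = "wander"  # current task context, set by a RAP

def activate_skills(context):
    """A RAP selects a set of skills instead of issuing a primitive action."""
    skill_sets = {
        "wander":           {"avoid-obstacles", "move-forward"},
        "approach-counter": {"track-counter", "servo-to-target"},
    }
    return skill_sets[context]

def interpret_forward_sonar(memory, reading_m):
    """Interpret a short forward reading according to the task context."""
    if reading_m < 0.5:
        if memory.context == "approach-counter":
            return "target-reached"  # the ticket counter being approached
        return "obstacle"            # otherwise, a threat to be avoided
    return "clear"

mem = Memory()
print(interpret_forward_sonar(mem, 0.3))  # obstacle
mem.context = "approach-counter"          # a RAP modifies memory
print(activate_skills(mem.context))
print(interpret_forward_sonar(mem, 0.3))  # target-reached
```

The key point the sketch captures is that interpretation lives above the raw sensor: the same reading yields different symbols because a higher-level RAP has updated the shared memory.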