Michael Freed and Gregg Collins
Increasingly automated home, workplace, and industrial environments require programs capable of carrying out an ever wider assortment of tasks. As this trend continues, it will become harder for programmers to anticipate all the ways in which these tasks may interact with one another. One solution to this problem is to automate the recognition of novel interactions between tasks. We have developed a general framework for learning to cope with task interactions from observed performance failures. We are testing this approach using the RAP task-execution system and the Truckworld simulator.