AAAI Publications, Twenty-Seventh AAAI Conference on Artificial Intelligence

Mixed Observability Predictive State Representations
Sylvie C. W. Ong, Yuri Grinberg, Joelle Pineau

Last modified: 2013-06-30


Learning accurate models of agent behaviours is crucial for controlling systems in which the agents' and environment's dynamics are unknown. This is a challenging problem, but structural assumptions can be leveraged to tackle it effectively. In particular, many systems exhibit mixed observability, in which observations of some system components are essentially perfect and noiseless, while observations of other components are imperfect, aliased, or noisy. In this paper we present a new model-learning framework, the mixed observability predictive state representation (MO-PSR), which extends previously known predictive state representations to mixed observability systems. We present a learning algorithm that scales to large amounts of data and to large mixed observability domains, and we provide theoretical analysis of its learning consistency and computational complexity. Empirical results demonstrate that, by leveraging the mixed observability properties, our algorithm learns accurate models at a larger scale than is possible with the generic predictive state representation.
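The predictive state representations that MO-PSR builds on are commonly learned spectrally from low-order observation statistics. As a minimal illustration of that generic machinery (not the paper's MO-PSR algorithm), the sketch below recovers observable operators for a toy two-state hidden Markov model from its exact probability matrices via an SVD, in the style of the Hsu, Kakade and Zhang spectral-learning recipe; the model parameters `pi`, `T`, and `O` are hypothetical, and in practice the moment matrices would be estimated from trajectory data rather than computed exactly.

```python
import numpy as np

# Hypothetical toy 2-state HMM with 2 observations (NOT from the paper):
# pi = initial state distribution, T[i, j] = Pr(next = i | state = j),
# O[o, s] = Pr(obs = o | state = s).
pi = np.array([0.7, 0.3])
T = np.array([[0.8, 0.3],
              [0.2, 0.7]])
O = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Observable operators of the true model: A_o = T @ diag(O[o]);
# Pr[x1..xt] = 1^T A_{xt} ... A_{x1} pi.
A = [T @ np.diag(O[o]) for o in range(2)]

def true_prob(seq):
    v = pi.copy()
    for o in seq:
        v = A[o] @ v
    return float(v.sum())

# Low-order moments (computed exactly here; estimated from data in practice):
# P1[x] = Pr[x], P21[x2, x1] = Pr[x1, x2], P3x1[x][x3, x1] = Pr[x1, x, x3].
n = 2
P1 = np.array([true_prob([x]) for x in range(n)])
P21 = np.array([[true_prob([x1, x2]) for x1 in range(n)] for x2 in range(n)])
P3x1 = [np.array([[true_prob([x1, x, x3]) for x1 in range(n)]
                  for x3 in range(n)]) for x in range(n)]

# Spectral learning: project onto the top singular directions of P21,
# then recover the operators and end vectors in the projected space.
U, _, _ = np.linalg.svd(P21)
U = U[:, :2]
pinv = np.linalg.pinv(U.T @ P21)
b1 = U.T @ P1            # initial predictive state
binf = pinv.T @ P1       # normalization vector
B = [U.T @ P3x1[x] @ pinv for x in range(n)]  # learned operators

def psr_prob(seq):
    """Sequence probability under the learned PSR parameters."""
    v = b1.copy()
    for o in seq:
        v = B[o] @ v
    return float(binf @ v)
```

With exact moment matrices the learned parameters reproduce the true sequence probabilities; with empirical estimates the recovery is only approximate, and the MO-PSR contribution of the paper is precisely to exploit the mixed observability structure so that such estimation scales to larger domains.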


Machine Learning; Model Learning; Predictive State Representations; Mixed Observability
