Abstraction in Predictive State Representations

Vishal Soni, Satinder Singh

Most work on Predictive State Representations (PSRs) focuses on learning a complete model of the system that can be used to answer any question about the future. However, we may be interested only in answering certain kinds of abstract questions. For instance, we may only care about the presence of objects in an image rather than pixel-level details. In such cases, we may be able to learn substantially smaller models that answer only such abstract questions. We present the framework of PSR homomorphisms for model abstraction in PSRs. A homomorphism transforms a given PSR into a smaller PSR that provides exact answers to abstract questions in the original PSR. As we shall show, this transformation captures structural and temporal abstractions in the original PSR.
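
To make the abstract's claims concrete, the following is a minimal sketch, in standard PSR notation, of the quantity a PSR predicts and of one way a prediction-preserving condition for a homomorphism might be written; the observation-aggregation map \sigma and the displayed condition are illustrative assumptions, not the paper's formal definitions.

A test is a sequence of action-observation pairs t = a_1 o_1 \cdots a_k o_k, and its prediction from history h is the probability of seeing the test's observations when its actions are executed after h:

    p(t \mid h) = \Pr(o_1, \ldots, o_k \mid h, a_1, \ldots, a_k).

Suppose \sigma maps each primitive observation to an abstract observation (e.g., a full image to "object present" or "object absent"), inducing maps \sigma(t) and \sigma(h) on tests and histories. A smaller PSR with predictions p' would answer abstract questions exactly if, for every abstract test \bar{t} and every history h,

    p'(\bar{t} \mid \sigma(h)) = \sum_{t \,:\, \sigma(t) = \bar{t}} p(t \mid h),

which in particular requires the right-hand side to depend on h only through \sigma(h). An analogous map on action sequences would play the corresponding role for temporal abstraction.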

Subjects: 12. Machine Learning and Discovery; 12.1 Reinforcement Learning

Submitted: Apr 24, 2007

