Craig Boutilier, Ronen I. Brafman, Christopher Geib
Recent research in decision-theoretic planning has focused on making the solution of Markov decision processes (MDPs) more feasible. We develop a set of algorithms for structured reachability analysis of MDPs that are suitable when an initial state (or set of states) is known. Using compact, structured representations of MDPs (e.g., Bayesian networks), our methods---which vary in the tradeoff between complexity and accuracy---produce structured descriptions of (estimated) reachable states that can be used to eliminate variables or variable values from the problem description, reducing the size of the MDP and making it easier to solve. Furthermore, the results of our methods can be used by existing (exact and approximate) abstraction algorithms for MDPs.
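To make the idea concrete, here is a minimal sketch (not the authors' algorithms) of one crude form of structured reachability analysis: starting from a known initial state, each action is modeled, for illustration only, as a mapping from a state variable to the set of values it may produce, and we track a reachable value set per variable independently. Variable values never appearing in these sets can be dropped from the problem. All names and the per-variable decoupling are assumptions; the decoupling over-approximates the truly reachable joint states, illustrating the complexity/accuracy tradeoff mentioned above.

```python
def reachable_values(init_state, actions):
    """Approximate per-variable reachability (hypothetical sketch).

    init_state: dict mapping each state variable to its initial value.
    actions: list of effect dicts, each mapping a variable to the set of
             values it may take after the action; unmentioned variables
             are unchanged. Preconditions are ignored, so the result is
             an over-approximation of the reachable states.
    """
    reach = {var: {val} for var, val in init_state.items()}
    changed = True
    while changed:  # iterate to a fixpoint over all actions
        changed = False
        for effects in actions:
            for var, new_vals in effects.items():
                before = len(reach[var])
                reach[var] |= new_vals  # accumulate newly reachable values
                if len(reach[var]) != before:
                    changed = True
    return reach

# Toy example: a robot whose location and payload are separate variables.
init = {"loc": "office", "holding": "nothing"}
acts = [{"loc": {"office", "lab"}},            # move action
        {"holding": {"nothing", "coffee"}}]    # pickup/putdown action
print(reachable_values(init, acts))
```

Any value of a variable absent from its reachable set (e.g., a third location never produced by any action) can be eliminated, shrinking the MDP before it is solved; the paper's methods do this with structured (e.g., Bayesian-network) action representations rather than the flat effect sets used here.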