P. Faratin, G. Lee, J. Wroclawski, and S. Parsons
We present a dynamic and adaptive decision model for an autonomous user agent whose task is to negotiate and procure wireless access for a mobile user. The user is assumed to incur cognitive and motivational costs when providing subjective preference information to the agent. The task of the personal agent is therefore to dynamically model the user, update its knowledge of a market of wireless service providers, and select service providers that satisfy the user's expected preferences based on minimal, or missing, information derived from a simple user interface. In this paper we show how this user modeling problem can be represented as a Markov Decision Process (MDP). Adaptive reinforcement learning solutions are then evaluated for two subclasses of tractable MDPs via simulations of representative user models.
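As a rough illustration of the kind of formulation the abstract describes, the sketch below casts provider selection as a small tabular MDP solved with Q-learning, a standard adaptive reinforcement learning method. This is not the paper's model: the states, actions, reward table, and all parameters here are hypothetical stand-ins (coarse user contexts as states, candidate providers as actions, noisy user satisfaction as reward).

```python
# Illustrative sketch only: tabular Q-learning on a toy provider-selection
# MDP. All names, the state/reward structure, and parameters are
# hypothetical and not taken from the paper.
import random

N_STATES = 3      # e.g. low / medium / high bandwidth demand (hypothetical)
N_ACTIONS = 2     # two candidate wireless service providers (hypothetical)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

# Hypothetical expected user satisfaction for choosing provider a in state s.
REWARD = [[1.0, 0.0],
          [0.0, 1.0],
          [0.2, 0.9]]

def step(state, action, rng):
    """Return (reward, next_state): noisy satisfaction, drifting context."""
    reward = REWARD[state][action] + rng.gauss(0, 0.1)
    next_state = rng.randrange(N_STATES)  # user context drifts randomly
    return reward, next_state

def q_learning(episodes=5000, seed=0):
    rng = random.Random(seed)
    Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
    state = 0
    for _ in range(episodes):
        # epsilon-greedy action selection
        if rng.random() < EPS:
            action = rng.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda a: Q[state][a])
        reward, nxt = step(state, action, rng)
        # standard Q-learning temporal-difference update
        Q[state][action] += ALPHA * (
            reward + GAMMA * max(Q[nxt]) - Q[state][action])
        state = nxt
    return Q

if __name__ == "__main__":
    Q = q_learning()
    # greedy provider choice per user context
    print([max(range(N_ACTIONS), key=lambda a: Q[s][a])
           for s in range(N_STATES)])
```

In this toy setting the agent learns which provider to prefer in each context purely from sampled rewards, without the user ever stating explicit preferences, which is the spirit of the minimal-elicitation problem the abstract poses.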