Composing POMDP-Based Building Blocks to Analyze Large-Scale Multiagent Systems

Hyuckchul Jung and Milind Tambe

Given a large group of cooperative agents, selecting the right coordination or conflict resolution strategy can have a significant impact on their performance (e.g., speed of convergence). While performance models of such coordination or conflict resolution strategies could aid in selecting the right strategy for a given domain, such models remain largely uninvestigated in the multiagent literature. This paper takes a step towards applying the recently emerging distributed POMDP (partially observable Markov decision process) frameworks, such as the MTDP (Markov team decision process), in service of creating such performance models. To address issues of scale-up, we use small-scale models, called building blocks, that represent the local interaction among a small group of agents. We discuss several ways to combine building blocks for performance prediction of a larger-scale multiagent system. Our approach is presented in the context of DCSP (distributed constraint satisfaction problems), where we are able to predict the performance of five different DCSP strategies in different domain settings by modeling and combining building blocks. Our approach points the way to new tools based on building blocks for performance analysis in multiagent systems.
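To give a concrete flavor of the building-block idea, the sketch below is our own simplified illustration, not the paper's MTDP formulation: each "building block" is modeled as a small absorbing Markov chain over conflict states of a tiny agent group, and the expected number of steps to reach the resolved state serves as a stand-in for the convergence speed of a strategy. All state definitions, transition probabilities, and the independence-based extrapolation comment are hypothetical placeholders.

```python
import numpy as np

# Hypothetical illustration (not the paper's model): a "building block" as a
# small absorbing Markov chain over conflict states of a tiny agent group.
# States: 0 = conflict, 1 = partially resolved, 2 = resolved (absorbing).

def expected_steps_to_resolution(P, start=0):
    """Expected number of steps to reach the absorbing 'resolved' state.

    P is an (n x n) row-stochastic transition matrix whose last state is
    absorbing. Uses the standard fundamental-matrix computation
    N = (I - Q)^-1, where Q is the transient-to-transient sub-matrix.
    """
    Q = P[:-1, :-1]                        # transitions among transient states
    N = np.linalg.inv(np.eye(len(Q)) - Q)  # fundamental matrix
    return N.sum(axis=1)[start]            # expected steps starting from 'start'

# Made-up transition matrices standing in for two conflict resolution strategies.
P_strategy_A = np.array([[0.5, 0.4, 0.1],
                         [0.0, 0.6, 0.4],
                         [0.0, 0.0, 1.0]])
P_strategy_B = np.array([[0.3, 0.5, 0.2],
                         [0.0, 0.4, 0.6],
                         [0.0, 0.0, 1.0]])

for name, P in [("A", P_strategy_A), ("B", P_strategy_B)]:
    steps = expected_steps_to_resolution(P)
    print(f"strategy {name}: expected local convergence in {steps:.2f} steps")

# One crude way to extrapolate to a larger system: assume k roughly independent
# building blocks must all converge and treat the slowest as the bottleneck.
# This independence assumption is ours, for illustration only; the paper
# discusses principled ways of combining building blocks.
```

Comparing the two printed values mimics, in miniature, the kind of strategy ranking the paper derives from its building-block models for DCSP strategies.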
