Using Attention in Belief Revision

Xueming Huang, Gordon I. McCalla, Eric Neufeld

Belief revision for an intelligent system is usually computationally expensive. Here we tackle this problem by using focus in belief revision: revision occurs only in the subset of beliefs currently under attention (in focus). Attention can be shifted within the belief base, allowing other subsets of beliefs to be used and revised in turn. This attention-shifting architecture promises efficient and natural revision of belief bases.
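
The abstract gives no implementation details, but the core idea, restricting revision to an in-focus subset and shifting that focus between subsets, can be sketched in a few lines. The sketch below is a minimal illustration under assumed names (`FocusedBeliefBase`, `shift_attention`, and a naive literal-negation revision step); it is not the authors' actual revision operator.

```python
# Hypothetical sketch of attention-limited belief revision.
# Names and the revision step are illustrative assumptions,
# not the operator from the paper.

class FocusedBeliefBase:
    """Belief base partitioned into named subsets;
    revision touches only the subset in focus."""

    def __init__(self):
        self.partitions = {}  # subset name -> set of belief literals (str)
        self.focus = None     # name of the subset currently under attention

    def add_partition(self, name, beliefs):
        self.partitions[name] = set(beliefs)

    def shift_attention(self, name):
        """Move attention to another subset; beliefs outside focus stay untouched."""
        self.focus = name

    def revise(self, new_belief):
        """Naive revision confined to the focused subset: retract the negation
        of the incoming belief, then add the belief. Out-of-focus subsets are
        never scanned, which is where the computational savings come from."""
        in_focus = self.partitions[self.focus]
        negation = new_belief[1:] if new_belief.startswith("~") else "~" + new_belief
        in_focus.discard(negation)
        in_focus.add(new_belief)


# Usage: revise only the 'weather' subset, then shift attention elsewhere.
kb = FocusedBeliefBase()
kb.add_partition("weather", {"raining"})
kb.add_partition("travel", {"roads_clear"})
kb.shift_attention("weather")
kb.revise("~raining")          # retracts 'raining' without touching 'travel'
kb.shift_attention("travel")
kb.revise("~roads_clear")
print(kb.partitions)
```

The design point the abstract emphasizes is that the cost of each revision scales with the size of the focused subset rather than the whole belief base; consistency across subsets is deferred until attention shifts to them.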

