Monica H Ou, Geoff A W West, Mihai Lazarescu, Chris Clay
Little research has been done on validating new knowledge in case-based reasoning (CBR) systems, specifically on how previous knowledge differs from current knowledge as a result of conceptual change. This paper proposes two methods that enable a domain expert who is not an expert in artificial intelligence (AI) to interactively supervise the knowledge validation process in a CBR system, and that allow the system to be updated dynamically so that it provides the best diagnostic questions. The first method is based on formal concept analysis and involves a graphical representation and comparison of the concepts, together with a summary description that highlights the conceptual differences. We propose a dissimilarity metric for measuring the degree of variation between the previous and current concepts when a new case is added to the knowledge base. The second method identifies unexpected classification-based association rules to form critical questions as the knowledge base is updated.
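For readers unfamiliar with formal concept analysis, the sketch below shows how formal concepts (extent, intent pairs) can be derived from a small binary object-attribute context. The toy context, attribute names, and the naive enumeration strategy are illustrative assumptions, not the paper's implementation or its dissimilarity metric.

```python
from itertools import combinations

# Illustrative binary context: cases (objects) and the symptoms
# (attributes) they exhibit. Not taken from the paper.
context = {
    "case1": {"fever", "cough"},
    "case2": {"fever", "rash"},
    "case3": {"cough", "rash"},
}
attributes = {"fever", "cough", "rash"}

def intent(objects):
    """Attributes shared by every object in the set."""
    if not objects:
        return set(attributes)
    return set.intersection(*(context[o] for o in objects))

def extent(attrs):
    """Objects possessing every attribute in the set."""
    return {o for o, a in context.items() if attrs <= a}

def concepts():
    """Enumerate all formal concepts by closing every subset of objects
    (naive but correct for small contexts)."""
    found = set()
    objs = list(context)
    for r in range(len(objs) + 1):
        for combo in combinations(objs, r):
            i = intent(set(combo))
            e = extent(i)
            found.add((frozenset(e), frozenset(i)))
    return found

if __name__ == "__main__":
    for e, i in sorted(concepts(), key=lambda c: len(c[0])):
        print(sorted(e), "->", sorted(i))
```

Comparing the set of concepts computed before and after a new case is added gives a concrete basis for the kind of conceptual-change measurement the abstract describes.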
Content Area: 2. Analogical and Case-Based Reasoning
Subjects: 3.1 Case-Based Reasoning; 1.7 Expert Systems
Submitted: May 10, 2005