Discarding Irrelevant Parameters in Hidden Markov Model Based Part-of-Speech Taggers

Eric Neufeld

A binary comparative definition of relevance, suggested by empirical results, gives a performance theory of relevance for hidden Markov models (HMMs) that makes it possible to reduce the total number of parameters in the model while improving its overall performance in a specific application domain. Generalizations of this view of relevance are meaningful in many AI subareas. Another view of this result is that there are at least two kinds of relevance: knowledge of high quality is more relevant to a conclusion than knowledge of low quality, and specific knowledge is more relevant than general knowledge. This work argues that one can only be had at the expense of the other.
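
The abstract does not spell out the relevance criterion used to discard parameters. As a rough illustration of the general idea only, the sketch below trains a bigram HMM part-of-speech tagger from tagged sentences and drops transition and emission parameters supported by fewer than an assumed count threshold; both the bigram structure and the threshold are assumptions for illustration, not the paper's actual comparative-relevance definition.

```python
# Illustrative sketch only: a count threshold stands in for "discarding
# irrelevant parameters" in a bigram HMM POS tagger. The threshold and the
# bigram structure are assumptions, not the paper's relevance criterion.
from collections import Counter, defaultdict

def train_hmm(tagged_sentences):
    """Estimate transition and emission counts from (word, tag) sentences."""
    trans = defaultdict(Counter)   # trans[prev_tag][tag]
    emit = defaultdict(Counter)    # emit[tag][word]
    for sent in tagged_sentences:
        prev = "<s>"
        for word, tag in sent:
            trans[prev][tag] += 1
            emit[tag][word] += 1
            prev = tag
    return trans, emit

def prune(table, min_count=2):
    """Drop parameters whose supporting count falls below min_count.

    Rarely observed (tag, tag) or (tag, word) pairs are treated as
    irrelevant and removed; a decoder would then back off to the
    remaining, more general parameters.
    """
    pruned = defaultdict(Counter)
    for context, counts in table.items():
        for event, count in counts.items():
            if count >= min_count:
                pruned[context][event] = count
    return pruned

# Toy usage with two tiny training sentences.
data = [
    [("the", "DT"), ("dog", "NN"), ("barks", "VBZ")],
    [("the", "DT"), ("cat", "NN"), ("sleeps", "VBZ")],
]
trans, emit = train_hmm(data)
trans = prune(trans, min_count=2)  # DT->NN survives (count 2); singletons are dropped
emit = prune(emit, min_count=2)    # singleton word emissions ("dog", "cat", ...) are dropped
print(dict(trans))
```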
