AAAI Publications, Twenty-Fourth AAAI Conference on Artificial Intelligence

A Topic Model for Linked Documents and Update Rules for its Estimation
Zhen Guo, Shenghuo Zhu, Zhongfei Zhang, Yun Chi, Yihong Gong

Last modified: 2010-07-03


The latent topic model plays an important role in unsupervised learning from a corpus, providing a probabilistic interpretation of the corpus in terms of a latent topic space. Most topic models rest on the assumption that documents are independent of each other. However, this assumption rarely holds in reality: relations among documents are often available, such as citation relations among research papers. To address this limitation, in this paper we present the Bernoulli Process Topic (BPT) model, where the interdependence among documents is modeled by a random Bernoulli process. In the BPT model a document is modeled as a distribution over topics that is a mixture of the distributions associated with the related documents. Although BPT aims at better document modeling by incorporating the relations among documents, it also applies to many tasks, including detecting topics from corpora and clustering documents. We apply the BPT model to several document collections, and experimental comparisons against several state-of-the-art approaches demonstrate its promising performance.
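The mixture idea from the abstract can be illustrated with a toy sketch: a Bernoulli-terminated random walk over a citation graph induces, for each document, a mixture over related documents' topic distributions. This is not the authors' BPT estimation procedure; all names and the citation matrix below are illustrative assumptions.

```python
import numpy as np

# Toy illustration (not the BPT estimation algorithm): a walk starts at a
# document and, at each step, stops with Bernoulli probability `lam` or
# follows a citation link with probability 1 - lam. The expected stopping
# distribution mixes the documents' own topic distributions.

rng = np.random.default_rng(0)
n_docs, n_topics = 4, 3
lam = 0.5  # Bernoulli parameter: probability of stopping at the current doc

# Row-stochastic citation matrix: C[i, j] = prob. of moving from doc i to j
C = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 1.0, 0.0],  # documents with no outgoing citations self-loop
    [0.0, 0.0, 0.0, 1.0],
])

# Each document's "own" topic distribution (rows sum to 1)
theta = rng.dirichlet(np.ones(n_topics), size=n_docs)

# Expected stopping distribution of the walk:
# P = lam * sum_k ((1 - lam) C)^k = lam * (I - (1 - lam) C)^(-1)
P = lam * np.linalg.inv(np.eye(n_docs) - (1 - lam) * C)

# Effective topic distribution of each document is the citation-weighted
# mixture of the topic distributions along the walk
theta_eff = P @ theta
```

Because `C` is row-stochastic, the rows of `P` sum to one, so each row of `theta_eff` remains a valid distribution over topics.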


unsupervised learning; topic model; text mining
