AAAI Publications, Twenty-Fourth AAAI Conference on Artificial Intelligence

Gaussian Mixture Model with Local Consistency
Jialu Liu, Deng Cai, Xiaofei He

Last modified: 2010-07-03

Abstract


The Gaussian Mixture Model (GMM) is one of the most popular data clustering methods; it models data as a linear combination of Gaussian components. In GMM, each cluster follows a Gaussian distribution, and clustering amounts to grouping observations into components by estimating each component's parameters, typically via the Expectation-Maximization algorithm. However, many previous studies have shown that naturally occurring data may reside on or close to an underlying submanifold. In this paper, we consider the case where the probability distribution is supported on a submanifold of the ambient space. We take into account the smoothness of the conditional probability distribution along the geodesics of the data manifold: if two observations are close in the intrinsic geometry, their distributions over the Gaussian components should be similar. In short, we introduce a novel manifold-based method for data clustering, called the Locally Consistent Gaussian Mixture Model (LCGMM). Specifically, we construct a nearest neighbor graph and adopt the Kullback-Leibler divergence as the distance measure to regularize the objective function of GMM. Experiments on several data sets demonstrate the effectiveness of this regularization.
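The core idea of the regularizer described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the helper names (`kl`, `local_consistency_penalty`) are hypothetical, `P` is assumed to hold each observation's posterior distribution over Gaussian components (e.g. the E-step responsibilities), and `W` is assumed to be a symmetric nearest-neighbor affinity matrix. The penalty sums the symmetrized KL divergence between the component distributions of neighboring observations, so it is large when nearby points disagree about their cluster membership.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) between two discrete
    distributions, clipped for numerical stability."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def local_consistency_penalty(P, W):
    """Graph regularizer in the spirit of LCGMM (hypothetical helper).

    P : (n, k) array, row i = posterior over the k Gaussian components
        for observation i.
    W : (n, n) symmetric nearest-neighbor affinity matrix; W[i, j] > 0
        means observations i and j are neighbors on the graph.

    Returns the sum over edges of W[i, j] times the symmetrized KL
    divergence between the neighbors' component distributions.
    """
    n = len(P)
    penalty = 0.0
    for i in range(n):
        for j in range(i + 1, n):  # each undirected edge counted once
            if W[i, j] > 0:
                penalty += W[i, j] * (kl(P[i], P[j]) + kl(P[j], P[i]))
    return penalty
```

Subtracting a weighted multiple of this penalty from the GMM log-likelihood would encourage the smoothness property described in the abstract: neighbors on the graph are pushed toward similar distributions over components, and the penalty vanishes when all neighbors agree exactly.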

Full Text: PDF