AAAI Publications, Twenty-Eighth AAAI Conference on Artificial Intelligence

Robust Non-Negative Dictionary Learning
Qihe Pan, Deguang Kong, Chris Ding, Bin Luo

Last modified: 2014-06-21


Dictionary learning plays an important role in machine learning, where data vectors are modeled as sparse linear combinations of basis factors (i.e., a dictionary). However, dictionary learning in noisy environments has not been well studied. Moreover, in practice the dictionary (i.e., the low-rank approximation of the data matrix) and the sparse representations are often required to be non-negative, as in applications such as image annotation, document summarization, and microarray analysis. In this paper, we propose a new formulation for non-negative dictionary learning in noisy environments, where structured sparsity is enforced on the sparse representation. The proposed formulation is also robust to data with noise and outliers, owing to the robust loss function used. We derive an efficient multiplicative updating algorithm to solve the optimization problem, in which the dictionary and the sparse representation are updated iteratively. We rigorously prove the convergence and correctness of the proposed algorithm, and show how the learned dictionary differs at different levels of the sparsity constraint. The proposed algorithm can also be adapted for clustering and semi-supervised learning.
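The abstract does not give the exact loss or the structured-sparsity penalty, but the general scheme it describes — a robust loss with alternating multiplicative updates that preserve non-negativity — can be sketched as follows. This is a minimal illustration assuming an L2,1-norm reconstruction loss ||X - DS||_{2,1}, a common robust choice that down-weights outlier columns; the function name `robust_nmf` and all parameters are hypothetical, not the paper's implementation.

```python
import numpy as np

def robust_nmf(X, k, n_iter=200, eps=1e-10, seed=0):
    """Sketch: robust non-negative dictionary learning via multiplicative
    updates under an assumed L2,1-norm loss (not the paper's exact model)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    D = rng.random((m, k)) + eps   # non-negative dictionary (m x k)
    S = rng.random((k, n)) + eps   # non-negative codes (k x n)
    for _ in range(n_iter):
        # Per-column weights w_i = 1 / ||x_i - D s_i||_2: columns with large
        # residuals are down-weighted, which is the source of robustness.
        w = 1.0 / (np.linalg.norm(X - D @ S, axis=0) + eps)
        # Multiplicative update for D (ratios of non-negative terms keep D >= 0)
        D *= ((X * w) @ S.T) / (D @ ((S * w) @ S.T) + eps)
        # Recompute weights, then update S the same way
        w = 1.0 / (np.linalg.norm(X - D @ S, axis=0) + eps)
        S *= (D.T @ (X * w)) / ((D.T @ D) @ (S * w) + eps)
    return D, S
```

Each update is the standard gradient-ratio construction: for fixed weights the weighted least-squares gradient splits into a positive and a negative part, and multiplying elementwise by their ratio decreases the objective while keeping every entry non-negative.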


dictionary; NMF; robust
