AAAI Publications, Twenty-Eighth AAAI Conference on Artificial Intelligence

Gradient Descent with Proximal Average for Nonconvex and Composite Regularization
Wenliang Zhong, James T. Kwok

Last modified: 2014-06-21


Sparse modeling has been highly successful in many real-world applications. While much interest has focused on convex regularization, recent studies show that nonconvex regularizers can outperform their convex counterparts in many situations. However, the resulting nonconvex optimization problems are often challenging, especially for composite regularizers such as the nonconvex overlapping group lasso. In this paper, using a recent mathematical tool known as the proximal average, we propose a novel proximal gradient descent method for optimization with a wide class of nonconvex and composite regularizers. Instead of directly solving the proximal step associated with a composite regularizer, we average the solutions from the proximal problems of the constituent regularizers. This simple strategy has guaranteed convergence and low per-iteration complexity. Experimental results on a number of synthetic and real-world data sets demonstrate the effectiveness and efficiency of the proposed optimization algorithm, as well as the improved classification performance resulting from the nonconvex regularizers.
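The averaging strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it uses a least-squares loss and two convex l1 penalties as stand-in constituent regularizers (the paper targets nonconvex ones), and all names, data, and parameter values here are invented for the example. Each iteration takes a gradient step on the smooth loss, then averages the solutions of the constituent proximal problems instead of solving the composite proximal step directly.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pa_proximal_gradient(A, b, lams, n_iter=500):
    """Approximately minimize 0.5*||Ax - b||^2 + (1/K) * sum_i lams[i]*||x||_1.

    Proximal-average-style update: gradient step on the smooth loss,
    then average the proximal maps of the K constituent regularizers
    (rather than computing the composite proximal step directly).
    """
    eta = 1.0 / np.linalg.norm(A, ord=2) ** 2   # step size <= 1/L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        y = x - eta * A.T @ (A @ x - b)          # gradient step
        # average of the constituent proximal solutions
        x = np.mean([soft_threshold(y, eta * lam) for lam in lams], axis=0)
    return x

# Synthetic sparse-recovery example (invented data for illustration).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = pa_proximal_gradient(A, b, lams=[0.5, 1.0])
```

Each constituent proximal problem here is a cheap closed-form soft-threshold, so the per-iteration cost stays low, which is the point of averaging rather than solving the composite proximal step.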


Keywords: nonconvex optimization, composite regularization
