AAAI Publications, Twenty-Ninth AAAI Conference on Artificial Intelligence

Random Gradient Descent Tree: A Combinatorial Approach for SVM with Outliers
Hu Ding, Jinhui Xu

Last modified: 2015-02-21

Abstract


Support Vector Machine (SVM) is a fundamental technique in machine learning. A long-standing challenge for SVM is how to deal with outliers (caused by mislabeling), as they can make the classes non-separable. Existing techniques, such as soft-margin SVM, ν-SVM, and Core-SVM, can alleviate the problem to a certain extent, but cannot completely resolve it. Recently, techniques for explicit outlier removal have also become available, but they suffer from high time complexity and cannot guarantee the quality of the solution. In this paper, we present a new combinatorial approach, called Random Gradient Descent Tree (RGD-tree), to deal with outliers explicitly; this yields a new algorithm called RGD-SVM. Our technique provides a provably good solution and can be implemented efficiently for practical purposes. The time and space complexities of our approach depend only linearly on the input size and the dimensionality of the space, which is significantly better than existing methods. Experiments on benchmark datasets suggest that our technique considerably outperforms several popular techniques in most cases.
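To make the contrast in the abstract concrete, the following is a minimal sketch of the generic "explicit outlier removal" idea: fit a soft-margin (hinge-loss) linear SVM by subgradient descent, then drop the k points with the largest hinge loss (suspected mislabeled outliers) and refit. This is an illustration of the general strategy only, not the authors' RGD-tree/RGD-SVM algorithm; the function names, the trimming rule, and all parameter values are assumptions made for the example.

```python
import numpy as np

def hinge_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Soft-margin linear SVM via subgradient descent on
    (1/n) * sum_i max(0, 1 - y_i (w.x_i + b)) + lam * ||w||^2.
    Labels y are in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                        # margin-violating points
        gw = 2 * lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n
        gb = -y[mask].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def trimmed_svm(X, y, k, **kw):
    """Hypothetical explicit-outlier-removal wrapper: fit once, discard the
    k points with the largest hinge loss, then refit on the rest."""
    w, b = hinge_svm(X, y, **kw)
    loss = np.maximum(0.0, 1 - y * (X @ w + b))
    keep = np.argsort(loss)[: len(y) - k]         # keep the best-fitting points
    return hinge_svm(X[keep], y[keep], **kw)

# Toy 1-D data: six clean points plus one grossly mislabeled point at x = -5.
X = np.array([[2.0], [2.5], [3.0], [-2.0], [-2.5], [-3.0], [-5.0]])
y = np.array([1, 1, 1, -1, -1, -1, 1])            # last label is flipped
w, b = trimmed_svm(X, y, k=1)
print(np.sign(X[:6] @ w + b))                     # predictions on clean points
```

Note that this two-pass trim-and-refit heuristic inherits the weaknesses the abstract points out: it offers no quality guarantee, since a sufficiently influential outlier can distort the first fit enough that a clean point is trimmed instead.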

Keywords


SVM; Outliers; Robust algorithms; Random sampling; Gradient Descent; Boosting
