AAAI Publications, The Twenty-Sixth International FLAIRS Conference

Bias and Variance Optimization for SVMs Model Selection
Alejandro Rosales-Pérez, Hugo Jair Escalante, Jesus A. Gonzalez, Carlos A. Reyes-Garcia


Abstract


Support vector machines (SVMs) are among the most widely used methods for pattern recognition, and they have obtained acceptable results in many domains and applications. However, like most learning algorithms, SVMs have hyperparameters that influence the effectiveness of the generated model. Choosing adequate values for these hyperparameters is therefore critical to obtaining satisfactory results for a given classification task, a problem known as model selection. This paper introduces a novel model selection approach for SVMs based on multi-objective optimization and on the notions of bias and variance. We propose an evolutionary algorithm that selects the hyperparameter configuration optimizing a trade-off between estimates of bias and variance, two factors that are closely related to model accuracy and complexity. The proposed technique is evaluated on a suite of benchmark classification data sets. Experimental results show the validity of our approach: the proposed model selection criteria proved very helpful for selecting highly effective classification models.
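
The abstract describes the approach only at a high level. The sketch below is a minimal, hypothetical illustration of the general idea, not the authors' algorithm: bias and variance of an RBF-kernel SVM are estimated from bootstrap-trained models under 0-1 loss (a Domingos-style decomposition), and a toy evolutionary loop searches over log-scaled C and gamma while keeping the Pareto front of (bias, variance). The data set, mutation scheme, population size, and parameter ranges are all assumptions made for illustration.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical benchmark data; the paper's actual benchmark suite is not listed here.
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def bias_variance(log_c, log_gamma, n_boot=10):
    """Estimate 0-1 loss bias and variance of an RBF SVM from bootstrap-trained models."""
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X_tr), len(X_tr))          # bootstrap resample
        clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
        clf.fit(X_tr[idx], y_tr[idx])
        preds.append(clf.predict(X_te))
    preds = np.array(preds)                                   # (n_boot, n_test)
    main = (preds.mean(axis=0) >= 0.5).astype(int)            # majority vote (binary labels)
    bias = np.mean(main != y_te)                              # main prediction is wrong
    variance = np.mean(preds != main)                         # disagreement with main prediction
    return float(bias), float(variance)

def dominates(a, b):
    """Pareto dominance for minimization of (bias, variance)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Toy evolutionary loop over log10(C) in [-2, 3] and log10(gamma) in [-5, 1]:
# evaluate a small population, keep the non-dominated archive, and mutate parents.
population = [(rng.uniform(-2, 3), rng.uniform(-5, 1)) for _ in range(6)]
archive = []                                                  # [(objectives, params), ...]
for gen in range(5):
    for log_c, log_gamma in population:
        obj = bias_variance(log_c, log_gamma)
        if not any(dominates(o, obj) for o, _ in archive):
            archive = [(o, p) for o, p in archive if not dominates(obj, o)]
            archive.append((obj, (log_c, log_gamma)))
    parents = [p for _, p in archive]
    population = [(c + rng.normal(0, 0.5), g + rng.normal(0, 0.5))
                  for c, g in (parents[rng.integers(len(parents))] for _ in range(6))]

for (bias, var), (log_c, log_gamma) in sorted(archive):
    print(f"bias={bias:.3f}  variance={var:.3f}  C=10^{log_c:.2f}  gamma=10^{log_gamma:.2f}")

The sketch keeps the overall structure suggested by the abstract (bias and variance as two objectives, an evolutionary search, a Pareto trade-off) but substitutes a naive mutation-and-archive loop and a simple resampling-based estimator for whatever operators and estimators the authors actually used; consult the full paper for the real algorithm.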
