AAAI Publications, Twenty-Eighth AAAI Conference on Artificial Intelligence

Feature-Cost Sensitive Learning with Submodular Trees of Classifiers
Matt Kusner, Wenlin Chen, Quan Zhou, Zhixiang (Eddie) Xu, Kilian Weinberger, Yixin Chen

Last modified: 2014-06-21

Abstract


During the past decade, machine learning algorithms have become commonplace in large-scale real-world industrial applications. In these settings, the computation time to train and test machine learning algorithms is a key consideration. At training time, the algorithms must scale to very large dataset sizes. At testing time, the cost of feature extraction can dominate the CPU runtime. Recently, a promising method, called Cost-sensitive Tree of Classifiers (CSTC), was proposed to account for the feature extraction cost at testing time. Although the CSTC problem is NP-hard, the authors suggest an approximation through a mixed-norm relaxation across many classifiers. This relaxation is slow to train and requires involved optimization hyperparameter tuning. We propose a different relaxation using approximate submodularity, called Approximately Submodular Tree of Classifiers (ASTC). ASTC is much simpler to implement, yields equivalent results, requires no optimization hyperparameter tuning, and is up to two orders of magnitude faster to train.
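To illustrate the general idea behind budgeted submodular selection (this is a minimal sketch of the standard cost-benefit greedy template, not the paper's ASTC algorithm; the `coverage` objective, feature names, and costs below are all hypothetical):

```python
# Illustrative sketch: greedy maximization of a monotone submodular set
# function under a feature-cost budget. The objective and data here are
# toy assumptions, not the paper's classifier-tree objective.

def coverage(selected, feature_groups):
    """Toy submodular objective: number of distinct items covered."""
    covered = set()
    for f in selected:
        covered |= feature_groups[f]
    return len(covered)

def greedy_budgeted(feature_groups, costs, budget):
    """Repeatedly pick the feature with the best marginal-gain-per-cost
    ratio that still fits within the remaining budget."""
    selected, spent = [], 0.0
    remaining = set(feature_groups)
    while remaining:
        base = coverage(selected, feature_groups)
        best, best_ratio = None, 0.0
        for f in remaining:
            if spent + costs[f] > budget:
                continue  # feature too expensive for what remains
            gain = coverage(selected + [f], feature_groups) - base
            ratio = gain / costs[f]
            if ratio > best_ratio:
                best, best_ratio = f, ratio
        if best is None:  # nothing affordable adds value
            break
        selected.append(best)
        spent += costs[best]
        remaining.remove(best)
    return selected

# Hypothetical features, each "covering" some informative signals:
groups = {"a": {1, 2, 3}, "b": {3, 4}, "c": {5}, "d": {1, 2, 3, 4, 5}}
costs = {"a": 1.0, "b": 1.0, "c": 0.5, "d": 4.0}
print(greedy_budgeted(groups, costs, budget=2.0))  # -> ['a', 'c']
```

Because the objective is monotone and submodular, this greedy rule enjoys well-known approximation guarantees; relaxations like ASTC exploit the fact that near-submodular objectives retain much of this behavior in practice.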

Keywords


submodular optimization; feature-cost sensitive learning; budgeted learning; tree-based learning
