Theoretical Analysis of Label Distribution Learning

Authors

  • Jing Wang, Southeast University
  • Xin Geng, Southeast University

DOI:

https://doi.org/10.1609/aaai.v33i01.33015256

Abstract

As a novel learning paradigm, label distribution learning (LDL) explicitly models label ambiguity via the definition of label description degree. Although much work has addressed real-world applications, theoretical results on LDL remain largely unexplored. In this paper, we revisit LDL from a theoretical perspective, with the aim of analyzing its learnability. First, risk bounds are provided for three representative LDL algorithms (AA-kNN, AA-BP, and SA-ME): for AA-kNN, Lipschitzness of the label distribution function is assumed to bound the risk, while for AA-BP and SA-ME, Rademacher complexity is used to give data-dependent risk bounds. Second, a generalized plug-in decision theorem is proposed to relate LDL to classification, showing that approximating the conditional probability distribution function under absolute loss guarantees convergence to the optimal classifier; data-dependent error probability bounds are also presented for the corresponding LDL algorithms when used for classification. To the best of our knowledge, this is the first theoretical study of LDL.
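Of the three algorithms analyzed, AA-kNN is the simplest to illustrate: it predicts a label distribution by averaging the label distributions of the query's k nearest training examples. The sketch below is illustrative only; the function and variable names are assumptions, not taken from the paper.

```python
import numpy as np

def aa_knn_predict(X_train, D_train, x_query, k=5):
    """AA-kNN sketch: average the label distributions of the k nearest
    neighbors of x_query. X_train is (n, d); D_train is (n, c), where
    each row is a label distribution summing to 1. Names are illustrative."""
    # Euclidean distances from the query point to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbors
    nn = np.argsort(dists)[:k]
    # The mean of valid label distributions is itself a valid distribution
    return D_train[nn].mean(axis=0)

# Toy usage: 4 training points with 2-label distributions
X_train = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
D_train = np.array([[0.8, 0.2], [0.6, 0.4], [0.7, 0.3], [0.1, 0.9]])
pred = aa_knn_predict(X_train, D_train, np.array([0.1, 0.1]), k=3)
# pred is the mean of the first three rows: [0.7, 0.3]
```

The plug-in decision theorem in the abstract then says that a classifier obtained by taking the arg-max of such a predicted distribution (e.g. `pred.argmax()`) approaches the optimal classifier as the predicted distribution approximates the true conditional distribution in absolute loss.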

Published

2019-07-17

How to Cite

Wang, J., & Geng, X. (2019). Theoretical Analysis of Label Distribution Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 5256-5263. https://doi.org/10.1609/aaai.v33i01.33015256

Section

AAAI Technical Track: Machine Learning