Multi-Label Dimensionality Reduction via Dependency Maximization

Yin Zhang, Zhi-Hua Zhou

Multi-label learning deals with data associated with multiple labels simultaneously. Like other machine learning and data mining tasks, multi-label learning also suffers from the curse of dimensionality. Although dimensionality reduction has been studied for many years, multi-label dimensionality reduction remains almost untouched. In this paper, we propose a multi-label dimensionality reduction method, MDDM, which attempts to project the original data into a lower-dimensional feature space that maximizes the dependence between the original feature description and the associated class labels. Based on the Hilbert-Schmidt Independence Criterion (HSIC), we derive a closed-form solution that makes the dimensionality reduction process efficient. Experiments validate the performance of MDDM.

Subjects: 12. Machine Learning and Discovery; 12.2 Scientific Discovery

Submitted: Apr 12, 2008
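
To give a concrete picture of the closed-form solution described in the abstract, the following is a minimal sketch, not the authors' reference implementation, of an HSIC-maximizing linear projection. It assumes linear kernels on both the projected features and the label vectors and an orthonormality constraint on the projection matrix; the function and variable names (mddm_projection, X, Y, d) are illustrative only, and the paper's full formulation may include kernelized or otherwise constrained variants.

import numpy as np

def mddm_projection(X, Y, d):
    """Sketch of an HSIC-maximizing linear projection (MDDM-style).

    X : (n, D) feature matrix.
    Y : (n, q) binary label matrix.
    d : target dimensionality.
    Returns a (D, d) projection matrix with orthonormal columns.
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Y @ Y.T                           # linear kernel on the label vectors
    M = X.T @ H @ L @ H @ X               # symmetric PSD matrix
    # With a linear feature kernel, the empirical HSIC of the projected data
    # is proportional to tr(P^T M P); under P^T P = I it is maximized by the
    # top-d eigenvectors of M.
    eigvals, eigvecs = np.linalg.eigh(M)
    P = eigvecs[:, np.argsort(eigvals)[::-1][:d]]
    return P

# Usage (hypothetical): Z = X @ mddm_projection(X, Y, d=10) maps the data
# to a 10-dimensional space before running a multi-label learner on Z.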

