Jae Woo Kim, Hesham Fouad, James K. Hahn
In this paper, we present a method for generating perceptually appropriate dance motion for an input music soundtrack. Our approach extracts musical features from the input music and searches a dance motion database for a sequence of perceptually correlated motion segments. We propose a set of mapping criteria, together with the motion features needed to perform the mapping from music to dance motion.
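The core idea described above, matching music segments to motion segments by feature similarity, can be sketched as a nearest-neighbor search in a shared feature space. The feature names and values below are illustrative assumptions, not the paper's actual feature set or mapping criteria:

```python
import numpy as np

# Hypothetical per-segment music features (e.g., tempo, onset density,
# spectral energy) -- illustrative only, not the paper's feature set.
music_features = np.array([
    [120.0, 0.80, 0.60],
    [ 90.0, 0.30, 0.20],
    [140.0, 0.90, 0.90],
])

# Hypothetical motion database: each candidate dance segment carries a
# comparable feature vector (e.g., beat rate, movement intensity, extent).
motion_features = np.array([
    [ 92.0, 0.25, 0.30],   # slow, calm segment
    [118.0, 0.75, 0.55],   # moderate segment
    [142.0, 0.95, 0.85],   # fast, energetic segment
])

def match_segments(music, motion):
    """For each music segment, pick the motion segment whose feature
    vector is nearest in range-normalized Euclidean distance."""
    # Normalize each feature dimension by its range in the motion database
    # so no single feature (e.g., tempo) dominates the distance.
    scale = motion.max(axis=0) - motion.min(axis=0) + 1e-9
    m = music / scale
    d = motion / scale
    # Pairwise distances: shape (num_music_segments, num_motion_segments)
    dists = np.linalg.norm(m[:, None, :] - d[None, :, :], axis=2)
    return dists.argmin(axis=1)

sequence = match_segments(music_features, motion_features)
print(sequence.tolist())  # indices of the chosen motion segments
```

A real system would add temporal constraints (e.g., transition smoothness between consecutive motion segments) on top of this per-segment matching.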
Subjects: 6.4 Virtual Reality; 6.2 Multimedia