Integrating features from different sources for music information retrieval

Tao Li, Mitsunori Ogihara, Shenghuo Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations

Abstract

Efficient and intelligent music information retrieval is a very important topic of the 21st century. With the ultimate goal of building personal music information retrieval systems, this paper studies the problem of identifying "similar" artists using both lyrics and acoustic data. In this paper, we present a clustering algorithm that integrates features from both sources to perform bimodal learning. The algorithm is tested on a data set consisting of 570 songs from 53 albums of 41 artists using artist similarity provided by All Music Guide. Experimental results show that the accuracy of artist similarity classifiers can be significantly improved and that artist similarity can be efficiently identified.

Original language: English (US)
Title of host publication: Proceedings - Sixth International Conference on Data Mining, ICDM 2006
Pages: 372-381
Number of pages: 10
DOI: 10.1109/ICDM.2006.89
State: Published - Dec 1 2006
Externally published: Yes
Event: 6th International Conference on Data Mining, ICDM 2006 - Hong Kong, China
Duration: Dec 18 2006 - Dec 22 2006

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
ISSN (Print): 1550-4786

Other

Other: 6th International Conference on Data Mining, ICDM 2006
Country: China
City: Hong Kong
Period: 12/18/06 - 12/22/06


ASJC Scopus subject areas

  • Engineering (all)

Cite this

Li, T., Ogihara, M., & Zhu, S. (2006). Integrating features from different sources for music information retrieval. In Proceedings - Sixth International Conference on Data Mining, ICDM 2006 (pp. 372-381). [4053064] (Proceedings - IEEE International Conference on Data Mining, ICDM). https://doi.org/10.1109/ICDM.2006.89