Content-based music similarity search and emotion detection

Research output: Contribution to journal › Conference article

59 Scopus citations

Abstract

This paper investigates the use of acoustic-based features for music information retrieval. Two specific problems are studied: similarity search (searching for music sound files similar to a given music sound file) and emotion detection (detecting emotion in music sounds). The Daubechies Wavelet Coefficient Histograms (proposed by Li, Ogihara, and Li), which consist of moments of the coefficients obtained by applying the Db8 wavelet filter, are combined with the timbral features extracted using the MARSYAS system of Tzanetakis and Cook to generate compact music features. For similarity search, the distance between two sound files is defined as the Euclidean distance between their normalized representations. Based on this distance measure, the sound files closest to an input sound file are retrieved. Experiments on Jazz vocal and Classical sound files achieve a very high level of accuracy. Emotion detection is cast as a multiclass classification problem, decomposed into multiple binary classification problems, and resolved with Support Vector Machines trained on the extracted features. Our experiments on emotion detection achieved reasonably accurate performance and provided some insights for future work.
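The following Python sketch is a rough illustration of the two tasks described in the abstract, not the authors' implementation: it normalizes hypothetical feature vectors (standing in for the DWCH plus MARSYAS timbral features), ranks library tracks by Euclidean distance to a query track, and trains a one-vs-rest SVM for emotion classes. The data, feature dimensionality, and the one-vs-rest decomposition are assumptions made for illustration only.

```python
# Minimal sketch (hypothetical data) of distance-based similarity search
# and SVM-based emotion detection on precomputed audio feature vectors.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier

def most_similar(query_vec, library_vecs, k=5):
    """Return indices of the k library tracks closest to the query track
    under Euclidean distance."""
    dists = np.linalg.norm(library_vecs - query_vec, axis=1)
    return np.argsort(dists)[:k]

# Placeholder feature matrix: one row per sound file, e.g. DWCH moments
# concatenated with timbral features (dimensions chosen arbitrarily here).
features = np.random.rand(100, 45)
scaler = StandardScaler().fit(features)   # normalize each feature dimension
normalized = scaler.transform(features)

# Similarity search: retrieve the tracks closest to the first track.
query = normalized[0]
print(most_similar(query, normalized, k=5))

# Emotion detection: multiclass problem decomposed into binary SVMs
# (one-vs-rest is an assumption; the paper only states a binary decomposition).
labels = np.random.randint(0, 4, size=100)   # placeholder emotion labels
clf = OneVsRestClassifier(SVC(kernel="rbf"))
clf.fit(normalized, labels)
print(clf.predict(normalized[:3]))
```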

Original language: English (US)
Pages (from-to): V-705-V-708
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 5
State: Published - Sep 27 2004
Externally published: Yes
Event: Proceedings - IEEE International Conference on Acoustics, Speech, and Signal Processing - Montreal, Que., Canada
Duration: May 17 2004 - May 21 2004

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
