IMMA-Emo: A multimodal interface for visualising score- and audio-synchronised emotion annotations

Dorien Herremans, Simin Yang, Ching Hua Chuan, Mathieu Barthet, Elaine Chew

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

Emotional response to music is often represented on a two-dimensional arousal-valence space without reference to score information that may provide critical cues to explain the observed data. To bridge this gap, we present IMMA-Emo, an integrated software system for visualising emotion data aligned with music audio and score, so as to provide an intuitive way to interactively visualise and analyse music emotion data. The visual interface also allows for the comparison of multiple emotion time series. The IMMA-Emo system builds on the online interactive Multi-modal Music Analysis (IMMA) system. Two examples demonstrating the capabilities of the IMMA-Emo system are drawn from an experiment set up to collect arousal-valence ratings based on participants' perceived emotions during a live performance. Direct observation of corresponding score parts and aural input from the recording allow explanatory factors to be identified for the ratings and changes in the ratings.

Original language: English (US)
Title of host publication: Proceedings of the 12th International Audio Mostly Conference
Subtitle of host publication: Augmented and Participatory Sound and Music Experiences, AM 2017
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450353731
State: Published - Aug 23 2017
Externally published: Yes
Event: 12th International Audio Mostly Conference, AM 2017 - London, United Kingdom
Duration: Aug 23 2017 - Aug 26 2017

Publication series

Name: ACM International Conference Proceeding Series
Volume: Part F131930

Other

Other: 12th International Audio Mostly Conference, AM 2017
Country: United Kingdom
City: London
Period: 8/23/17 - 8/26/17

Keywords

  • Arousal/valence
  • Computational musicology
  • Emotion
  • Emotion visualisation
  • Multimodal user interface
  • Music

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications
