IMMA-Emo

A multimodal interface for visualising score- and audio-synchronised emotion annotations

Dorien Herremans, Simin Yang, Ching-Hua Chuan, Mathieu Barthet, Elaine Chew

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Emotional response to music is often represented on a two-dimensional arousal-valence space without reference to score information that may provide critical cues to explain the observed data. To bridge this gap, we present IMMA-Emo, an integrated software system for visualising emotion data aligned with music audio and score, so as to provide an intuitive way to interactively visualise and analyse music emotion data. The visual interface also allows for the comparison of multiple emotion time series. The IMMA-Emo system builds on the online interactive Multi-modal Music Analysis (IMMA) system. Two examples demonstrating the capabilities of the IMMA-Emo system are drawn from an experiment set up to collect arousal-valence ratings based on participants' perceived emotions during a live performance. Direct observation of corresponding score parts and aural input from the recording allow explanatory factors to be identified for the ratings and changes in the ratings.
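As an illustration of the kind of data the system handles, the sketch below (not the authors' implementation; all data and names are hypothetical) plots two annotators' arousal and valence ratings as time series on a shared timeline, so they can be compared directly, using Python and matplotlib.

import matplotlib.pyplot as plt
import numpy as np

# Hypothetical emotion annotations: time in seconds, with arousal and
# valence values in [-1, 1], as points in a two-dimensional
# arousal-valence space sampled over the course of a performance.
t = np.linspace(0, 60, 120)
annotator_a = {"arousal": 0.5 * np.sin(t / 8), "valence": 0.3 * np.cos(t / 10)}
annotator_b = {"arousal": 0.4 * np.sin(t / 8 + 0.5), "valence": 0.2 * np.cos(t / 10 + 0.3)}

# One panel per emotion dimension, sharing the time axis, so that
# multiple emotion time series can be compared side by side.
fig, axes = plt.subplots(2, 1, sharex=True)
for ax, dim in zip(axes, ("arousal", "valence")):
    ax.plot(t, annotator_a[dim], label="annotator A")
    ax.plot(t, annotator_b[dim], label="annotator B")
    ax.set_ylabel(dim)
    ax.set_ylim(-1.0, 1.0)
axes[1].set_xlabel("time (s)")
axes[0].legend()
plt.show()

In the actual IMMA-Emo system, such curves are additionally aligned with the score and audio playback, which is what allows score features to be inspected as explanatory factors for the ratings.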

Original language: English (US)
Title of host publication: Proceedings of the 12th International Audio Mostly Conference
Subtitle of host publication: Augmented and Participatory Sound and Music Experiences, AM 2017
Publisher: Association for Computing Machinery
Volume: Part F131930
ISBN (Electronic): 9781450353731
DOI: 10.1145/3123514.3123545
State: Published - Aug 23 2017
Externally published: Yes
Event: 12th International Audio Mostly Conference, AM 2017 - London, United Kingdom
Duration: Aug 23 2017 - Aug 26 2017

Other

Other: 12th International Audio Mostly Conference, AM 2017
Country: United Kingdom
City: London
Period: 8/23/17 - 8/26/17

Keywords

  • Arousal/valence
  • Computational musicology
  • Emotion
  • Emotion visualisation
  • Multimodal user interface
  • Music

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Herremans, D., Yang, S., Chuan, C-H., Barthet, M., & Chew, E. (2017). IMMA-Emo: A multimodal interface for visualising score- and audio-synchronised emotion annotations. In Proceedings of the 12th International Audio Mostly Conference: Augmented and Participatory Sound and Music Experiences, AM 2017 (Vol. Part F131930, Article a11). Association for Computing Machinery. https://doi.org/10.1145/3123514.3123545
