Multi-scale auralization for multimedia analytical feature interaction

Nguyen Le Thanh Nguyen, Hyunhwan Lee, Joseph Johnson, Mitsunori Ogihara, Gang Ren, James W. Beauchamp

Research output: Contribution to conference › Paper

Abstract

Modern human-computer interaction systems use multiple perceptual dimensions to enhance the user's intuition and efficiency by improving situational awareness. A signal processing and interaction framework is proposed for auralizing signal patterns and augmenting the visualization-focused tasks of social media content analysis and annotation, with the goal of assisting the user in analyzing, retrieving, and organizing relevant information for marketing research. Within this auralization framework, audio signals are generated from video/audio signal patterns, for example, using audio frequency modulation that follows the magnitude contour of video color saturation. The integration of visual and aural presentations benefits user interaction by reducing fatigue and sharpening the users' sensitivity, thereby improving work efficiency, confidence, and satisfaction.
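
The abstract names one concrete mapping: frequency modulation of an audio carrier driven by the magnitude contour of video color saturation. The paper's actual implementation is not described on this page, so the following is only a minimal sketch of that mapping; the carrier frequency, frequency deviation, and the synthetic stand-in saturation contour are all assumptions chosen for illustration.

```python
# Sketch: auralize a per-frame saturation contour via frequency modulation.
# Assumption: a real pipeline would extract mean saturation per video frame
# (e.g., from the S channel of an HSV conversion); a synthetic contour is
# used here so the script runs standalone.
import numpy as np
import wave

SR = 44100            # audio sample rate (Hz)
FPS = 30              # video frame rate the contour was sampled at
CARRIER_HZ = 440.0    # base pitch of the auralized signal (assumed)
DEVIATION_HZ = 220.0  # how far saturation can push the pitch (assumed)

# Hypothetical per-frame mean-saturation contour in [0, 1]: ten seconds
# of slowly oscillating saturation at 30 fps.
n_frames = 300
t_frames = np.arange(n_frames) / FPS
saturation = 0.5 + 0.5 * np.sin(2 * np.pi * 0.25 * t_frames)

# Upsample the frame-rate contour to audio rate by linear interpolation.
n_samples = int(n_frames * SR / FPS)
t_audio = np.arange(n_samples) / SR
contour = np.interp(t_audio, t_frames, saturation)

# Frequency modulation: the instantaneous frequency tracks the contour,
# and the phase is the running integral of that frequency.
inst_freq = CARRIER_HZ + DEVIATION_HZ * contour
phase = 2 * np.pi * np.cumsum(inst_freq) / SR
signal = 0.5 * np.sin(phase)

# Write a 16-bit mono WAV for playback alongside the visualization.
pcm = (signal * 32767).astype(np.int16)
with wave.open("auralization.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes(pcm.tobytes())
```

In this sketch, higher saturation raises the pitch of the tone, so the listener can track the saturation contour by ear while attending visually to other features; other mappings (amplitude, timbre, spatial position) would follow the same contour-to-parameter pattern.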

Original language: English (US)
State: Published - Jan 1 2019
Event: 147th Audio Engineering Society International Convention 2019 - New York, United States
Duration: Oct 16 2019 – Oct 19 2019

Conference

Conference: 147th Audio Engineering Society International Convention 2019
Country: United States
City: New York
Period: 10/16/19 – 10/19/19

ASJC Scopus subject areas

  • Modeling and Simulation
  • Acoustics and Ultrasonics


Cite this

Le Thanh Nguyen, N., Lee, H., Johnson, J., Ogihara, M., Ren, G., & Beauchamp, J. W. (2019). Multi-scale auralization for multimedia analytical feature interaction. Paper presented at 147th Audio Engineering Society International Convention 2019, New York, United States.