Information fusion for perceptual feedback: A brain activity sonification approach

Tomasz M. Rutkowski, Andrzej Cichocki, Danilo Mandic

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

12 Citations (Scopus)

Abstract

When analysing multichannel processes, it is often convenient to use some form of visualisation to help understand and interpret spatio-temporal dependencies between the channels, and to perform input variable selection. This is particularly advantageous when noise levels are high, when the active channel changes its spatial location over time, and for spatio-temporal processes in which several channels carry meaningful information, as in electroencephalogram (EEG)-based brain activity monitoring. To provide insight into the dynamics of brain electrical responses, spatial sonification of multichannel EEG is performed, whereby the information from active channels is fused into music-like audio. Owing to its data fusion via fission mode of operation, empirical mode decomposition (EMD) is employed as a time-frequency analyser, and the brain responses to visual stimuli are sonified to provide audio feedback. Such perceptual feedback has considerable potential in multimodal brain-computer and brain-machine interfaces (BCI/BMI).
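To make the sonification idea concrete, the following is a minimal sketch (not the chapter's actual method) of mapping multichannel activity to audio: each channel's instantaneous amplitude, estimated here with a Hilbert-transform envelope rather than EMD, modulates a sine tone at a channel-specific pitch, and the tones are mixed into one audio signal. The function name `sonify_channels` and all parameter choices (base frequency, audio rate) are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert

def sonify_channels(eeg, fs, audio_fs=8000, base_freq=220.0, dur=1.0):
    """Toy sonification: channel k drives a sine at base_freq * (k + 1),
    amplitude-modulated by that channel's Hilbert envelope.

    eeg : array of shape (n_channels, n_samples); fs : EEG sampling rate.
    Returns a mono audio signal normalised to peak amplitude 1.
    """
    n_ch, n_samp = eeg.shape
    t_audio = np.arange(int(audio_fs * dur)) / audio_fs
    audio = np.zeros_like(t_audio)
    for k in range(n_ch):
        # instantaneous amplitude of channel k via the analytic signal
        env = np.abs(hilbert(eeg[k]))
        # resample the envelope onto the audio time grid
        env = np.interp(np.linspace(0, n_samp - 1, t_audio.size),
                        np.arange(n_samp), env)
        audio += env * np.sin(2 * np.pi * base_freq * (k + 1) * t_audio)
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# Toy 3-channel "EEG": 10 Hz (alpha-band) sinusoids of differing amplitude
fs = 256
t = np.arange(fs) / fs
eeg = np.vstack([a * np.sin(2 * np.pi * 10 * t) for a in (1.0, 0.5, 0.2)])
audio = sonify_channels(eeg, fs)
```

In the chapter's setting, EMD would replace the fixed Hilbert envelope: each channel is first decomposed into intrinsic mode functions ("fission"), and the informative modes are then mixed into the audio stream ("fusion").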

Original language: English
Title of host publication: Signal Processing Techniques for Knowledge Extraction and Information Fusion
Publisher: Springer US
Pages: 261-273
Number of pages: 13
ISBN (Print): 9780387743660
DOIs
Publication status: Published - 2008
Externally published: Yes
