EEG Correlates of Voice and Face Emotional Judgments in the Human Brain

K. Hiyoshi-Taniguchi, M. Kawasaki, T. Yokota, H. Bakardjian, H. Fukuyama, A. Cichocki, F. B. Vialatte

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

The purpose of this study was to clarify the neural correlates of human emotional judgment. This study aimed to induce a controlled perturbation in the emotional system of the brain with multimodal stimuli, and to investigate whether such emotional stimuli could induce reproducible and consistent changes in electroencephalography (EEG) signals. We exposed 12 subjects to auditory, visual, or combined audio–visual stimuli. Audio stimuli consisted of voice recordings of the Japanese word “arigato” (thank you) pronounced with three different intonations (angry—A, happy—H, or neutral—N). Visual stimuli consisted of faces of women expressing the same emotional valences (A, H, or N). Audio–visual stimuli were composed using either congruent combinations of faces and voices (e.g., H × H) or noncongruent combinations (e.g., A × H). The data were collected using an EEG system, and the analysis was performed by computing the topographic distributions of EEG signals in the theta, alpha, and beta frequency ranges. We compared stimulus conditions (A, H, or N) as well as congruent versus noncongruent audio–visual combinations. Topographic maps of EEG power differed between those conditions. The obtained results demonstrate that EEG could be used as a tool to investigate emotional valence and to discriminate between various emotions.
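The core analysis step described in the abstract, estimating EEG power within the theta, alpha, and beta frequency ranges, can be sketched as follows. This is a minimal illustration using a plain periodogram on a synthetic single-channel signal; the function names, band edges, and test signal are assumptions for illustration, not the paper's actual pipeline:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power of `signal` within [f_lo, f_hi) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].mean()

# Synthetic 1-second "EEG" trace: a 10 Hz (alpha-range) oscillation plus noise.
# fs and the band edges below are illustrative conventions, not from the paper.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
powers = {name: band_power(x, fs, lo, hi) for name, (lo, hi) in bands.items()}
# For this 10 Hz signal, alpha-band power dominates.
```

Repeating such a band-power estimate per electrode, and mapping the values onto scalp locations, yields the kind of topographic power distributions the authors compared across conditions.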

Original language: English
Pages (from-to): 11-19
Number of pages: 9
Journal: Cognitive Computation
Volume: 7
Issue number: 1
DOIs
Publication status: Published - 2013
Externally published: Yes

Keywords

  • EEG
  • Emotion
  • Multimodal
