GOM-face: GKP, EOG, and EMG-based multimodal interface with application to humanoid robot control

Yunjun Nam, Bonkon Koo, Andrzej Cichocki, Seungjin Choi

Research output: Contribution to journal › Article › peer-review

80 Citations (Scopus)

Abstract

We present a novel human-machine interface, called GOM-Face, and its application to humanoid robot control. GOM-Face interfaces through three electric potentials measured on the face: 1) glossokinetic potential (GKP), elicited by tongue movement; 2) electrooculogram (EOG), elicited by eye movement; and 3) electromyogram (EMG), elicited by teeth clenching. Each potential has been used individually for assistive interfacing to provide persons with limb motor disabilities or even complete quadriplegia with an alternative communication channel. However, to the best of our knowledge, GOM-Face is the first interface that exploits all of these potentials together. We resolved the interference between GKP and EOG by extracting discriminative features from two covariance matrices, computed from tongue-movement-only data and eye-movement-only data, respectively. With this feature extraction method, GOM-Face can detect four kinds of horizontal tongue or eye movements with an accuracy of 86.7% within 2.77 s. We demonstrated the applicability of GOM-Face to humanoid robot control: users were able to communicate with the robot by selecting items from a predefined menu using eye and tongue movements.
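The abstract describes resolving GKP/EOG interference by deriving discriminative features from two class-specific covariance matrices. The exact algorithm is not given here, so the following is a minimal sketch assuming a CSP-style generalized eigendecomposition of the tongue-only and eye-only covariance matrices; all function names, variable names, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: discriminative spatial filtering from two class
# covariance matrices (tongue-movement-only vs. eye-movement-only data),
# in the spirit of the covariance-based feature extraction described in
# the abstract. The CSP-style generalized eigendecomposition is an
# assumption for illustration, not the paper's exact method.
import numpy as np
from scipy.linalg import eigh

def discriminative_filters(cov_tongue, cov_eye, n_filters=2):
    """Spatial filters maximizing tongue-related variance relative to
    eye-related variance (and vice versa)."""
    # Generalized eigenproblem: cov_tongue w = lambda (cov_tongue + cov_eye) w
    eigvals, eigvecs = eigh(cov_tongue, cov_tongue + cov_eye)
    order = np.argsort(eigvals)
    # Take eigenvectors at both extremes of the spectrum:
    # smallest eigenvalues favor eye activity, largest favor tongue activity.
    return np.hstack([eigvecs[:, order[:n_filters]],
                      eigvecs[:, order[-n_filters:]]])

def extract_features(trial, filters):
    """Log-variance features of a (channels x samples) trial after filtering."""
    projected = filters.T @ trial
    var = projected.var(axis=1)
    return np.log(var / var.sum())

# Example with synthetic data (8 facial electrodes, 500 samples per trial).
rng = np.random.default_rng(0)
tongue_trials = rng.standard_normal((20, 8, 500))
eye_trials = rng.standard_normal((20, 8, 500))
cov_t = np.mean([t @ t.T / t.shape[1] for t in tongue_trials], axis=0)
cov_e = np.mean([t @ t.T / t.shape[1] for t in eye_trials], axis=0)
W = discriminative_filters(cov_t, cov_e)
print(extract_features(tongue_trials[0], W).shape)  # (4,) features per trial
```

In such a scheme the resulting log-variance features would feed a simple classifier that distinguishes the four horizontal tongue/eye movements reported in the abstract; the classifier itself is outside the scope of this sketch.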

Original language: English
Article number: 6589166
Pages (from-to): 453-462
Number of pages: 10
Journal: IEEE Transactions on Biomedical Engineering
Volume: 61
Issue number: 2
DOIs
Publication status: Published - Feb 2014
Externally published: Yes

Keywords

  • Electromyogram (EMG)
  • Electrooculogram (EOG)
  • Glossokinetic potentials (GKP)
  • Human-machine interface
  • Multimodal interface
