A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: Application to robot control

Jiaxin Ma, Yu Zhang, Andrzej Cichocki, Fumitoshi Matsuno

Research output: Contribution to journal › Article › peer-review

145 Citations (Scopus)


This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). The hybrid interface works in two modes: an EOG mode that recognizes eye movements such as blinks, and an EEG mode that detects event-related potentials (ERPs) such as the P300. While eye movements and ERPs have each been used separately to implement assistive interfaces that help patients with motor disabilities perform daily tasks, the proposed hybrid interface integrates the two so that they complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that recognizes four kinds of eye movements: blink, wink, gaze, and frown. In addition, an oddball paradigm with inverted-face stimuli is used to evoke multiple ERP components, including the P300, N170, and VPP. To verify the effectiveness of the proposed system, two online experiments are carried out: one controlling a multifunctional humanoid robot, and the other controlling four mobile robots. In both experiments, the subjects complete the tasks effectively using the proposed interface, and the best completion time is relatively short and very close to that of manual operation.
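The abstract mentions a threshold algorithm for recognizing eye movements from the EOG signal. As a rough illustration of the general idea (not the authors' actual algorithm), the sketch below detects blink-like events in a vertical-EOG trace: a blink produces a short, sharp voltage peak, so a sample run above a threshold counts as a blink only if it falls back below the threshold quickly, rejecting longer deflections such as sustained gaze shifts. The threshold value, window length, and the synthetic trace are all hypothetical.

```python
# Illustrative threshold-based blink detection on a synthetic vertical-EOG
# trace. Threshold and width values are hypothetical, not from the paper.

def detect_blinks(signal, threshold=150.0, max_width=40):
    """Return sample indices where a blink-like peak starts.

    A blink is counted when the signal rises above `threshold` and
    falls back below it within `max_width` samples (a short, sharp peak).
    """
    blinks = []
    i = 0
    n = len(signal)
    while i < n:
        if signal[i] > threshold:
            start = i
            # Advance to the end of the supra-threshold run.
            while i < n and signal[i] > threshold:
                i += 1
            if i - start <= max_width:  # short peak -> blink
                blinks.append(start)
        else:
            i += 1
    return blinks

# Synthetic trace: flat baseline with two sharp peaks (blinks) and one
# long plateau (e.g. a sustained gaze shift, which should be rejected).
trace = [0.0] * 300
for k in range(100, 110):
    trace[k] = 300.0  # blink 1 (width 10)
for k in range(150, 220):
    trace[k] = 300.0  # long deflection (width 70, too wide for a blink)
for k in range(250, 258):
    trace[k] = 300.0  # blink 2 (width 8)

print(detect_blinks(trace))  # -> [100, 250]
```

Distinguishing the other eye movements (wink, gaze, frown) in the paper presumably relies on further criteria, such as signal polarity across channels and deflection duration; the duration test above is the simplest form of such a criterion.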

Original language: English
Article number: 6954402
Pages (from-to): 876-889
Number of pages: 14
Journal: IEEE Transactions on Biomedical Engineering
Issue number: 3
Publication status: Published - 1 Mar 2015
Externally published: Yes


Keywords

  • Electroencephalogram (EEG)
  • Electrooculogram (EOG)
  • event-related potential (ERP)
  • human-machine interface (HMI)
  • robot control


