A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: Application to robot control

Jiaxin Ma, Yu Zhang, Andrzej Cichocki, Fumitoshi Matsuno

Research output: Contribution to journal › Article › peer-review

149 Citations (Scopus)


This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). The hybrid interface works in two modes: an EOG mode recognizes eye movements such as blinks, and an EEG mode detects event-related potentials (ERPs) such as the P300. While eye movements and ERPs have each been used separately to implement assistive interfaces that help patients with motor disabilities perform daily tasks, the proposed hybrid interface integrates the two so that they complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that recognizes four kinds of eye movements: blink, wink, gaze, and frown. In addition, an oddball paradigm with inverted-face stimuli is used to evoke multiple ERP components, including P300, N170, and VPP. To verify the effectiveness of the proposed system, two online experiments are carried out: one controls a multifunctional humanoid robot, and the other controls four mobile robots. In both experiments, subjects complete the tasks effectively using the proposed interface, and the best completion time is relatively short and very close to that of manual operation.
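The EOG mode described above relies on threshold-based recognition of eye movements. As a minimal illustrative sketch (not the paper's actual implementation; the function name, threshold values, and synthetic signal below are all assumptions), a blink detector over a vertical EOG channel might flag amplitude excursions above a fixed level:

```python
import numpy as np

def detect_blinks(eog, fs, threshold=200.0, min_gap=0.2):
    """Detect blink events in a vertical EOG trace by amplitude thresholding.

    eog       : 1-D signal in microvolts
    fs        : sampling rate in Hz
    threshold : amplitude (uV) above which samples count as part of a blink
                (hypothetical value, chosen only for this demo)
    min_gap   : minimum separation in seconds between distinct blinks
    Returns the sample indices where each detected blink peaks.
    """
    above = eog > threshold
    gap = int(min_gap * fs)
    peaks, i, n = [], 0, len(eog)
    while i < n:
        if above[i]:
            j = i
            while j < n and above[j]:      # walk to the end of this excursion
                j += 1
            peaks.append(i + int(np.argmax(eog[i:j])))  # peak within excursion
            i = j + gap                     # enforce refractory gap
        else:
            i += 1
    return peaks

# Synthetic demo: 2 s of baseline noise with two blink-like Gaussian pulses
np.random.seed(0)
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
sig = 10 * np.random.randn(t.size)
for centre in (0.5, 1.4):                   # blink centres in seconds
    sig += 400 * np.exp(-((t - centre) ** 2) / (2 * 0.03 ** 2))

blinks = detect_blinks(sig, fs)
print(len(blinks))  # → 2
```

A full four-class recognizer (blink, wink, gaze, frown) would extend this idea with per-channel polarity and duration criteria; the single-channel amplitude test here only conveys the thresholding principle.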

Original language: English
Article number: 6954402
Pages (from-to): 876-889
Number of pages: 14
Journal: IEEE Transactions on Biomedical Engineering
Issue number: 3
Publication status: Published - 1 Mar 2015
Externally published: Yes
