Electrooculogram (EOG) signals are potential responses generated by eye movements, and the event-related potential (ERP) is a special electroencephalogram (EEG) pattern evoked by external stimuli. Both EOG and ERP have been used separately to implement human-machine interfaces that can assist disabled patients in performing daily tasks. In this paper, we present a novel EOG/ERP hybrid human-machine interface that integrates the traditional EOG and ERP interfaces. Eye movements such as blinks, winks, gazes, and frowns are detected from EOG signals using a double-threshold algorithm. Multiple ERP components, i.e., N170, VPP, and P300, are evoked by inverted-face stimuli and classified by linear discriminant analysis (LDA). Based on this hybrid interface, we also design a control scheme for the humanoid robot NAO (Aldebaran Robotics, Inc.). Online experimental results show that the proposed hybrid interface can effectively control the robot's basic movements and trigger various behaviors. While operating the robot manually by hand takes 49.1 s to complete the experimental sessions, the subject is able to finish them in 54.1 s using the proposed EOG/ERP interface.
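To make the double-threshold idea concrete, the following is a minimal sketch of how such a detector might look. The threshold values, the arming/firing state machine, and the function name are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def detect_eye_events(eog, low_thr, high_thr):
    """Detect eye-movement events with a two-threshold rule (illustrative).

    A candidate event is "armed" when the rectified EOG amplitude
    exceeds the low threshold, and confirmed when it then exceeds the
    high threshold. Falling back below the low threshold discards the
    candidate, which suppresses small drifts that a single threshold
    would misclassify as eye movements.
    """
    events = []
    armed = False  # low threshold crossed, waiting for high threshold
    for i, v in enumerate(np.abs(np.asarray(eog, dtype=float))):
        if not armed and v > low_thr:
            armed = True
        elif armed and v > high_thr:
            events.append(i)  # sample index of the confirmed event
            armed = False
        elif armed and v < low_thr:
            armed = False  # signal fell back: discard the candidate
    return events
```

For example, a ramped deflection such as `[0, 60, 120, 180, 0]` with thresholds 50 and 150 yields a single event at the sample where the high threshold is crossed.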
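The ERP classification step could be sketched with scikit-learn's `LinearDiscriminantAnalysis` on synthetic features; the feature construction below (random vectors standing in for epoch amplitudes around the N170/VPP/P300 latencies) is an assumption for illustration, not the paper's feature pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic "ERP feature" vectors: target epochs (stimulus attended,
# with larger component amplitudes) vs. non-target epochs. In practice
# the features would be down-sampled EEG amplitudes around the
# component latencies, averaged over repetitions.
targets = rng.normal(loc=1.0, scale=1.0, size=(100, 20))
nontargets = rng.normal(loc=0.0, scale=1.0, size=(100, 20))
X = np.vstack([targets, nontargets])
y = np.array([1] * 100 + [0] * 100)

# Fit a two-class LDA and score a new epoch; class 1 would be mapped
# to the command associated with the attended stimulus.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
new_epoch = rng.normal(loc=1.0, scale=1.0, size=(1, 20))
prediction = lda.predict(new_epoch)
```

LDA is a common choice for ERP-based interfaces because it is fast to train on the small number of calibration epochs typically available per session.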