An EEG-based Approach for Exoskeleton Control

Posted in Research

This was one of my undergraduate projects, funded by a College Undergraduate Research Scholarship from the National Science Council. I developed a program that used a commercial brain-computer interface (BCI) device to control both a LEGO MINDSTORMS robot and a powered upper-limb exoskeleton. The former task, control of the LEGO robot, served as a pilot experiment to test the algorithm.

System Architecture

The hardware setup consisted of a commercial BCI device (the Emotiv EPOC), a LEGO Mindstorms NXT robot, a powered upper-limb exoskeleton, and a personal computer (PC). The Emotiv EPOC neuroheadset, shown in Fig. 1 (a), was used to acquire EEG signals, and the data were transferred wirelessly to the PC. The neuroheadset was accompanied by a software development kit (SDK), while the LEGO Mindstorms NXT was used to build a two-wheeled robot for testing the program before applying it to the exoskeleton, as illustrated in Fig. 1 (b). In this study, the program was executed on the computer, and the NXT brick was connected to the PC wirelessly via Bluetooth.

The exoskeleton has two degrees of freedom, actuated by two electric drives. The axes are located at the elbow and shoulder joints, respectively, as shown in Fig. 1 (c). The exoskeleton is the property of the Biophotonics and Bioimaging Laboratory, Department of Bio-industrial Mechatronics Engineering, National Taiwan University.


Fig. 1 (a) The Emotiv EPOC neuroheadset. (b) The LEGO Mindstorms NXT robot used for the pilot experiment. (c) The two-degree-of-freedom upper-limb exoskeleton.

The software was developed in Visual Studio 2008 using C/C++. The dynamic-link libraries (DLLs) provided by Emotiv were used to analyze the EEG signals. The system architecture is illustrated in Fig. 2. The Emotiv EPOC streamed EEG data wirelessly for real-time analysis, and the result was then mapped to the action that the robot would execute. For the LEGO robot, the command was sent from the PC to the NXT via Bluetooth; for the exoskeleton, the command was sent over a physical wire using RS-232 signaling at a baud rate of 9600.
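As a concrete illustration of the exoskeleton side of this link, the sketch below shows how the serial port could be opened at 9600 baud with the Win32 API. The port name ("COM1") and the single-byte command format are assumptions made for the example, not details of the original program.

```cpp
// Minimal sketch of opening the RS-232 link to the exoskeleton controller
// at 9600 baud using the Win32 API. The port name and one-byte command
// format are assumptions for illustration only.
#include <windows.h>

HANDLE open_exo_port(const char* port_name)   // e.g. "COM1" (assumed)
{
    HANDLE h = CreateFileA(port_name, GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) return h;

    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;      // 9600 baud, as used in this project
    dcb.ByteSize = 8;             // 8-N-1 framing (assumed)
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(h, &dcb);
    return h;
}

// Hypothetical helper: send a single command byte to the exoskeleton.
bool send_exo_command(HANDLE h, unsigned char cmd)
{
    DWORD written = 0;
    return WriteFile(h, &cmd, 1, &written, NULL) && written == 1;
}
```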


Fig. 2 System architecture.

Methods

Prior to the experiment, the subject needed to establish a profile for specific mental states; this was accomplished with the SDK provided by Emotiv. During the training process, the subject was first told to remain neutral for 8 seconds, and then to focus on a specific event, such as push or turn right, for another 8 seconds. The recorded patterns were stored and could be recalled from the program. The SDK can store up to four different mental states in addition to the neutral state; in this project, however, two mental states were sufficient. The SDK also provides a rich set of application programming interfaces (APIs) for software development. The Emotiv API is exposed as an ANSI C interface, declared in 3 header files and implemented in 2 Windows DLLs. With the API functions provided, the EEG signals could be analyzed in real time. The analysis returns two values: the detected mental state and the consistency of that state, which was treated as a power level. The procedure is illustrated in Fig. 3, where state 1 and state 2 are the specific mental states that were trained.
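As a sketch of how this looks in code, the loop below polls the EmoEngine for state updates and reads the two outputs (the mental state and its power). It is written against my recollection of the legacy Emotiv SDK's ANSI C interface; the exact header, function, and constant names may differ between SDK versions and should be treated as assumptions.

```cpp
// Sketch of the real-time analysis loop, assuming the legacy Emotiv SDK's
// ANSI C interface. Names follow my recollection of that SDK and may
// differ between versions.
#include <windows.h>     // Sleep()
#include <iostream>
#include "edk.h"
#include "EmoStateDLL.h"
#include "edkErrorCode.h"

int main()
{
    if (EE_EngineConnect() != EDK_OK) {
        std::cerr << "Cannot connect to the Emotiv EmoEngine.\n";
        return 1;
    }

    EmoEngineEventHandle eEvent = EE_EmoEngineEventCreate();
    EmoStateHandle       eState = EE_EmoStateCreate();
    bool running = true;   // a real program would clear this on a quit condition

    while (running) {
        // Poll for a new event; EDK_OK means an event is available.
        if (EE_EngineGetNextEvent(eEvent) == EDK_OK &&
            EE_EmoEngineEventGetType(eEvent) == EE_EmoStateUpdated) {

            EE_EmoEngineEventGetEmoState(eEvent, eState);

            // The two outputs used in this project: the detected mental
            // state and its consistency ("power").
            EE_CognitivAction_t action = ES_CognitivGetCurrentAction(eState);
            float               power  = ES_CognitivGetCurrentActionPower(eState);

            std::cout << "state = " << action << ", power = " << power << "\n";
            // ...map (action, power) to a robot or exoskeleton command here...
        }
        Sleep(1);   // avoid busy-waiting
    }

    EE_EmoStateFree(eState);
    EE_EmoEngineEventFree(eEvent);
    EE_EngineDisconnect();
    return 0;
}
```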


Fig. 3 Block diagram of the analysis procedure.

For controlling the NXT robot, the returned mental states were used to select the robot's behavior. In addition to the neutral state, the subject was trained on two mental states: push and turn. If the mental state was neutral, the two motors of the robot would brake; if it was push, the robot would drive forward; and if it was turn, the robot would rotate clockwise. The motor speeds were determined by the consistency of the state. The control strategy for the exoskeleton was different from that of the NXT robot: because the exoskeleton was intended to assist rehabilitation, the motor drive speeds were held constant during the experiment and could be modified manually. The exoskeleton was actuated based on the consistency level returned by the API function, as sketched below.
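The sketch below illustrates this mapping. The command helpers (nxt_set_motors, exo_actuate) are hypothetical placeholders for the Bluetooth and RS-232 routines, and the COG_* constants follow the legacy Emotiv SDK's Cognitiv naming, which may differ by version.

```cpp
// Sketch of the state-to-behavior mapping described above. nxt_set_motors
// and exo_actuate are hypothetical placeholders for the Bluetooth (NXT)
// and RS-232 (exoskeleton) command routines.
#include "EmoStateDLL.h"   // EE_CognitivAction_t and COG_* constants (legacy SDK)

void nxt_set_motors(int left_power, int right_power, bool brake);   // hypothetical
void exo_actuate(EE_CognitivAction_t action, float power);          // hypothetical

void handle_mental_state(EE_CognitivAction_t action, float power)
{
    // NXT robot: motor speed scales with the consistency ("power") of the state.
    int speed = static_cast<int>(power * 100.0f);   // map 0..1 to 0..100 % power

    switch (action) {
    case COG_NEUTRAL:              // neutral -> brake both motors
        nxt_set_motors(0, 0, true);
        break;
    case COG_PUSH:                 // push -> drive forward
        nxt_set_motors(speed, speed, false);
        break;
    case COG_ROTATE_CLOCKWISE:     // turn -> rotate clockwise (constant name assumed)
        nxt_set_motors(speed, -speed, false);
        break;
    default:
        break;
    }

    // Exoskeleton: drive speeds are fixed; actuation depends only on the
    // consistency level reported by the API.
    exo_actuate(action, power);
}
```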

Video Demonstration

Here is a short video that demonstrates this work:

Related Article

  • T. H. Hsieh, "An EEG-based Approach for Exoskeleton Control," Final Report of College Undergraduate Research Project, 2011 [pdf]