IPSIHAND BRAVO: An Improved EEG-based Brain-Computer Interface for Hand Motor Control Rehabilitation [Washington University in St. Louis]

Mark Wronkiewicz, Charles Damian Holmes, Jenny Liu, Elizabeth Russell, Colleen Rhoades, Jason Dunkley, Thane Somers


Stroke and other nervous system injuries can damage or destroy hand motor control and greatly disrupt daily activities. Brain-computer interfaces (BCIs) represent an emerging technology that bypasses damaged nerves to restore basic motor function and provide more effective rehabilitation. Existing BCI systems utilize signals from the contralateral motor cortex; however, this region is often damaged by stroke. Therefore, the wireless IpsiHand Bravo BCI system was implemented to provide an alternative method of stroke rehabilitation, using ipsilateral electroencephalographic (EEG) signals to move a hand orthosis in real time. The system features a machine learning signal processing algorithm and a customizable orthosis fitted to the patient. IpsiHand was designed using commercially available components, with cost and the possibility of independent use (outside healthcare facilities) in mind, so that it may reach a large demographic. The device immediately restores basic hand functionality and can also induce neural plasticity over time so that the wearer relearns how to independently control his or her hand.



Introduction
Every year in the United States, nearly three quarters of a million people suffer a stroke [1]. Along with other forms of damage to the central nervous system, e.g. traumatic brain injury, spinal cord injury, and neurodegenerative disease, stroke can cause a devastating loss of functional motor control. Hand impairment is often a lasting result of these conditions, and the approximately 900,000 people in the U.S. suffering from severe grasping impairment face difficulty performing everyday tasks [2].

After a stroke or other form of nerve damage, there is a three-month window in which rehabilitation efforts are most effective [3]. However, the average stroke rehabilitation patient spends fewer than five hours at the clinic each week [4], despite research showing that rehabilitation outcomes are time dose-dependent [5]. Furthermore, conventional therapeutic techniques require that the patient exhibit some residual movement post-injury as a starting point for rehabilitation, and they entail frequent visits to the therapist's office.

A technology called the brain-computer interface (BCI) offers hope of motor rehabilitation for patients with many forms of central or peripheral nervous system damage, without the limitations of conventional therapy. BCIs work by processing brain activity, and they can be used to control a computer or devices interfacing with that computer, thereby bypassing the damaged or destroyed biological nerve pathway.

There are three major methods of obtaining voltage signals from the brain: electroencephalography (EEG, scalp surface recording), electrocorticography (ECoG, brain surface recording), and single-unit recording (intra-brain recording). Because scalp voltage recordings are minimally invasive, inexpensive, and carry little risk of infection, IpsiHand Bravo uses EEG for signal acquisition [6].



Methods
A graphical representation of the system overview can be seen in Fig. 1.


Figure 1 - IpsiHand Bravo System Overview. In summary, an EEG headset acquires signals from the subject's brain. That signal, transmitted wirelessly to a computer, trains a machine learning system. This machine learning system processes brain signals to estimate the subject's intention to open or close his or her hand. This intention is used to control the position of a worn motorized exoskeletal orthosis via a Bluetooth-connected Arduino unit.


Signal Acquisition Hardware

EEG voltage signals were acquired from the subject's scalp using a commercial, dry-electrode Emotiv EPOC (Emotiv; Australia) headset. The headset amplifies, band-pass filters, and digitizes the signal at 128 Hz. This signal is transmitted wirelessly to a laptop for processing.
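As a rough illustration of this kind of on-board conditioning, the sketch below band-passes a synthetic 128 Hz channel with a simple FFT filter. The 1-45 Hz pass band and the filter design are illustrative assumptions, not the EPOC's documented specification.

```python
import numpy as np

FS = 128  # EPOC sampling rate, samples per second

def fft_bandpass(x, low=1.0, high=45.0, fs=FS):
    """Crude FFT band-pass: zero every frequency bin outside [low, high] Hz.
    The 1-45 Hz pass band is an illustrative assumption, not the EPOC's
    documented on-board filter specification."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    spec[(freqs < low) | (freqs > high)] = 0
    return np.fft.irfft(spec, n=len(x))

# Synthetic 2 s channel: a 10 Hz mu-band component plus 60 Hz line noise
t = np.arange(0, 2, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
clean = fft_bandpass(raw)
```

A real implementation would use a causal IIR or FIR filter rather than whole-record FFT masking, but the effect on in-band and out-of-band components is the same.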

Emotiv electrodes are located over 10-20 international system positions AF3, F3, F7, FC5, T7, P7, O1, O2, P8, T8, FC6, F8, F4, and AF4.

Figure 2 - Spatial Location of Electrodes for the Emotiv EPOC Headset. The EPOC has 14 electrodes located over 10-20 international system positions as shown here. The EPOC also has 2 reference electrodes positioned behind each ear.


Machine Learning Software

IpsiHand Bravo makes use of the Eos Machine Learning Software Suite, developed specifically for IpsiHand. The system overview of the Eos software is visually represented in Fig. 3 and fully explained in the following subsections.


Figure 3 - System Overview of Eos Machine Learning Software. This software suite contains three distinct subunits: 1) a signal recording subunit, 2) a training subunit, and 3) a real-time subunit. Subunits 1 and 2 form a one-time calibration step, and the third subunit provides orthosis control.


Eos: Signal Recording

The signal recording subunit produces a data file containing the raw data collected during each of 90 trials. During a trial, the user is presented with one of two prompts, "move" or "pause", to solicit a response. While the user responds, the EEG signals from the EPOC electrodes are recorded and associated with the prompt. During the "move" prompt, the user moves, or imagines moving, his or her hand in a specific way. During the "pause" prompt, the user remains motionless and is asked not to imagine hand motion.
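The recording protocol above can be sketched as follows. The balanced, randomized ordering of the 90 prompts and the `acquire_epoch` placeholder are assumptions for illustration; the paper does not specify the trial ordering or the acquisition interface.

```python
import random

N_TRIALS = 90
PROMPTS = ("move", "pause")

def make_trial_schedule(n_trials=N_TRIALS, seed=0):
    """Build a randomized, balanced sequence of the two prompts.
    Balanced randomization is an assumption; the paper does not state
    how the 90 trials were ordered."""
    schedule = [PROMPTS[i % len(PROMPTS)] for i in range(n_trials)]
    random.Random(seed).shuffle(schedule)
    return schedule

def record_session(acquire_epoch, schedule):
    """Associate each prompt with the EEG epoch recorded while it was
    shown. `acquire_epoch` is a hypothetical stand-in for reading one
    trial's worth of samples from the EPOC."""
    return [(prompt, acquire_epoch(prompt)) for prompt in schedule]
```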


Eos: Training

The training subunit reads in the signal file. Frequency analysis is carried out to produce numerous features for each electrode and frequency. Using each trial's feature set, feature extraction is performed with Kernel Principal Component Analysis (KPCA) [8]. The extracted features capture more variance per feature than the original feature set, giving them additional discriminative potential. The extracted feature set is then used to produce a Support Vector Machine (SVM) model with the software library LibSVM [9]. By employing an iterative optimization method, an ideal set of function parameters, which best estimates the user's hand motion, is written to a parameters file.
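A minimal sketch of this training pipeline is shown below, using scikit-learn's `KernelPCA` and `SVC` (the latter wraps LibSVM) in place of the Eos implementation. The band-power features, frequency bands, and model hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC  # scikit-learn's SVC wraps LibSVM

FS = 128  # EPOC sampling rate (Hz)

def band_power_features(epochs, bands=((8, 13), (13, 30))):
    """Mean spectral power per channel in each band (here: mu and beta).
    The exact features Eos derives per electrode and frequency are not
    given in the paper; these bands are illustrative assumptions."""
    spec = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(epochs.shape[-1], 1.0 / FS)
    feats = [spec[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in bands]
    return np.concatenate(feats, axis=-1).reshape(len(epochs), -1)

# KPCA for feature extraction followed by an SVM, as in the Eos pipeline
model = make_pipeline(KernelPCA(n_components=4, kernel="rbf"),
                      SVC(kernel="rbf", C=1.0))
```

Given labeled "move"/"pause" epochs, `model.fit(band_power_features(epochs), labels)` produces the trained classifier whose parameters Eos would write to its parameters file.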


Eos: Real-Time

Using the determined parameters, the real-time subunit processes the most recent one second of EEG signal with spectral analysis, KPCA, and the SVM. Every 150 ms, a position value between 0 and 999 (corresponding to the hand completely open or completely closed, respectively) is determined from the desired position calculated by the machine learning algorithm.
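The real-time iteration can be sketched as follows. The `classify` argument stands in for the full spectral-analysis + KPCA + SVM stage, and the linear score-to-position mapping is an illustrative assumption.

```python
from collections import deque

FS = 128              # samples per second
WINDOW = FS           # the most recent one second of signal
STEP_SECONDS = 0.150  # a new position is produced every 150 ms

def to_position(p_closed):
    """Map a closed-hand score in [0, 1] to the 0-999 position range
    (0 = completely open, 999 = completely closed). The linear mapping
    is an illustrative assumption."""
    return min(999, max(0, int(round(p_closed * 999))))

def control_step(buffer, classify):
    """One real-time iteration: classify the latest one-second window.
    `classify` stands in for the spectral analysis + KPCA + SVM stage."""
    window = list(buffer)[-WINDOW:]
    return to_position(classify(window))
```

In operation, new samples are appended to the buffer as they arrive and `control_step` runs every 150 ms, so successive windows overlap heavily and the orthosis position updates smoothly.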


Arduino Bluetooth Controller

The control signal from the Eos software is sent wirelessly via Bluetooth to an Arduino Pro Mini (SparkFun Electronics; Boulder, CO) worn by the user. The desired hand position is then mechanically realized by forwarding this control signal to linear actuators attached to the exoskeleton hand orthosis.
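The link between Eos and the Arduino can be sketched as below, written in Python for consistency with the other examples (the Arduino side would run equivalent C). The 4-byte ASCII frame format is purely an illustrative assumption; the paper does not specify the wire protocol.

```python
def encode_position(position):
    """Frame a 0-999 hand position for the Bluetooth serial link to the
    Arduino. The 'P' + zero-padded-decimal framing is a hypothetical
    format chosen for illustration only."""
    if not 0 <= position <= 999:
        raise ValueError("position must be in 0-999")
    return b"P%03d" % position

def decode_position(frame):
    """Arduino-side parse of the same frame (sketched here in Python)."""
    if len(frame) != 4 or frame[:1] != b"P":
        raise ValueError("bad frame")
    return int(frame[1:])
```

A fixed-length, self-delimiting frame like this keeps the microcontroller parser trivial and tolerant of dropped bytes on the wireless link.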


Exoskeletal Hand Orthosis

The low-profile hand orthosis system, known as ExoFlex, is a custom-designed hand exoskeleton for mechanical manipulation of the wearer's fingers. It was designed with wearability, ease of attachment, and customizability in mind. See Fig. 4 and Fig. 5 for a computer-aided design model and technical description. Because the ExoFlex is powered by a lithium-polymer battery, the ExoFlex and Arduino can be worn and operated without a power cord or restraining cables for over two hours.

Four independent finger-splints made from 3D-printed joints allow finger flexion and extension controlled by pulling Bowden cables.

Figure 4 - Front View of ExoFlex. From the distal fingertip toward the hand, the mechanical component consists of finger caps that connect to finger chains. These chains run along each finger and all connect to a central hub on the dorsal side of the hand. At the most proximal link of each chain, two thin wires are inserted into the finger chain through two holes. These holes continue through all links, with one wire positioned over the axis of rotation of each link and one under. By applying tension to either the top or bottom wire, the individual exoskeletal chains flex or extend to manipulate the wearer's fingers. The wires are tensioned by a set of linear actuators (Firgelli Linear Motion Series L16) worn on the arm and controlled by the Arduino. Tensional force is transmitted from the linear actuators to the chains with Bowden cables (a compressionless outer sheath with a force-transmitting wire contained inside), just like a bicycle brake system.


Figure 5 - Rear View of ExoFlex. Springs attached to the finger chain links prevent slack from forming between the device and the wearer's fingers.



Results
Eos Machine Learning Software

The performance of Eos was analyzed through 10-fold cross-validation. This method trains the system ten times, each time using a different 90% of the data from the signal file. Average accuracies for each subject can be seen in Table 1.

Mean accuracy of the Eos Machine Learning Software over 5 subjects was 93.6% with range 90% to 98%.

Table 1: Accuracy of the Eos Machine Learning Software. For each iteration, accuracy is determined by testing the trained system with the remaining 10% of the data. A mean accuracy was calculated from these 10 tests.
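The evaluation procedure above can be sketched as follows; `fit` and `predict` are placeholders for the Eos training and real-time subunits.

```python
import numpy as np

def ten_fold_accuracy(X, y, fit, predict, n_folds=10, seed=0):
    """10-fold cross-validation as used to evaluate Eos: train on 90% of
    the recorded trials, test on the held-out 10%, and average the ten
    resulting fold accuracies."""
    idx = np.random.default_rng(seed).permutation(len(X))
    accuracies = []
    for fold in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, fold)  # the other 90% of trials
        model = fit(X[train], y[train])
        accuracies.append(np.mean(predict(model, X[fold]) == y[fold]))
    return float(np.mean(accuracies))
```

Because every trial serves as test data exactly once, the averaged accuracy estimates how the trained system will perform on EEG windows it has never seen.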



The ExoFlex component of IpsiHand was physically implemented from a CAD model by ordering a 3D-printed version of the model (composed of Nylon 12 plastic polymer) from Shapeways (New York, NY). 3D-printing every component that contacts the wearer's hand enables easy customization of the orthosis's dimensions to the intended user.



Figure 6 - ExoFlex 3D-printed device (two finger chains shown). The device attaches to the wearer's hand via a standard wrist brace.


The chain-link design of this orthosis prevents hyperextension: each link rotates from neutral to 45 degrees of flexion, allowing a natural range of motion during use. The possibility of in-home use, size (and color) customization, low profile, and wearability make it a very attractive orthosis system.


Long-term Therapy

At the time of this paper's submission, an initial case study involving an individual stroke patient using the IpsiHand system had commenced. While the patient appeared to acquire reasonable control using our system, the results of this long-term study will give insight into the rehabilitation potential associated with extended use of the IpsiHand system.



Discussion
The IpsiHand Bravo design offers hope of rehabilitation to those who suffer brain injury. Preliminary analyses show that greater than 90 percent movement classification accuracy is achievable with this system.

As mentioned previously, IpsiHand Bravo is currently undergoing a medical case study at Washington University. The hope is that this single case study will lead not only to more study participants, but also to the adoption of IpsiHand as a common therapy for stroke and brain trauma.

Performance aside, IpsiHand will also prove an attractive therapy option because of its low cost. As a comparison, most medical EEG systems cost more than $10,000 and require 30 to 60 minutes to set up. This system is designed for a 5- to 15-minute setup, allowing more time for therapy in place of setup. If produced in volume, we estimate a cost of around $1,100 per unit (Table 2), which even at retail price would be significantly cheaper than alternative devices.


Total production cost of the Ipsihand Bravo is $1076.65.

Table 2: Ipsihand Bravo production cost


Future development of IpsiHand will aim at moving the signal processing and Eos Machine Learning Software to an on-board microcomputer to increase portability and ease of use. Improving system accuracy and training speed will also maximize the time available for patient therapy.



Acknowledgments
We would like to especially thank Dr. Eric Leuthardt, our faculty mentor, as well as David Bundy and Elad Gilboa, our graduate student mentors, for their guidance. This work is supported in part by the Washington University School of Engineering and Emotiv Systems.



References
[1] D. Broetz et al., “Combination of brain-computer interface training and goal-directed physical therapy in chronic stroke: A case report,” Neurorehabilitation and Neural Repair, pp. 674-679, 2010.
[2] V. Roger et al., “Heart Disease and Stroke Statistics - 2011 Update: A Report From the American Heart Association,” Circulation, 2011.
[3] H. Jorgensen et al., “Outcome and time course of recovery in stroke: Part II: Time course of recovery. The Copenhagen Stroke Study,” Archives of Physical Medicine and Rehabilitation, pp. 406-412, 1995.
[4] L. Oujamaa, I. Relave, J. Froger, D. Mottet and J. Pelissier, “Rehabilitation of Arm Function After Stroke,” Annals of Physical and Rehabilitation Medicine, vol. 52, no. 3, pp. 269-293, 2009.
[5] C. Takahashi et al., “Robot-based hand motor therapy after stroke,” Brain, pp. 425-437, 2008.
[6] E. Leuthardt, G. Schalk, D. Moran and J. Ojemann, “The emerging world of motor neuroprosthetics: a neurosurgical perspective,” Neurosurgery, vol. 58, no. 1, pp. 1-14, 2006.
[7] K. J. Wisneski et al., “Unique cortical physiology associated with ipsilateral hand movements and neuroprosthetic implications,” Stroke, pp. 3351-3359, 2009.
[8] J. d. R. Millán et al., “Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges,” Frontiers in Neuroscience, p. 161, 2010.
[9] H. I. Krebs et al., “A Paradigm-Shift: Rehabilitation Robotics,” IEEE Engineering in Medicine and Biology, vol. 7, 2008.


Mark Wronkiewicz   mdw4@cec.wustl.edu
