
The Brain-Computer Interface: Using MATLAB and Simulink for Biosignal Acquisition and Processing

By Günter Edlinger and Christoph Guger, g.tec Medical Engineering GmbH


Beethoven was only 27 when he began to lose his hearing. While music historians assert that his deafness made him an even greater composer by isolating him and driving his creativity, this profound sensory loss brought a naturally sociable musician to the brink of despair.

What if Beethoven had been able to communicate directly with an electronic piano that could interpret his brain signals and record his compositions?

A brain-computer interface (BCI) provides an alternative communication channel between the human brain and a computer by using pattern recognition methods to convert brain waves into control signals. BCI researchers hope to improve quality of life for people who are paralyzed or sensory-impaired. Using improved measurement devices, computer power, and software, multidisciplinary research teams in medicine, psychophysiology, medical engineering, and information technology are investigating and realizing new noninvasive methods to monitor and even control human physical functions.

g.tec Medical Engineering GmbH, an Austrian company, has developed a real-time biosignal processing platform and several devices that support noninvasive and minimally invasive monitoring of eye movements and brain, heart, and muscle activity. These futuristic tools are built on a foundation that combines the multipurpose data processing functionality of MATLAB and the real-time system development capabilities of Simulink.

Real-Time Brain Signal Classification

A BCI must be flexible enough to adapt to the needs of individual patients and fast enough to execute in real time. The g.tec BCI, called g.BCIsys, builds on the rapid prototyping capabilities of MATLAB and Simulink to support rapid iteration and adaptation of software components, implementation of signal processing algorithms for online biosignal analysis and signal conditioning of a range of biomedical signals, and fast, accurate data acquisition. Online biosignal analysis enables researchers to compute, present, and store signal parameters during recording with minimal time delay.

g.BCIsys includes g.USBamp, a DSP-based biosignal acquisition system with 24-bit resolution that provides signal-conditioning functionality to amplify, filter, and convert electrode outputs into digital values. Its user-selectable, multichannel modules allow the simultaneous recording of electroencephalogram (EEG), electromyograph (EMG), electrooculogram (EOG), and electrocardiogram (ECG) data. Add-on g.BCIsys modules perform real-time analysis in conjunction with data acquisition. The BCI control signals can then be used for psychological and physiological experiments, and for rehabilitation engineering applications, such as orthotic and prosthetic device control. (See The Musical Brain Cap to learn about a musical BCI application.)

How the g.tec BCI Works

The g.tec BCI measures brain activity using data captured from specific regions of the head. For example, to measure an EEG, the researcher attaches small gold, silver, or silver chloride electrodes to the test subject's scalp. Researchers then use g.BCIsys to amplify the microvolt-level brain signals, perform the analog-to-digital conversion, and transfer the acquired EEG via a USB 2.0 interface to a PC or notebook for analysis. g.BCIsys includes modules, running in Simulink, that recognize and classify specific EEG patterns in real time or high-speed mode to convert the raw EEG data into control signals.

To simplify the implementation of different signal processing procedures and control strategies, the biosignal data acquired by g.USBamp is accessible in MATLAB through the Data Acquisition Toolbox (Figure 1) or in Simulink via an S-function block. Signal sampling rates range from 1 Hz up to 38 kHz, with digital high-pass and low-pass filtering, and support for signals ranging from microvolts up to +/-250 mV.

Figure 1. (Left) Data Acquisition Toolbox syntax used to communicate with the amplifier in MATLAB. (Right) The Softscope application used to analyze live amplifier data.
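
As a rough illustration of this kind of access, the sketch below uses the legacy Data Acquisition Toolbox analog input interface. The adaptor name 'gtec', the channel indices, and all settings are hypothetical placeholders rather than documented g.USBamp values; the syntax actually used with the amplifier is the one shown in Figure 1.

    % Hypothetical acquisition sketch (legacy Data Acquisition Toolbox syntax).
    % The adaptor name and settings are placeholders, not g.USBamp documentation.
    ai = analoginput('gtec');               % hypothetical vendor adaptor name
    addchannel(ai, 1:2);                    % two EEG channels, e.g., C3 and C4
    set(ai, 'SampleRate', 256);             % sampling rate in Hz
    set(ai, 'SamplesPerTrigger', 10*256);   % acquire 10 seconds of data
    start(ai);
    eeg = getdata(ai);                      % samples-by-channels matrix
    delete(ai);                             % release the device

The Softscope viewer shown on the right of Figure 1 can be opened on such an object with softscope(ai) to inspect the live signals.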

Thought Control: Moving a Cursor

A stimulation unit (g.STIMunit) extends g.BCIsys by presenting the classification result to the subject or by sending control commands to external devices, such as orthotic appliances. With g.STIMunit, a test subject can learn to move a cursor (represented by a horizontal bar) on a computer display by imagining hand or foot movements. This approach relies on the fact that imagining movements of different limbs causes distinct changes in oscillatory EEG activity over the sensorimotor areas of the cerebral cortex. These signal changes can be classified by weighting spectral parameters of different frequency bands at different electrode positions.

To control a cursor, the EEG is measured on the surface of the scalp via electrodes overlaying the central brain regions. If the test subject imagines a right-hand movement, there is a change in the EEG over the left hemisphere (measured at an electrode position called C3). If the subject imagines a left-hand movement, the EEG over the right hemisphere changes (at electrode position C4).

Using g.STIMunit, a test subject can practice changing these specific EEG patterns during a pretest training phase. g.STIMunit shows a horizontal bar on the screen. When the subject imagines a right-hand movement, causing a change in the pattern at electrode C3, the software extends the bar toward the right side of the screen; when the subject imagines a left-hand movement, causing a change at electrode C4, the software extends the bar toward the left.
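
For illustration only (this is not the g.STIMunit implementation), such a feedback bar can be sketched in MATLAB as a patch object driven by a control value between -1 (full left) and +1 (full right); the sinusoidal input below is just a stand-in for the classifier output.

    % Minimal feedback-bar sketch (illustrative only, not g.STIMunit code).
    figure('Color', 'w');
    axis([-1 1 0 2]);                               % bar can extend left or right
    hold on
    plot([0 0], [0 2], 'k');                        % midline of the display
    h = patch([0 0 0 0], [0.8 0.8 1.2 1.2], 'b');   % the bar, initially empty
    for k = 1:200
        c = 0.9*sin(2*pi*k/200);                    % stand-in for the classifier output
        set(h, 'XData', [0 c c 0]);                 % extend left (c < 0) or right (c > 0)
        drawnow
    end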

While the subject learns how to change the EEG signal through mental imagery, the BCI must also be trained to recognize the relevant changes in the electrode signals. The diagram in Figure 2 shows the BCI implementation for this experiment.

Figure 2. BCI design.

Signal Processing and Rapid Prototyping

To train the BCI, researchers must extract features from the EEG signals, typically by estimating the power distribution of the EEG in predefined frequency bands. g.BCIsys implements two feature extraction methods:

  • The Simulink BandPower block estimates the power in two frequency bands: the alpha band (8–13 Hz) and the beta band (16–24 Hz).
  • The Simulink AAR Parameters block estimates the parameters of an adaptive autoregressive model using the recursive least-squares method.
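
In plain MATLAB, the band-power idea behind the first approach can be sketched as a band-pass filter followed by squaring and smoothing. The Butterworth filter order, the 1-second averaging window, and the 256 Hz sampling rate below are assumptions, not the internals of the BandPower block (butter requires the Signal Processing Toolbox).

    % Band-power estimation sketch for one EEG channel (not the BandPower block).
    % eeg is a column vector of samples; fs is the sampling rate in Hz (assumed).
    fs = 256;
    [ba, aa] = butter(4, [ 8 13]/(fs/2));     % alpha band,  8-13 Hz
    [bb, ab] = butter(4, [16 24]/(fs/2));     % beta band,  16-24 Hz
    alpha = filter(ba, aa, eeg);              % band-pass filtering
    beta  = filter(bb, ab, eeg);
    win = ones(fs, 1)/fs;                     % 1-second moving-average window
    bp_alpha = filter(win, 1, alpha.^2);      % squared and smoothed = band power
    bp_beta  = filter(win, 1, beta.^2);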

In this example, the Simulink BandPower block performs online signal analysis to classify the EEG data in two distinct classes: RIGHT and LEFT. The BCI practice software then uses this classification to extend the horizontal bar (Figure 3). The direction in which the bar moves indicates the signal classification RIGHT or LEFT. The length of the bar indicates the reliability of the classification.
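
The article does not specify the classifier, so the following sketch uses Fisher's linear discriminant as one common way to turn band-power features into a RIGHT/LEFT decision; the feature layout and the clipping used to scale the bar length are assumptions.

    % Illustrative linear classification of band-power features (one possible
    % approach, not necessarily the g.BCIsys classifier). Each row of F is one
    % trial, e.g., [bp_C3_alpha bp_C3_beta bp_C4_alpha bp_C4_beta]; labels are
    % +1 for RIGHT imagery and -1 for LEFT imagery.
    Fr = F(labels == +1, :);
    Fl = F(labels == -1, :);
    mu_r = mean(Fr)';
    mu_l = mean(Fl)';
    Sw = cov(Fr) + cov(Fl);                   % pooled within-class scatter
    w  = Sw \ (mu_r - mu_l);                  % Fisher discriminant weights
    b  = -w' * (mu_r + mu_l) / 2;             % threshold midway between classes

    % Online use: the sign gives the direction, the magnitude the bar length.
    score     = fnew * w + b;                 % fnew is a 1-by-4 feature vector
    direction = sign(score);                  % +1 = RIGHT, -1 = LEFT
    barlength = min(abs(score), 1);           % crude clipping for display only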

Figure 3. On the left, the BandPower feature-extraction block provides a display of the extracted features used for classification. Below it, two EEG traces record the left and right brain hemispheres along with timing information. The remaining panels display the alpha and beta frequency band analysis.

BCI Accuracy

The BCI was field-tested on 300 subjects ranging in age from 15 to 65 years. The subjects trained the computer and then tried to control the cursor bar on the computer screen (Figure 4). Each subject repeated the movement imagery 40 times before the error rate was computed. Nearly 93% of the subjects controlled the BCI with about 60% accuracy, while 7% operated it with more than 90% accuracy.

Laboratory and hospital research scientists worldwide use the g.tec biosignal processing platform for biosignal acquisition and processing. The software and hardware meet stringent medical requirements, and the flexible, customizable development platform makes it useful for many research and development applications. 

Figure 4. A human thought causes the horizontal cursor bar to move to the left. The EEG traces and a trigger channel are displayed at top right, with the BCI model running the feedback paradigm in the lower right corner.

The Musical Brain Cap


A musical composer himself, Professor Eduardo Reck Miranda from the University of Plymouth in the UK is developing a brain-computer music interface (BCMI) that composes and performs musical pieces on an automated piano using information extracted from specific altered brain signals.

The musical brain cap uses the g.tec BCI to extract EEG features from test subjects. Band power from different frequency bands controls a generative system that composes music on the fly based on rules of musical grammar. The Simulink Hjorth block extracts EEG signal complexity and uses it to control the tempo of the musical piece (the more complex the signal, the faster the music) (Figure A).
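
Hjorth's activity, mobility, and complexity measures have a simple closed form, sketched below in plain MATLAB; the mapping from complexity to tempo at the end is purely illustrative and is not the scaling used in the BCMI.

    % Hjorth parameters of an EEG segment x (column vector); derivatives are
    % approximated by first differences.
    dx  = diff(x);                            % first derivative
    ddx = diff(dx);                           % second derivative
    activity   = var(x);
    mobility   = sqrt(var(dx) / var(x));
    complexity = sqrt(var(ddx) / var(dx)) / mobility;

    % Illustrative only: higher complexity maps to a faster tempo.
    tempo_bpm = 60 + 60 * min(complexity, 2) / 2;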

The generative system composes sequences comprising short sections of music, generated using one of four grammatical rules selected based on the subject's mental activity. For example, if the subject is in a state of deep relaxation, rule 1 is selected; if the subject is in relaxed wakefulness, rule 2 is selected; and so on (Figure B).
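
As a purely hypothetical sketch of such a selection (the actual band-to-rule mapping of Figure B is not reproduced in the article), the dominant relative band power could index one of the four rules:

    % Hypothetical rule selection from the band-power distribution.
    % The pairing of bands with mental states is an assumption for illustration.
    bp = [bp_theta bp_alpha bp_beta_low bp_beta_high];   % assumed frequency bands
    [mx, state] = max(bp / sum(bp));                     % dominant relative band power
    rule = state;     % e.g., rule 1 = deep relaxation, rule 2 = relaxed wakefulness, ...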

The musical generative system output is then connected via a MIDI interface to a mechanical piano that plays the composed musical piece.

Figure A. Simulink model displaying a raw EEG signal and Hjorth parameters estimating signal activity, mobility, and complexity.
Figure B. Selection of musical grammatical rules based on the band-power distribution in different frequency bands.

Published 2006
