Live&Learn

GUIDELINE TO DEVELOP BRAIN COMPUTER INTERFACE

Introduction

History of BCI Technology

The history of the brain-computer interface (BCI) begins in 1924 (Brain Computer Interface Wikipedia, 2013), when the German neurologist Hans Berger (Hans Berger Wikipedia, 2013) succeeded in recording human brain activity by means of electroencephalography (EEG). Brain cells communicate with each other by producing tiny electrical signals called impulses.

(Hans Berger Wikipedia, Robot Suit HAL: Cyberdyne Inc)

A brain-computer interface is a direct interface between brain activity and an external device. In the first generation of EEG technology, EEG systems were used in the medical field, for example to help diagnose health conditions and to help paralyzed people carry out daily activities.

In the past few years (as of 2013), several companies have developed low-cost BCI systems. They have turned BCI technology into toys and gaming devices such as the Necomimi, NeuroSky, MindFlex and Emotiv neuroheadsets.

(MindFlex neuro toy, Necomimi Cat ear)

Problem Statement

Even though EEG devices are cheaper than in the early 2000s, developing a BCI application is still very hard, because the developer needs a lot of knowledge to understand the meaning of brain waves and to choose a method to classify them. This report therefore serves as a guideline to the basics of developing a BCI application.

Objective

• To understand the architecture of a BCI system.
• To understand the meaning of brain waves.
• To understand the meaning of the 14 channels of the Emotiv EEG receiver.
• To understand how to develop a BCI application on a PC with the Emotiv EEG device.
• To understand modern classification algorithms that can be applied to brain waves.
• To review the methodology of a successful BCI project.

LITERATURE REVIEW

Normal EEG Waveforms

The normal electroencephalogram (EEG) (Electroencephalography Wikipedia, 2013) of childhood differs from that of adulthood. In general, the adult EEG has faster frequency oscillations than the childhood EEG. When a person attempts to move, nerve signals are sent from the brain to the muscles via motor neurons, and very weak electrical signals can be detected on the surface of the skin. EEG rhythms fall in the range of about 1–100 Hz, and each frequency range, such as delta, theta and alpha, gives information about the state of the brain. There are five bands of normal EEG rhythms: delta, theta, alpha, beta and gamma (a small band-lookup code sketch follows the list below).

1. Delta wave (up to 4 Hz): found frontally in adults during deep sleep. It is also seen normally in babies.

(Delta wave)

2. Theta wave (4–7 Hz): found in locations not related to the task at hand, during drowsiness or arousal in older children and adults. Theta is seen normally in young children and can also be seen during meditation.

(Theta wave)

3. Alpha wave (7–14 Hz): found in posterior regions of the head and at the center of the head when the person is relaxed, reflecting or closing the eyes.

(Alpha wave)

4. Beta wave (14–30 Hz): found on both sides of the head during active, busy or anxious thinking and active movement.

(Beta wave)

5. Gamma wave (30–100 Hz): found during peak concentration and extremely high levels of cognitive functioning.

(Gamma wave)
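As a quick illustration of these bands, the following C# sketch maps a dominant frequency in Hz to its band name. The function name and the exact boundary handling are my own choices for illustration; the ranges simply restate the list above.

string GetEegBand(double frequencyHz)
{
    // Band boundaries restate the ranges listed above (illustrative only).
    if (frequencyHz < 4) return "Delta";
    if (frequencyHz < 7) return "Theta";
    if (frequencyHz < 14) return "Alpha";
    if (frequencyHz < 30) return "Beta";
    if (frequencyHz <= 100) return "Gamma";
    return "Outside the normal EEG range";
}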

Architecture of BCI systems

There are three major physical components in a BCI system (R. Scherer, G.R. Müller-Putz, and G. Pfurtscheller, 2009); a small code sketch of this pipeline follows the diagram below.

1. Signal Acquisition: converts the electrode signals into digital numeric values that can be manipulated by a computer.
2. Signal Processing: analyzes and classifies the EEG data.
o Data Pre-processing: prepares raw data for further processing.
o Feature Extraction: selects useful features for training and classification.
o Data Classification: translates the features into useful information, such as a computer command.
3. Device Receiver: responds to the command from the Signal Processing stage.

(BCI system diagram)
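To make these three components concrete in code, here is a minimal C# sketch of the pipeline. The interface and class names (ISignalAcquisition, ISignalProcessor, IDeviceReceiver, BciSystem) are my own illustration of the architecture above, not types from the Emotiv SDK or any other library.

// Illustrative pipeline model; type names are assumptions, not SDK types.
interface ISignalAcquisition { double[][] ReadSamples(); }        // electrode signals -> digital values
interface ISignalProcessor   { string Classify(double[][] raw); } // pre-process, extract features, classify
interface IDeviceReceiver    { void Execute(string command); }    // respond to the classified command

class BciSystem
{
    readonly ISignalAcquisition _acquisition;
    readonly ISignalProcessor _processor;
    readonly IDeviceReceiver _device;

    public BciSystem(ISignalAcquisition a, ISignalProcessor p, IDeviceReceiver d)
    {
        _acquisition = a;
        _processor = p;
        _device = d;
    }

    // One cycle: acquire EEG samples, classify them, and forward the command.
    public void RunOnce()
    {
        double[][] raw = _acquisition.ReadSamples();
        string command = _processor.Classify(raw);
        _device.Execute(command);
    }
}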

For the Signal Acquisition stage, there are two types of EEG headsets and head caps (Michael Adelson, 2011). First, single-electrode headsets, such as the NeuroSky MindWave (NeuroSky, 2013), are very simple and inexpensive, but the data from a single electrode are not enough to classify complex commands. Second, multiple-electrode headsets, such as the Emotiv EPOC (Emotiv, 2013) and standard EEG head caps, provide richer data at a higher cost.

(Neurosky Mindwave, Standard EEG head cap:BioSemi)

(EEG headset/head caps comparison)

Finally, the Emotiv EEG neuroheadset was chosen as the signal acquisition device because its cost is not too high, it has multiple electrodes (14 channels) and it provides a software development kit (SDK) for developing BCI applications on a PC. This neuroheadset has 14 electrodes located at 10-20 international system positions (10-20 system (EEG) Wikipedia, 2013): AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8 and AF4.

(Emotiv EEG neuroheadset sensor location)

For the Signal Processor and Device Receiver, I currently use my laptop in both roles. The Emotiv EEG neuroheadset comes with a wireless USB receiver that plugs into the PC to receive EEG data from the neuroheadset. The section "How to develop a BCI application on a PC with the Emotiv EEG neuroheadset" describes how to build such an application.

Brain function at each Emotiv EEG sensor

Quantitative electroencephalography (QEEG) (Dr. Horst H, 2013) is a brain imaging technique that allows us to understand an individual's electrical brain activity and brain function. This research gives us useful information for understanding the meaning of each Emotiv EEG sensor. The mapping below can also be kept in code as a simple lookup table (see the sketch after the list).

AF3: Attention
AF4: Judgment
F3 : Motor planning
F4 : Motor planning (left upper limb)
F7 : Verbal expression
F8 : Emotional expression (anger, happiness)
FC5: Right body control
FC6: Left body control
T7 : Verbal memory
T8 : Emotional memory
P7 : Verbal understanding
P8 : Emotional understanding, motivation
O1 : Visual processing
O2 : Visual processing
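For use in code, this mapping can be kept as a simple lookup table. The dictionary below (assuming System.Collections.Generic is imported) only restates the list above; the variable name is my own.

// Lookup of brain function per Emotiv sensor, restating the list above.
var sensorFunction = new Dictionary<string, string>
{
    { "AF3", "Attention" },             { "AF4", "Judgment" },
    { "F3",  "Motor planning" },        { "F4",  "Motor planning (left upper limb)" },
    { "F7",  "Verbal expression" },     { "F8",  "Emotional expression" },
    { "FC5", "Right body control" },    { "FC6", "Left body control" },
    { "T7",  "Verbal memory" },         { "T8",  "Emotional memory" },
    { "P7",  "Verbal understanding" },  { "P8",  "Emotional understanding, motivation" },
    { "O1",  "Visual processing" },     { "O2",  "Visual processing" }
};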

Simple EEG Classification

Detecting the closed-eye brain wave pattern

The function of the human brain at the O1 and O2 sensors is visual processing (Michael Adelson, 2011). When the eyes are closed, the O1 and O2 sensors receive a different pattern of brain waves, so we can detect eye closure by looking for this pattern.

(Brain wave during close eye at O1 and O2 sensor)

Similarly, during an eye blink the AF3 and AF4 sensors receive a distinctive pattern of brain waves (Michael Adelson, 2011), so we can detect blinks by using this pattern. A naive threshold-based sketch of such a detector is shown after the figure below.

(Brain wave during blink at AF3 and AF4 sensor)
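A very naive way to exploit these patterns is a threshold test on the buffered samples: eye closure typically raises the rhythmic (alpha) activity at O1 and O2, and blinks produce large deflections at AF3 and AF4. The sketch below (assuming System and System.Linq are imported) checks one buffer of O1/O2 samples against a hand-tuned threshold. The threshold value and method names are assumptions for illustration; a real detector would use band-power features and a trained classifier.

// Average absolute deviation of one channel buffer from its mean.
static double Deviation(double[] channel)
{
    double mean = channel.Average();
    return channel.Select(v => Math.Abs(v - mean)).Average();
}

// Naive eye-closure check over one buffer of O1/O2 samples.
// The threshold is a hand-tuned, illustrative value, not a universal constant.
static bool LooksLikeEyesClosed(double[] o1, double[] o2, double threshold)
{
    return Deviation(o1) > threshold && Deviation(o2) > threshold;
}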

How to develop a BCI application on a PC with the Emotiv EEG neuroheadset

The Emotiv SDK is a suite of development tools that helps developers create BCI applications on PC and Mac. It includes the following:
• Emotiv Control Panel: a program for setting up and testing the headset.
• EmoEngine (EDK.dll): a C, C++ and C# library for processing EEG data from the neuroheadset.

System Requirements

These are the minimum hardware and software requirements for developing BCI applications:

• Microsoft Windows XP with Service Pack 2, Windows Vista, or Windows 7/8
• 2.4 GHz Intel Pentium 4 processor (or higher)
• 1 GB RAM
• One USB 2.0 port
• Visual Studio 2010 (or higher)

The Emotiv SDK provides libraries for developing BCI applications in C, C++ and C#. I chose C# to develop the BCI application because I am most familiar with it.

Setup Environments

1. Install the Emotiv SDK and Visual Studio 2010 (or higher) on the PC.
2. Prepare the Emotiv EEG neuroheadset.

The Emotiv EEG neuroheadset requires the user to wet its foam-tipped sensors with contact lens (saline) solution before use, and then insert each sensor into the black plastic headset arms.

(Prepare Emotiv EEG neuroheadset)

3. Pair the neuroheadset with the USB receiver, and then wear it in the right position.

(Pairing and wearing the Emotiv EEG neuroheadset)

Testing the neuroheadset signal strength with Emotiv Control Panel

After connecting the Emotiv neuroheadset to the USB receiver, open the Emotiv Control Panel to check the signal strength. If any sensor's signal is not good enough, add a drop of contact lens solution to that sensor until the signal strength improves. There are five colors of signal quality (restated as a small code sketch after the list):

1. Black: No signal
2. Red: Very poor signal
3. Orange: Poor signal

4. Yellow: Fair signal
5. Green: Good signal
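For logging or displaying the contact quality in your own application, this color scale can be restated in code. The enum and helper below (assuming System is imported) are purely illustrative and are not types from the Emotiv SDK.

// Illustrative signal-quality scale matching the Control Panel colors
// (not an Emotiv SDK type).
enum SignalQuality { NoSignal, VeryPoor, Poor, Fair, Good }

static SignalQuality FromColor(string color)
{
    switch (color)
    {
        case "Black":  return SignalQuality.NoSignal;
        case "Red":    return SignalQuality.VeryPoor;
        case "Orange": return SignalQuality.Poor;
        case "Yellow": return SignalQuality.Fair;
        case "Green":  return SignalQuality.Good;
        default: throw new ArgumentException("Unknown color: " + color);
    }
}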

(Emotiv Control Panel )

Tutorial: how to create a BCI application on a PC with the Emotiv EEG neuroheadset

1. Create a Windows Forms application project in Visual Studio (C#).
2. Add a reference to DotNetEmotivSDK and import the Emotiv namespace:

using Emotiv;

3. Add an initialization method for the Emotiv engine to the form class. The engine and userID fields used throughout the tutorial are declared at the top of the snippet.

// Fields on the form: the EmoEngine instance and the current user ID.
EmoEngine engine;
int userID = -1;

void EEG_Starter()
{
    // Create the engine instance.
    engine = EmoEngine.Instance;
    // Handle the event raised when a headset/user is connected.
    engine.UserAdded += new EmoEngine.UserAddedEventHandler(engine_UserAdded_Event);
    // Connect to EmoEngine.
    engine.Connect();
}

4. Add an event handler for when the user connects the Emotiv headset to the PC.

void engine_UserAdded_Event(object sender, EmoEngineEventArgs e)
{
    // Record the user ID.
    userID = (int)e.userId;

    // Enable data acquisition for this user.
    engine.DataAcquisitionEnable((uint)userID, true);

    // Ask for up to 1 second of buffered data.
    engine.EE_DataSetBufferSizeInSec(1);
}

5. Set up a Timer and add code that retrieves data from the EEG device for processing.

void Run()
{
    // Handle any waiting events.
    engine.ProcessEvents();
    if ((int)userID == -1) return;

    Dictionary<EdkDll.EE_DataChannel_t, double[]> data = engine.GetData((uint)userID);
    // If there is no data, do not proceed.
    if (data == null) return;

    int _bufferSize = data[EdkDll.EE_DataChannel_t.TIMESTAMP].Length;
    for (int i = 0; i < _bufferSize; i++)
    {
        double AF4 = data[EdkDll.EE_DataChannel_t.AF4][i];
        double F8 = data[EdkDll.EE_DataChannel_t.F8][i];
        double F4 = data[EdkDll.EE_DataChannel_t.F4][i];
        double FC6 = data[EdkDll.EE_DataChannel_t.FC6][i];
        double T8 = data[EdkDll.EE_DataChannel_t.T8][i];
        double P8 = data[EdkDll.EE_DataChannel_t.P8][i];
        double O2 = data[EdkDll.EE_DataChannel_t.O2][i];
        double O1 = data[EdkDll.EE_DataChannel_t.O1][i];
        double P7 = data[EdkDll.EE_DataChannel_t.P7][i];
        double T7 = data[EdkDll.EE_DataChannel_t.T7][i];
        double FC5 = data[EdkDll.EE_DataChannel_t.FC5][i];
        double F3 = data[EdkDll.EE_DataChannel_t.F3][i];
        double F7 = data[EdkDll.EE_DataChannel_t.F7][i];
        double AF3 = data[EdkDll.EE_DataChannel_t.AF3][i];
        double time = data[EdkDll.EE_DataChannel_t.TIMESTAMP][i];
        // Sum and mean over all 14 channels of this sample.
        double Sum = AF4 + F8 + F4 + FC6 + T8 + P8 + O2 + O1 + P7 + T7 + FC5 + F3 + F7 + AF3;
        double Mean = Sum / 14.0;
    }
}

private void GetData_timer_Tick(object sender, EventArgs e)
{
    Run();
}

(Capture EEG Windows Form, my research project)
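The code above assumes that engine, userID and a Windows Forms Timer named GetData_timer exist as members of the form. A minimal sketch of how the form constructor might wire everything together is shown below; the form class name and the polling interval are my own assumptions.

// Illustrative form constructor: connect to EmoEngine, then poll it
// periodically with a Windows Forms Timer.
System.Windows.Forms.Timer GetData_timer;

public CaptureForm()
{
    InitializeComponent();

    EEG_Starter();                                // connect to EmoEngine (step 3)

    GetData_timer = new System.Windows.Forms.Timer();
    GetData_timer.Interval = 250;                 // poll roughly four times per second
    GetData_timer.Tick += GetData_timer_Tick;     // handler from step 5
    GetData_timer.Start();
}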

After getting the EEG data into the program, we need to implement an algorithm to classify the electroencephalographic signals. There are several popular classification algorithms that I have researched (Omar AlZoubi, Irena Koprinska, Rafael A, 2006); a minimal sketch of one of them (k-nearest neighbour) follows the figure below.

• Decision Tree (DT)
• K Nearest Neighbor (k-NN)
• Naïve Bayes (NB)
• Ada Boost
• Support Vector Machine (SVM)

(Support Vector Machine Demo, CMSoft)
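As a taste of how one of these classifiers works, here is a minimal k-nearest-neighbour (k-NN) sketch over feature vectors such as per-channel band powers (assuming System, System.Collections.Generic and System.Linq are imported). It illustrates the algorithm only; it is not tuned for EEG, and the class and method names are my own.

// Minimal k-NN classifier over double[] feature vectors (illustrative only).
class KnnClassifier
{
    readonly List<double[]> _features = new List<double[]>();
    readonly List<string> _labels = new List<string>();

    // Store one labelled training example.
    public void Add(double[] featureVector, string label)
    {
        _features.Add(featureVector);
        _labels.Add(label);
    }

    // Predict by majority vote among the k nearest training examples.
    public string Predict(double[] sample, int k)
    {
        var neighbours = _features
            .Select((f, i) => new { Label = _labels[i], Distance = Euclidean(f, sample) })
            .OrderBy(n => n.Distance)
            .Take(k);

        return neighbours
            .GroupBy(n => n.Label)
            .OrderByDescending(g => g.Count())
            .First().Key;
    }

    // Euclidean distance between two feature vectors of equal length.
    static double Euclidean(double[] a, double[] b)
    {
        double sum = 0;
        for (int i = 0; i < a.Length; i++) sum += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.Sqrt(sum);
    }
}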

Successful BCI project

Brainwave-controlled helicopter developed at University of Minnesota

The objective of this project was to examine the impact of using a BCI system on a real-world device (Karl LaFleur, Kaitlin Cassady, Alexander Doud, Kaleb Shades, Eitan Rogin, Bin He, 2013). The researchers ran an experiment with five human subjects who controlled a quadcopter using their brain waves.

Finding the EEG data pattern

They used the BCI2000 development platform during an initial training period to identify the specific electrodes and frequencies that were most differentially active during the actuation of a given imagination pair.

((a), (b) Brainwave differences for right- and left-hand imagination compared to rest. (c) Brainwave differences for imagination of both hands compared to rest.)

As a result, they could detect the difference in brain waves at the C3 and C4 sensors, relative to the other sensors, during right- and left-hand imagination. They then turned these brain wave patterns into computer commands to control the helicopter.

((a), (b): Comparison of spectral power during right- and left-hand imagination at sensors C3 and C4.
(c), (d): Comparison of spectral power during both-hands imagination and rest at sensors C3 and C4.
(e), (f): Comparison of spectral power during left-hand, right-hand and rest imagination at sensors C3 and C4.)

Figures (a) and (b) show that, at sensor C3, the spectral power is higher when the user imagines the left hand than when the user imagines the right hand, while at sensor C4 the opposite holds: the power during left-hand imagination is lower than during right-hand imagination. In other words, C3 and C4 receive different spectral power depending on which hand is imagined: left-hand imagination gives higher power at C3 than at C4, and right-hand imagination gives higher power at C4 than at C3.

Figures (c) and (d) show that, when the user imagines both hands, the spectral power at sensor C3 is roughly equal to that at sensor C4.

They then used the spectral power from sensor C3 to observe right-hand imagination, the spectral power from sensor C4 to observe left-hand imagination, and the spectral power from both C3 and C4 to observe both-hands imagination (a toy restatement of this decision rule is sketched below).

(Turning spectral power into commands for controlling the helicopter.)
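The decision rule described above can be written as a simple comparison of the C3 and C4 band powers. The sketch below is my own toy restatement of it, assuming the band powers have already been computed elsewhere; the margin parameter and the names are illustrative, and the actual study used calibrated, subject-specific control signals rather than a fixed rule like this.

// Toy restatement of the C3/C4 decision rule described above
// (illustrative only; not the study's actual control law).
enum MotorImageryCommand { Left, Right, Both }

static MotorImageryCommand DecideCommand(double c3Power, double c4Power, double margin)
{
    // Left-hand imagination: higher spectral power at C3 than at C4.
    if (c3Power > c4Power + margin) return MotorImageryCommand.Left;
    // Right-hand imagination: higher spectral power at C4 than at C3.
    if (c4Power > c3Power + margin) return MotorImageryCommand.Right;
    // Both hands: the two powers are roughly equal (rest would need extra information).
    return MotorImageryCommand.Both;
}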

Watch more on Youtube


Grab the documents here!

Visual Studio 2012 EEG Capture Project

Slide

Paper

“The brain is wider than the sky.” - Emily Dickinson
