The goal of this project was to control a robot using electroencephalogram (EEG) signals from the brain and then to provide the controller with sensory information from the robot via vibrotactile feedback. To accomplish this goal, we used a BCI2000 system to record EEG signals from a human subject and then extracted features from the EEG in the form of event-related potentials (ERPs). Using these ERPs, the subject (the controller) was able to send movement commands to the robot.

We chose a wheeled, omnidirectional robot capable of moving in any of the four cardinal directions on a plane. The robot was equipped with six bump sensors for detecting collisions and a camera for acquiring panoramic images. We relayed information from these sensors via a belt arrayed with 8 tactors that provided the controller with vibrotactile feedback.

We completed two subprojects:

1. Navigation of the robot through a maze using only information from the bump sensors.

2. Tracking of an object in space using a visual target tracking algorithm.
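As an illustration of the control pipeline described above (not the project's actual BCI2000 configuration), the sketch below shows one common way ERP features can be turned into a movement command: EEG epochs time-locked to each candidate direction's stimulus are averaged, and the direction whose averaged response is largest in a post-stimulus window is selected. The function name, window, and sampling rate are assumptions for illustration.

```python
# Hypothetical sketch of ERP-based command selection; the real system
# used BCI2000's own signal processing, not this code.

def select_command(epochs_by_direction, window=(250, 450), srate=1000):
    """Pick a movement command from ERP responses.

    epochs_by_direction: dict mapping a direction name to a list of
    epochs; each epoch is a list of samples (microvolts) beginning at
    stimulus onset. Returns the direction with the largest mean
    amplitude inside `window` (milliseconds after the stimulus).
    """
    lo = int(window[0] * srate / 1000)
    hi = int(window[1] * srate / 1000)
    scores = {}
    for direction, epochs in epochs_by_direction.items():
        n = len(epochs)
        # Average the epochs sample-by-sample over the window, then
        # average across the window to get one score per direction.
        avg = [sum(e[i] for e in epochs) / n for i in range(lo, hi)]
        scores[direction] = sum(avg) / len(avg)
    return max(scores, key=scores.get)
```

In a real ERP speller-style interface the averaged responses would also be baseline-corrected and filtered; the sketch keeps only the selection step.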

Brain Interfaces

  • BCI2000 System

The system consisted of a multi-probe EEG cap and the BCI2000 software used to measure EEG signals.

Software Interfaces

  • EAI Tactor Software SDK

Served as the foundation for all other software used to control the tactor belt.

  • Matlab

Used to process visual signals acquired from the robot and relay the information to the tactor belt.

  • Custom C# code

Used to collect information from the bump sensors and relay the information to the tactor belt.

  • BCI2000

Used to collect and analyze EEG signals and to translate them into movement commands for the robot.

Machine Elements

  • Omnidirectional driving robot provided by the Spike-based Robotic Systems Workgroup.

The robot could be controlled wirelessly and was equipped with a panoramic camera and six bump sensors.

  • EAI Tactor Belt

The belt was equipped with 8 tactors and could be controlled through either Bluetooth or USB connections.
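To make the sensor-to-belt relay concrete, here is a minimal sketch of how a collision direction might be mapped to one of the belt's 8 tactors. The actual relay was written in C# against the EAI Tactor SDK and is not reproduced in this report; the function name and the angle convention below are assumptions.

```python
# Hypothetical mapping from a collision angle to the nearest of the
# belt's 8 tactors (not the project's actual C# relay code).

def tactor_for_angle(angle_deg, n_tactors=8):
    """Map a collision angle (degrees, 0 = robot front, clockwise)
    to the index of the nearest tactor, with tactor 0 at the front.
    With 8 tactors each one covers a 45-degree sector of the belt."""
    sector = 360.0 / n_tactors
    return int(round(angle_deg / sector)) % n_tactors
```

With this convention a head-on bump (0 degrees) drives the front tactor and a bump from directly behind (180 degrees) drives the tactor at the controller's back, so the felt vibration is spatially congruent with the collision.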