Jorg Conradt, Chris Eliasmith, Ralph Etienne-Cummings, Francesco Galluppi, Timmer Horiuchi, Terry Stewart, Jonathan Tapson, Bryan Tripp
In this work, Nengo is used to construct the Theta and Place cells. Theta cells are implemented as ring oscillators to facilitate the selection of specific phases of the oscillations. An array of theta cells tuned to various speeds and directions is constructed and used to realize the Place cells. The Nengo networks are then compiled and implemented on SpiNNaker.
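A ring-oscillator theta cell can be pictured as a two-dimensional harmonic oscillator whose instantaneous phase is read out for place-cell selection. The pure-Python sketch below is illustrative only: the 8 Hz frequency, the 0.1 ms step, and the explicit Euler integration are our assumptions, not the paper's Nengo implementation.

```python
import math

def step_oscillator(x, y, freq_hz, dt):
    """One explicit-Euler step of a 2-D harmonic oscillator (theta cell state)."""
    w = 2.0 * math.pi * freq_hz
    return x - dt * w * y, y + dt * w * x

def phase(x, y):
    """Current oscillator phase, which downstream place cells can select on."""
    return math.atan2(y, x)

# One full cycle at an assumed 8 Hz theta frequency
x, y = 1.0, 0.0
for _ in range(1250):            # 1250 steps of 0.1 ms = one 125 ms period
    x, y = step_oscillator(x, y, 8.0, 1e-4)
```

After one full period the state returns close to its starting point; tuning many such oscillators to different speeds and directions gives the phase diversity the place-cell array selects from.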
The model comprises 2700 neurons and 840,400 synapses running on 36 cores (2½ SpiNNaker chips out of the 4 on the board). Each SpiNNaker chip consumes up to 1 W; the 4-chip board therefore consumes at most 4 W plus 1 W for infrastructure, making it very appealing for embedded robotic applications. The robot actuators are coupled directly to the board through one of the 6 asynchronous links; the system treats the robot as another (virtual) SpiNNaker chip, and packets sent through the interface are translated by a micro-controller.
Neural processing is performed by a 4-chip SpiNNaker board, offering a digital event-driven platform that can interpret incoming events and translate them into neural spike trains. Each SpiNNaker chip is equipped with 1 Gbit of SDRAM and 18 programmable ARM968 cores embedded in a configurable packet-switched asynchronous network-on-chip. The network is based on an on-chip Multicast (MC) Router capable of handling one-to-many communication of spikes (packets) very efficiently, and each chip is linked to 6 neighbouring chips through asynchronous links.
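The MC Router's one-to-many delivery can be illustrated as a key/mask table lookup: an entry matches a packet key on the bits set in its mask and names every output link the packet is copied to. The sketch below is a simplification for illustration; the entry format, first-match policy, and drop-on-miss behaviour are our assumptions, not the actual SpiNNaker router logic.

```python
def route_multicast(packet_key, table):
    """Return the output links for a multicast packet key.

    table: list of (key, mask, links) entries; an entry matches when the
    packet key agrees with `key` on every bit set in `mask`.
    First match wins (a simplification of the hardware ternary lookup).
    """
    for key, mask, links in table:
        if packet_key & mask == key:
            return links
    return []  # no entry: drop (the real router instead applies default routing)

# Example: spikes from one source population fan out to two destinations at once
table = [(0b1010_0000, 0b1111_0000, ["east_link", "local_core_3"])]
```

Because one table entry can name several links, a single spike packet reaches many destination cores without being re-sent by the source.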
The mobile robot used in this project is a custom-developed omni-directional mobile platform of 26 cm diameter, with embedded low-level motor control and elementary sensory systems. An on-board ARM7 microcontroller for robot control receives desired motion commands in x and y direction and rotation through a UART communication interface, and continuously adapts three motor control signals (PD control) to achieve the desired velocities. The robot's integrated sensors include wheel encoders for position estimates, a 9-DOF inertial measurement unit (3-axis accelerometer, 3-axis gyroscope, and 3-axis compass) and a simple bump-sensor ring that triggers binary contact switches upon contact with objects in the environment. The integrated battery pack allows up to 8 h of autonomous robot operation; powering the robot and the SpiNNaker board simultaneously, we estimate an autonomous run time of about 4 h.
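The mapping from a desired body motion (x, y, rotation) to the three motor targets can be sketched with standard omni-wheel kinematics. The wheel geometry below (three wheels at 90°, 210°, 330°, 13 cm body radius) is an assumption for illustration; the paper does not specify the robot's wheel layout.

```python
import math

def wheel_speeds(vx, vy, omega, radius=0.13, wheel_angles=(90, 210, 330)):
    """Map a desired body velocity (vx, vy [m/s], omega [rad/s]) to three
    omni-wheel rim speeds.  Geometry is assumed, not taken from the paper."""
    speeds = []
    for deg in wheel_angles:
        a = math.radians(deg)
        # Each wheel contributes along its rolling direction plus the
        # rotation term scaled by the body radius.
        speeds.append(-math.sin(a) * vx + math.cos(a) * vy + radius * omega)
    return speeds
```

A pure rotation drives all three wheels equally, while a pure translation produces wheel speeds that sum to zero; the on-board PD loops would then regulate each motor toward its target.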
SpiNNaker communicates with the robot through a small customized interface board with an ARM Cortex microcontroller that translates SpiNNaker packets into robot commands and vice versa. The interface board is currently being improved to allow higher data rates, such that event-based sensory systems (such as silicon retinas or cochleae) can be interfaced directly to SpiNNaker. The overall system is stand-alone and autonomous (no PC in the loop).
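The translation step on the interface microcontroller amounts to unpacking command fields from a multicast packet key. The field layout below (8-bit actuator id in bits 23–16, signed 16-bit velocity in bits 15–0) is purely hypothetical; the actual encoding used by the interface board is not given in the paper.

```python
def decode_actuator_packet(key):
    """Unpack a hypothetical 32-bit multicast key into (actuator_id, velocity).

    Assumed layout: bits 23-16 carry the actuator id, bits 15-0 a signed
    16-bit velocity.  Illustrative only, not the real protocol.
    """
    actuator_id = (key >> 16) & 0xFF
    velocity = key & 0xFFFF
    if velocity & 0x8000:          # sign-extend the 16-bit field
        velocity -= 0x10000
    return actuator_id, velocity
```

The microcontroller would forward the decoded command over UART to the robot's ARM7 controller, and run the inverse packing for sensor events travelling the other way.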