Android Devices and Neuromorphic Camera

Julien Martel, Institute of Neuroinformatics (INI), ETH-Zurich

Nicolas Oros, Cognitive Anteater Robotics Laboratory (CARL), University of California Irvine

Introduction & Context

Computers like the ENIAC (1943-1946) used to fill a whole room (1,000 square feet) to perform no more than 5,000 operations per second. The first personal computers, appearing in the 1970s, were a revolution and were soon followed by mass-produced machines like the Commodore 64 (1982), which sold 22 million units, and the Amiga 1000 (1985). In the last decade, computing devices have become tremendously smaller, and the mobile processors that equip today's cellphones are orders of magnitude more powerful than those first personal computers, and as powerful as, if not more powerful than, computers built at the beginning of this century. They offer relatively fast, general-purpose computation at a low energy cost, thereby meeting the needs of robotic platforms. While the field of robotics is expanding at a remarkable rate and better-performing robots are created every year, robotics still remains out of reach for many students and researchers. The main reasons are the high complexity of the hardware and software used in robots, and their typically high cost. We believe that the computing power, sensing capabilities, and intuitive programming interfaces of modern smartphones afford an inexpensive yet highly capable robotic platform.


The objective of this sub-project of the Neuromorphics Olympics group is to provide a framework for interfacing recently developed neuromorphic dynamic vision sensors, such as the DVS128, the eDVS4337, the ATIS, or the DAVIS, with mobile Android platforms. These sensors produce asynchronous events at very low power and enable very efficient event-based computation that is particularly suited to robotics applications.


Android is a widely used operating system developed by Google; it holds about an 85% share of the mobile phone operating system market. In the following, we present the pros and cons of the solutions we considered for interfacing the eDVS4337 with Android platforms. Finally, we detail the development of the solution we adopted.

Using the Android SDK with the IOIO board

Using the Android NDK (Native Development Kit)

Our solution: using the Android SDK (Software Development Kit) with the FTDI chip constructor library

For this project, we used a Samsung Galaxy S3 connected to an eDVS4337 via an OTG (On-The-Go) cable. Connection setup:

  • Phone <--> micro USB male to USB Female OTG cable <--> USB male to mini USB male <--> eDVS

In order to read the sensory information sent by the eDVS from the Android phone, we used the Java D2xx driver and library provided by FTDI Chip. D2xx is used to communicate with FTDI USB-to-UART devices.
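The read path can be sketched independently of the Android plumbing. The sketch below wraps any byte source behind a small class so the loop can be exercised without hardware; in the real app the blocking read would instead be a call to the D2xx library's device read method on the opened FTDI device. `UartReader`, `readChunk`, and the buffer size are our own illustrative names, not part of the D2xx API:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Minimal stand-in for a D2xx-style byte source: a blocking read that fills a
// buffer and reports how many bytes were actually delivered. Wrapping an
// InputStream lets us exercise the loop without hardware. (Illustrative sketch.)
public class UartReader {
    private final InputStream source;

    public UartReader(InputStream source) {
        this.source = source;
    }

    /** Drain up to maxBytes from the device into a right-sized buffer. */
    public byte[] readChunk(int maxBytes) throws IOException {
        byte[] buf = new byte[maxBytes];
        int n = source.read(buf, 0, maxBytes); // blocking read, like a device read call
        if (n <= 0) {
            return new byte[0];                // nothing available / stream closed
        }
        byte[] out = new byte[n];
        System.arraycopy(buf, 0, out, 0, n);
        return out;
    }
}
```

In the app, a chunk returned this way would be handed off to the parsing class described below.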

We created an Android application project using Eclipse and the Android SDK. The app can read the events sent by the eDVS at 4 Mbaud using the D2xx library. The events are then mapped onto an image for display.
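Turning the raw byte stream into (x, y, polarity) events can be sketched as follows. We assume a common two-byte eDVS event encoding (first byte: sync bit set to 1 plus a 7-bit y address; second byte: polarity bit plus a 7-bit x address); the exact format depends on the event mode configured on the sensor, so check the eDVS4337 documentation. `EdvsParser` and `EdvsEvent` are our own illustrative names:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative parser for a two-byte eDVS event encoding (verify against the
// sensor manual): byte 0 = 1yyyyyyy (sync bit + y address),
//                 byte 1 = pxxxxxxx (polarity bit + x address).
public class EdvsParser {
    public static final class EdvsEvent {
        public final int x, y;
        public final boolean on; // polarity: true = ON (brightness increase)

        EdvsEvent(int x, int y, boolean on) {
            this.x = x;
            this.y = y;
            this.on = on;
        }
    }

    /** Parse a chunk of bytes, skipping ahead until a sync bit is found. */
    public static List<EdvsEvent> parse(byte[] data) {
        List<EdvsEvent> events = new ArrayList<>();
        int i = 0;
        while (i + 1 < data.length) {
            if ((data[i] & 0x80) == 0) { // not a first byte: resynchronize
                i++;
                continue;
            }
            int y = data[i] & 0x7F;
            boolean on = (data[i + 1] & 0x80) != 0;
            int x = data[i + 1] & 0x7F;
            events.add(new EdvsEvent(x, y, on));
            i += 2;
        }
        return events;
    }
}
```

A real implementation must also carry partial events over between chunks, since a read may end in the middle of a two-byte pair.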

The app is composed of a main activity (GUI) used for display, a thread that communicates with the eDVS, and another class that parses the data.
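The display step, mapping events onto an image, can be sketched as a plain frame accumulator; in the app this pixel array would back an Android Bitmap, but the logic itself is platform-independent. The class name and the grey/white/black pixel values are our own illustrative choices:

```java
import java.util.Arrays;

// Accumulate eDVS events into a 128x128 grey frame: ON events paint a pixel
// white, OFF events paint it black, starting from mid-grey. (Illustrative
// sketch; in the app this buffer would be copied into a Bitmap for display.)
public class EventFrame {
    public static final int SIZE = 128; // eDVS resolution is 128x128 pixels
    private final int[] pixels = new int[SIZE * SIZE];

    public EventFrame() {
        clear();
    }

    /** Reset the frame to mid-grey before accumulating the next batch. */
    public void clear() {
        Arrays.fill(pixels, 128);
    }

    /** Paint one event: ON -> white (255), OFF -> black (0). */
    public void addEvent(int x, int y, boolean on) {
        pixels[y * SIZE + x] = on ? 255 : 0;
    }

    public int pixel(int x, int y) {
        return pixels[y * SIZE + x];
    }
}
```

Clearing (or decaying) the frame between redraws keeps the display responsive, since events only mark pixels where brightness actually changed.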

The fully commented source code of our project (including the D2xx library) can be found on GitHub (note: the app still needs work/optimization):

The D2xx library and documentation can be found here:

An FTDI UART Terminal app is also available here: