12 June 2016

Eyeball - A Minimal Neurorobot

This is my second post on personal neurorobotics. In the previous post I outlined the case for brain-based robots as consumer products, especially in education. In this post I will describe a minimal personal neurorobot that I call Eyeball. Eyeball implements a causal loop that is fundamental to animal (and neurorobot) behavior: signals travel from the brain to the motors, producing behavior that changes the outside world, and the brain then perceives those changes as visual or other sensory feedback, closing the loop.


Eyeball is a web-camera attached to a servo motor. Eyeball's brain consists of two spontaneously spiking neurons that run on a USB-connected computer (eyeball.py, requires OpenCV, Windows installation instructions here). One of the neurons is a motor neuron: whenever it spikes, the motor moves to a new position. This changes the field of view of the web-camera, which continuously sends video frames back to the brain. The second neuron is a sensory neuron that is maximally activated by a dark spot on a white background. The spikes of the sensory neuron inhibit the motor neuron. As a result, when Eyeball encounters a dark spot on a white background it stops moving and fixates on it.
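
To make the loop concrete, here is a minimal sketch of how the two-neuron brain could be wired up in Python with OpenCV and pySerial. The port name, thresholds, and the single-byte angle protocol are illustrative assumptions, not the actual contents of eyeball.py.

    # Minimal sketch of Eyeball's brain loop (illustrative, not the actual eyeball.py).
    # Assumptions: the webcam is device 0, and the servo is reached by writing a
    # single byte (an angle, 0-180) to a serial port that the Arduino turns into PWM.
    import random

    import cv2      # pip install opencv-python
    import serial   # pip install pyserial

    cam = cv2.VideoCapture(0)
    link = serial.Serial("COM3", 9600)   # "COM3" is a placeholder port name

    THRESHOLD = 1.0
    motor_v = 0.0     # membrane potential of the motor neuron
    sensory_v = 0.0   # membrane potential of the sensory neuron

    while True:
        ok, frame = cam.read()
        if not ok:
            break

        # Sensory neuron: driven by how much darker the centre of the frame is
        # than its surround - a crude "dark spot on a white background" detector.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        h, w = gray.shape
        centre = gray[h // 3: 2 * h // 3, w // 3: 2 * w // 3].mean()
        surround = gray.mean()
        sensory_v += max(0.0, (surround - centre) / 255.0)

        sensory_spike = sensory_v >= THRESHOLD
        if sensory_spike:
            sensory_v = 0.0

        # Motor neuron: spikes spontaneously but is inhibited by the sensory neuron,
        # so Eyeball stops moving when it fixates on a dark spot.
        motor_v += random.uniform(0.0, 0.2)
        if sensory_spike:
            motor_v = max(0.0, motor_v - 0.5)

        if motor_v >= THRESHOLD:
            motor_v = 0.0
            angle = random.randint(0, 180)     # jump to a new random position
            link.write(bytes([angle]))         # the Arduino converts this into a PWM pulse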


Eyeball is very cheap. Web-cameras and servos can be bought for less than $5 each. To send serial commands via USB from Python to Eyeball I use an FTDI chip, which can also be purchased for around $5. I use an Arduino board to convert the serial commands to the PWM format needed to control the servo, but only because I don't yet know how to send PWM commands from Python directly. So in principle the device costs less than $20. (Of course, to be successful Eyeball would also need a nice-looking plastic case.)
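
For the serial path itself, here is a hedged sketch of the Python side, assuming the Arduino reads one byte per command (an angle from 0 to 180) and hands it to its servo library; the real protocol may well differ.

    import serial   # pip install pyserial

    link = serial.Serial("COM3", 9600, timeout=1)   # placeholder port name for the FTDI chip

    def move_servo(angle):
        # Clamp to the servo's range and send it as a single byte; the Arduino
        # on the other end converts the byte into the PWM pulse the servo expects.
        angle = max(0, min(180, int(angle)))
        link.write(bytes([angle]))

    move_servo(90)   # centre the camera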

Despite the low cost, Eyeball has all the components needed to emulate some interesting brain functions and behaviors. Vision is perhaps the best understood of all brain functions, and brain-based models of visual object recognition such as HMAX are already used to give neurorobots vision, but only in academia - not yet in consumer-oriented educational applications. Given the ability to recognize objects, Eyeball could then be trained to orient towards and track some objects and avoid others, ideally using a realistic implementation of the tectum/superior colliculus and basal ganglia. A reward button could deliver a dopamine reward, changing synaptic weights according to known learning rules and thus training the robot to show a preference for some objects. Easy access to the web-camera's microphone and the computer's speakers opens the door to voice communication, and so on.
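
To give a flavor of the kind of dopamine-modulated learning rule alluded to above, here is a sketch of a three-factor Hebbian update, in which a reward-button press gates plasticity through an eligibility trace. The layer sizes, constants, and function names are illustrative assumptions, not part of eyeball.py.

    # Sketch of a reward-modulated (three-factor) Hebbian learning rule.
    import numpy as np

    n_sensory, n_motor = 4, 2
    weights = np.zeros((n_motor, n_sensory))   # sensory -> motor synapses
    eligibility = np.zeros_like(weights)

    TAU_E = 0.9           # per-step decay of the eligibility trace
    LEARNING_RATE = 0.05

    def step(sensory_spikes, motor_spikes, dopamine):
        # Coincident pre- and post-synaptic spikes are remembered in the
        # eligibility trace, and only turned into a lasting weight change
        # when the reward button delivers dopamine.
        global eligibility, weights
        eligibility = TAU_E * eligibility + np.outer(motor_spikes, sensory_spikes)
        weights += LEARNING_RATE * dopamine * eligibility

    # Example: pressing the reward button (dopamine=1) right after a coincident
    # spike strengthens that synapse, biasing Eyeball towards the rewarded object.
    step(np.array([1, 0, 0, 0]), np.array([0, 1]), dopamine=1)
    print(weights)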

While even this very simple neurorobot opens up a lot of interesting possibilities for implementing and exploring mechanistic models of the brain, what we really want is a robot that has two eyes and can move around independently. This will be the topic of the next post.
