Schuyler Eldridge has been awarded the 2012 CELEST/CompNet Prize in Computational Neuroscience for his work on biologically-inspired hardware for autonomous robots, in collaboration with Florian Raudies, Ajay Joshi, and Max Versace.
As land and aerial robots fulfill their exploratory potential, they need both a high degree of autonomy and the ability to process large volumes of sensor data under power and payload restrictions. To operate autonomously, robots must respond to changes in their environment and learn the most appropriate actions. However, this must be accomplished within those power and payload restrictions, and traditional approaches to autonomous tasks, such as navigation, require power-hungry, heavy processing elements. The human brain, by contrast, processes large volumes of sensor data in a low power envelope while carrying its computation engine with it. We're addressing the need for on-robot processing of visual data for navigation using biologically-inspired algorithms and low-power (under 20 watt) Field Programmable Gate Array (FPGA) hardware.
More specifically, we're looking at biologically-inspired computation and processing of optic flow to determine the distance and direction of walls in a robot's environment. While this work is in progress, one function the FPGA currently performs is image pre-processing using biologically-inspired Gabor filters. On the left is the original image from our camera; on the right is the feature-enhanced version.
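To give a sense of what this pre-processing stage computes, here is a minimal software sketch of Gabor filtering in Python with NumPy and SciPy. The kernel sizes, wavelengths, and orientations below are illustrative assumptions, not the parameters used in our FPGA implementation.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(size=21, wavelength=8.0, theta=0.0, sigma=4.0, gamma=0.5):
    """Real-valued Gabor kernel: a sinusoidal carrier windowed by a Gaussian.

    Parameters here (size, wavelength, theta, sigma, gamma) are example
    values chosen for illustration only.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by the filter orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# Apply a small bank of orientations to a synthetic test image.
image = np.random.rand(64, 64)
responses = [convolve(image, gabor_kernel(theta=t))
             for t in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)]
# Take the strongest response across orientations at each pixel,
# yielding an orientation-enhanced feature map.
edge_map = np.max(np.abs(responses), axis=0)
```

A bank of such filters at several orientations, convolved with the camera image, produces the feature-enhanced output shown above; on the FPGA the same convolutions run in fixed-point pipelined hardware rather than floating-point software.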
After further processing, we derive a state space from this information and apply a punishment for collisions; our simulation results show that the robot effectively learns to turn away from walls and reduces its number of collisions per unit time.
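The learning loop described above can be sketched as tabular Q-learning on a toy state space. The states, actions, rewards, and environment dynamics below are hypothetical stand-ins for illustration; they are not the actual state space derived in the project.

```python
import random

# Illustrative state/action space: coarse wall distance and two motor actions.
STATES = ["wall_near", "wall_far"]
ACTIONS = ["forward", "turn"]
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def step(state, action):
    """Toy environment: driving forward near a wall causes a collision,
    which is punished with a negative reward."""
    if state == "wall_near" and action == "forward":
        return "wall_near", -1.0          # collision -> punishment
    if action == "turn":
        return "wall_far", 0.0            # turning clears the wall
    return random.choice(STATES), 0.1     # forward progress in open space

random.seed(0)
state = "wall_far"
for _ in range(2000):
    # Epsilon-greedy action selection.
    if random.random() < EPS:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, reward = step(state, action)
    # Standard Q-learning update.
    best_next = max(Q[(nxt, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
    state = nxt
```

After training, the learned values prefer turning over driving forward when a wall is near, which is the qualitative behavior reported in our simulations: the robot turns away from walls and collides less often.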