This video shows a robot arm, the Manus ARM by Exact Dynamics, being controlled by recorded signals from a neuronal network. The network consists of mouse neurons cultured in a petri dish, with their activity observed using a multi-electrode array (MEA). In the video, two recorded channels are shown on either side of the robot arm. When the left channel bursts (exceeds a specified threshold), the arm moves to the left; when the right channel bursts, the arm moves to the right. Current work is aimed at having live signals control the arm, including the use of feedback from a camera in the robot's gripper.
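The control scheme described above can be sketched in a few lines. This is a minimal illustration, not the lab's actual implementation: the function names, the use of a simple amplitude threshold as the burst criterion, and the synthetic data are all assumptions for the sake of the example.

```python
import numpy as np

def detect_burst(samples, threshold):
    """Hypothetical burst test: does any sample in the window exceed the threshold?"""
    return bool(np.max(np.abs(samples)) > threshold)

def arm_command(left_window, right_window, threshold):
    """Map bursts on the two recorded channels to a movement command."""
    left = detect_burst(left_window, threshold)
    right = detect_burst(right_window, threshold)
    if left and not right:
        return "LEFT"
    if right and not left:
        return "RIGHT"
    return "HOLD"  # no burst, or both channels bursting at once

# Synthetic example: low-amplitude noise on one channel, a clear spike on the other
rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 0.1, 100)         # baseline noise, well below threshold
burst = np.concatenate([quiet, [2.0]])    # same noise plus one suprathreshold spike
print(arm_command(burst, quiet, threshold=1.0))  # LEFT
```

A real pipeline would detect bursts from spike rates over a sliding window rather than a raw amplitude threshold, but the channel-to-direction mapping would have the same shape.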
We have just started an interdisciplinary project in cooperation with UML neurobiology researchers. In the course of this work, our lab will build artificial neural net (ANN) models of brain damage, drawing inspiration from biological mechanisms of brain damage and recovery in an attempt to optimize neural network training and minimize the computational resources ANNs require.
The neurobiology department will work with cultured neural tissue in multi-electrode arrays (MEAs) to gather information on the biological processes underlying mild traumatic brain injury (mTBI) and on possible methods of minimizing the damage and allowing the damaged brain to recover and restore functionality.
By comparing the responses of artificial and biological neural networks to damage and recovery, we hope to draw inspiration from the results of each other's work and improve the technologies used in AI and medicine.
This work is the beginning of a long-term project that will eventually combine what we learn about training cultured neurons with our lab's robotics expertise to allow the cultured neurons to control a robot arm in real time. This avenue of research has applications in assistive prosthetics.