Predicting voluntary movements from motor cortical activity with neuromorphic hardware

Abstract
Neurons in the mammalian motor cortex encode physical parameters of voluntary movements during planning and execution of a motor task. Brain–machine interfaces can decode limb movements from the activity of these neurons in real time. One future goal is to control prosthetic devices in severely paralyzed patients or to restore communication if the ability to speak or make gestures is lost. Here, we implemented a spiking neural network that decodes movement intentions from the activity of individual neurons recorded in the motor cortex of a monkey. The network runs on neuromorphic hardware and performs its computations in a purely spike-based fashion. It incorporates an insect-brain-inspired, three-layer architecture with 176 neurons. Cortical signals are filtered by lateral inhibition, and the network is trained in a supervised fashion to predict which of two opposing directions the monkey's arm-reaching movement will take, before the movement is carried out. Our network operates on the actual spikes emitted by motor cortical neurons, without the need to construct intermediate nonspiking representations. Using a pseudo-population of 12 manually selected neurons, and after only 100 training trials, it reliably predicts the movement direction with an accuracy of 89.3% on data not encountered during training. Our results provide a proof of concept for the first use of a neuromorphic device to decode movement intentions.
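
The abstract describes a three-layer, purely spike-based decoder with lateral inhibition in the intermediate layer and a supervised readout of two direction-selective output units. The following Python sketch illustrates that structure in software only; the layer sizes, leaky integrate-and-fire neuron model, learning rule, and all parameters are simplifying assumptions made for illustration and do not reproduce the authors' neuromorphic implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions; the paper's network has 176 neurons in total, but the exact
# split across layers is assumed here, not taken from the paper.
N_IN, N_HID, N_OUT = 12, 24, 2   # recorded units, filtering layer, 2 directions
T = 200                          # time bins per trial (assumed)

# Synaptic weights: random feed-forward input, all-to-all lateral inhibition in
# the middle layer (no self-inhibition), plastic hidden-to-output weights.
w_in = rng.uniform(0.0, 0.6, (N_IN, N_HID))
w_lat = 0.3 * (1.0 - np.eye(N_HID))
w_out = rng.uniform(0.0, 0.1, (N_HID, N_OUT))


def run_layer(drive, inhibition=None, decay=0.9, v_th=1.0):
    """Leaky integrate-and-fire layer; `drive` has shape (T, n_neurons)."""
    v = np.zeros(drive.shape[1])
    spikes = np.zeros_like(drive)
    for t in range(drive.shape[0]):
        lat = spikes[t - 1] @ inhibition if (inhibition is not None and t > 0) else 0.0
        v = decay * v + drive[t] - lat   # leak, feed-forward drive, lateral inhibition
        fired = v >= v_th
        spikes[t, fired] = 1.0
        v[fired] = 0.0                   # reset membrane after a spike
    return spikes


def decode(cortical_spikes):
    """Three-layer, purely spike-based decode of one trial."""
    hid = run_layer(cortical_spikes @ w_in, inhibition=w_lat)   # filtering layer
    out = run_layer(hid @ w_out)                                # two direction units
    return hid, int(out.sum(axis=0).argmax())                   # winner = prediction


def supervised_update(cortical_spikes, target, lr=0.01):
    """Toy supervised rule: strengthen synapses onto the correct direction unit
    and weaken those onto the opposing unit, scaled by hidden-layer activity."""
    global w_out
    hid, prediction = decode(cortical_spikes)
    rate = hid.mean(axis=0)
    w_out[:, target] += lr * rate
    w_out[:, 1 - target] -= lr * rate
    np.clip(w_out, 0.0, 1.0, out=w_out)
    return prediction


# Usage with a surrogate Poisson raster standing in for the recorded
# pseudo-population (real trials would come from the cortical recordings).
trial = (rng.random((T, N_IN)) < 0.1).astype(float)
print("predicted direction:", supervised_update(trial, target=0))
```

In this sketch the readout is the output unit that emits the most spikes over the trial, mirroring a spike-count-based decision; how the hardware network actually forms its decision is described in the paper itself.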
Funding Information
  • Deutsche Forschungsgemeinschaft (DFG, SCHM2474/1-2)
  • Marie Curie Intra-European Fellowship (EU FP7, 331892)
  • German Israeli Foundation (I-1224-396.13/2012)
  • University of Heidelberg
  • EU FP7 (604102)