Biomimetic Brain Machine Interfaces for the Control of Movement

Abstract
It has recently become possible to use signals recorded simultaneously from large numbers of cortical neurons for real-time control. Such brain machine interfaces (BMIs) have allowed animal subjects and human patients to control the position of a computer cursor or robotic limb under the guidance of visual feedback. Although impressive, such approaches essentially ignore the dynamics of the musculoskeletal system, and they lack potentially critical somatosensory feedback. In this mini-symposium, we will initiate a discussion of systems that more closely mimic the control of natural limb movement. The work that we will describe is based on fundamental observations of sensorimotor physiology that have inspired novel BMI approaches. We will focus on what we consider to be three of the most important new directions for BMI development related to the control of movement. (1) We will present alternative methods for building decoders, including structured, nonlinear models, the explicit incorporation of limb-state information, and novel approaches to developing decoders for paralyzed subjects who are unable to generate an output signal. (2) We will describe the real-time prediction of dynamical signals, including joint torque, force, and EMG, and the real-time control of physical plants with dynamics like those of the real limb. (3) We will discuss critical factors that must be considered in delivering somatosensory feedback to the BMI user, including its potential benefits, the differing representations of sensation and perception across cortical areas, and the changes in the cortical representation of tactile events after spinal injury.
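To make the decoding problem raised in points (1) and (2) concrete, the sketch below fits a simple linear (Wiener-filter-style) decoder that maps lagged, binned cortical firing rates onto a continuous motor output such as EMG, joint torque, or cursor velocity. This is only a minimal illustration under assumed conditions: the data are synthetic, and the array shapes, bin width, lag count, and regularization value are hypothetical choices, not the structured or nonlinear models discussed in the symposium.

```python
# Minimal linear (Wiener-filter-style) decoder sketch: maps lagged, binned
# cortical firing rates to a continuous motor output (e.g., EMG or velocity).
# All data are synthetic; shapes, bin width, and lag count are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_bins, n_neurons, n_lags = 2000, 60, 10   # e.g., 50 ms bins, 10 lags = 500 ms of history
rates = rng.poisson(lam=3.0, size=(n_bins, n_neurons)).astype(float)
true_w = rng.normal(size=(n_neurons,))
target = rates @ true_w + rng.normal(scale=2.0, size=n_bins)  # stand-in for EMG/torque

def lagged_design(X, n_lags):
    """Stack the current bin and the previous n_lags - 1 bins into one row."""
    rows = [X[i - n_lags + 1:i + 1].ravel() for i in range(n_lags - 1, len(X))]
    return np.asarray(rows)

X = lagged_design(rates, n_lags)   # shape: (n_bins - n_lags + 1, n_neurons * n_lags)
y = target[n_lags - 1:]

# Ridge regression, closed form: w = (X'X + alpha*I)^-1 X'y
alpha = 10.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

y_hat = X @ w
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"decoding R^2 on training data: {r2:.3f}")
```

In a biomimetic BMI of the kind outlined above, the predicted signal would drive a physical or simulated plant with limb-like dynamics rather than a kinematic cursor, and the linear map shown here would typically be replaced by the structured or nonlinear decoders described in the symposium.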