Control for brain-machine interfaces

We are beginning to work with functional electrical stimulation (FES) of muscles, as well as prosthetic and assistive robot arms that must perform daily tasks under the control of a disabled user. We believe that the best approach to brain-machine interfaces is to obtain high-level user commands, either from brain activity [1] or from eye movements and speech, and to build enough intelligence into the device itself to translate these commands into actual movements. Optimal control is well-suited for such translation. For example, our arm movement controller maps spatial targets to muscle activations for a detailed biomechanical model of the human arm, similar to what is needed in FES.
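To illustrate the kind of translation involved, here is a minimal sketch of an optimal controller that maps a spatial target to a sequence of motor commands. It uses a finite-horizon discrete LQR on a hypothetical one-degree-of-freedom "arm" (a point mass whose force command stands in for net muscle activation); the actual biomechanical model and controller are far more detailed, and all names and parameters below are illustrative assumptions.

```python
import numpy as np

def lqr_reach(A, B, Q, R, Qf, x0, x_target, T):
    """Finite-horizon discrete LQR driving the state toward x_target.

    Shifts coordinates so the target is the origin (valid here because
    the target is an equilibrium of A), solves the backward Riccati
    recursion for the time-varying gains, then rolls the controller
    forward from x0. Returns the state and control trajectories.
    """
    P = Qf.copy()
    gains = []
    for _ in range(T):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    gains.reverse()  # gains[t] is the feedback gain at time step t

    x = x0 - x_target
    xs, us = [x + x_target], []
    for K in gains:
        u = -K @ x          # motor command from state feedback
        x = A @ x + B @ u
        us.append(u)
        xs.append(x + x_target)
    return np.array(xs), np.array(us)

# Hypothetical 1-DOF arm: state [position, velocity], force command u.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = 1e-3 * np.eye(2)       # small running cost on the state
R = np.array([[1e-2]])     # effort cost on the command
Qf = 1e3 * np.eye(2)       # strong terminal cost: arrive at the target
x0 = np.array([0.0, 0.0])
target = np.array([0.3, 0.0])   # spatial target 0.3 m away, at rest
xs, us = lqr_reach(A, B, Q, R, Qf, x0, target, T=40)
```

The point of the sketch is the division of labor described above: the user (or decoder) supplies only the high-level command, the spatial target, while the controller fills in the full time course of motor commands by minimizing a cost that trades off effort against accuracy.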