I think, therefore I move

Mind reading has come a long way from its ignominious origins alongside the likes of fortune telling and witchcraft. Scientists and medical doctors have made great strides in their ability to extract and interpret electromagnetic signals from the brain, and unlike mind readers of the past, they have very real practical gains to show for it. One notable success story is the cochlear implant, now in use by nearly a quarter of a million deaf or hard-of-hearing patients. (For a look at more state-of-the-art applications in the field, consider attending the upcoming California Cognitive Science Conference, featured on our blog last week by Chris Holdgraf.)

The so-called brain-machine interface (BMI) technology has not yet been perfected to the point that we need to worry about hackers stealing our secrets or erasing our memories. But it has come far enough that researchers may soon be able to restore physical and sensory functionality to patients with immobilizing conditions such as paralysis and Parkinson’s disease. Scientists at UC Berkeley and UCSF’s Center for Neural Engineering and Prostheses (CNEP) are among the pioneers in developing this sort of brain repair technology. Over the next few years, CNEP investigators hope to begin human clinical trials for neural prosthetics – robotic limbs that are controlled just like natural ones, by two-way communication with the brain. The center, which launched in December and is co-directed by UC Berkeley Professor Jose Carmena and UCSF neurosurgeon Edward Chang, consists of over a dozen research groups across engineering, medicine, and computer science (see the press release here). CNEP will draw on its team’s wide range of expertise to tackle the challenges that must be overcome before neural prosthetics become a reality.

[Photo: Edward Chang, CNEP co-director]

Any doubts about the technological feasibility of CNEP’s goals have largely been erased by recent results from experiments with animals. Carmena spent part of the last decade working with Miguel Nicolelis at Duke University, where they implemented some of the first BMI technologies in primate subjects. In one groundbreaking experiment, they trained macaque monkeys to perform manipulations such as reaching and grasping using a robotic arm controlled only by their thoughts. An array of electrodes was implanted into the monkeys’ brains to read their thoughts into a computer, while a projector screen gave visual feedback (their real arms did not move in the experiment and were hidden from their view by a neck brace). The study showed that, with practice, subjects could even control a real (as opposed to virtual) robotic arm once they had time to adapt to its non-idealities, such as mechanical delay.
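
As a rough sketch of what “reading thoughts into a computer” involves, the decoding step can be thought of as a model that maps binned neural firing rates to movement commands; the Carmena et al. study cited at the end of this post used linear models of this kind. The toy Python example below is only an illustration under that assumption: the data are synthetic, the neuron and time-bin counts are arbitrary, and a plain least-squares fit stands in for the real experiment’s far more elaborate decoding pipeline.

# Toy linear decoder, loosely in the spirit of mapping binned firing
# rates to hand kinematics. Everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_neurons = 1000, 64                      # hypothetical: 1000 time bins, 64 recorded units
true_weights = rng.normal(size=(n_neurons, 2))    # hidden mapping to (x, y) hand velocity

# Synthetic "recordings": spike counts per bin and the velocities they encode
spike_counts = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
hand_velocity = spike_counts @ true_weights + rng.normal(scale=2.0, size=(n_bins, 2))

# Fit a linear decoder by least squares: velocity ≈ spike_counts @ W
W, *_ = np.linalg.lstsq(spike_counts, hand_velocity, rcond=None)

# "Read out" a movement command from a new bin of neural activity
new_bin = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
print("Decoded (x, y) velocity command:", (new_bin @ W).ravel())

In practice, decoders of this kind have to be retrained as the brain adapts to the device, which is part of why the neural plasticity question discussed below matters so much.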

Current research at CNEP is focused on several key goals. One is to increase the number of simultaneous commands that can be distinguished by signal processing software, so that the prosthetics can access the full complexity of natural human motion. Another is to improve the precision and accuracy of the hardware (i.e., the implanted electrodes) that detects the electrical brain signals and communicates them to a computer processor. And on the physiological side, work remains to be done on understanding neural plasticity, or how the brain rewires itself to incorporate the artificial limb into the natural thought process. While this sounds like a daunting agenda, many of these fields are already quite mature. It is not far-fetched to believe that devices fit for human clinical trials will be ready in the near future.

Engineers and medical doctors have long enjoyed fruitful collaborations that have fueled progress in fast-paced research fields like biomedical engineering and nuclear medicine. But neural engineering? That’s a whole different ball game. Progress in this field will generate vast opportunities for researchers to benefit the lives of patients suffering from a variety of afflictions, not only physical ones. Once they have finished coming up with solutions for serious neurological conditions, I do have a small question for them: when can I stop going to class and just download all of the world’s knowledge into my brain?

Carmena JM, Lebedev MA, Crist RE, O’Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, & Nicolelis MA (2003). Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biology, 1(2). PMID: 14624244
