With brain implants, scientists hope to translate paralyzed patients’ thoughts into speech
Via STAT
The brain surgeon began as he always does, making an incision in the scalp and gently spreading it apart to expose the skull. He then drilled a 3-inch circular opening through the bone, down to the thick, tough covering called the dura. He sliced through that, and there in the little porthole he’d made was the glistening, blood-flecked, pewter-colored brain, ready for him to approach the way spies do a foreign embassy: He bugged it.
Dr. Ashesh Mehta, a neurosurgeon at the Feinstein Institute for Medical Research on Long Island, was operating on his epilepsy patient to determine the source of seizures. But the patient agreed to something more: to be part of an audacious experiment whose ultimate goal is to translate thoughts into speech.
While he was in there, Mehta carefully placed a flat array of microelectrodes on the surface of the brain’s left side, over areas involved in both listening to and formulating speech. The electrodes eavesdrop on the electrical impulses that crackle through the gray matter when a person hears in the “mind’s ear” the words he intends to articulate (often so quickly it’s barely conscious), then relay those signals wirelessly to a computer that decodes them. The electrodes and the rest of the system hold the promise of being the first “brain-computer interface” to go beyond movement and sensation.
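In engineering terms, that eavesdropping is a three-stage pipeline: record voltages from the electrode array, extract a feature such as high-gamma band power (a range widely used in speech-decoding work), and hand the features to a trained decoder. Here is a minimal sketch of the first two stages; the sampling rate, band edges, and array size are illustrative assumptions, not details from the experiment:

```python
# Toy version of the recording-to-features step: band-pass raw electrode
# voltages to the high-gamma range, then take the power envelope that a
# decoder would consume. All numbers below are assumptions for illustration.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000            # sampling rate in Hz (assumed)
LOW, HIGH = 70, 150  # high-gamma band in Hz (a common choice in the field)

def high_gamma_envelope(raw, fs=FS):
    """raw: (n_electrodes, n_samples) voltages -> same-shape power envelope."""
    b, a = butter(4, [LOW / (fs / 2), HIGH / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw, axis=-1)     # zero-phase band-pass filter
    return np.abs(hilbert(filtered, axis=-1))   # amplitude of the analytic signal

# One second of fake data from a hypothetical 64-channel surface array.
rng = np.random.default_rng(0)
raw = rng.standard_normal((64, FS))
features = high_gamma_envelope(raw)
print(features.shape)  # (64, 1000): per-electrode envelopes, ready for a decoder
```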
If all goes well, the system will conquer the field’s Everest: developing a brain-computer interface that could enable people with a spinal cord injury, locked-in syndrome, ALS, or other paralyzing condition to talk again.
The technology needn’t give these patients the ability to deliver a Shakespeare soliloquy to be useful. More and more experts therefore think a system that decodes whether a person is silently saying “yes” or “no,” or “hungry,” “pain,” or “water,” is now within reach, thanks to parallel advances in neuroscience, engineering, and machine learning.
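In machine-learning terms, that modest goal is a small-vocabulary classification problem: map a window of neural features to one of a handful of words. The sketch below trains a classifier on synthetic stand-in data; it illustrates the framing only and implies nothing about the real recordings or models:

```python
# Toy small-vocabulary decoder: logistic regression over made-up neural
# features. The feature counts and class offsets are fabricated so the
# example is learnable; no real data is involved.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

WORDS = ["yes", "no", "hungry", "pain", "water"]
rng = np.random.default_rng(1)

# 500 pretend trials x 64 electrodes of averaged high-gamma power,
# shifted per class so the classes are separable.
labels = rng.integers(len(WORDS), size=500)
X = rng.standard_normal((500, 64)) + labels[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("decoded word:", WORDS[int(clf.predict(X_test[:1])[0])])
```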
“We think we’re getting enough of an understanding of the brain signals that encode silent speech that we could soon make something practical,” said Brian Pasley of the University of California, Berkeley. “Even something modest could be meaningful to patients. I’m convinced it’s possible.”
Further in the future, Facebook and others envision similar technology facilitating consumer products that translate thoughts into text messages and emails. No typing or Siri necessary.
The first brain-computer interfaces (BCIs) read electrical signals in the motor cortex corresponding to the intention to move, and use software to translate the signals into instructions to operate a computer cursor or robotic arm. In 2016, scientists at the University of Pittsburgh went a step further, adding sensors to a mind-controlled robotic arm so that it produced sensations of touch.
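The movement decoders in such systems are classically linear: fit a matrix that maps a vector of neural firing rates to, say, a 2-D cursor velocity. A ridge-regression sketch on simulated data follows; real systems typically use more elaborate filters, such as Kalman decoders, and everything below is assumed for illustration:

```python
# Toy linear motor decoder: recover a firing-rate-to-velocity mapping by
# ridge regression on simulated data. Neuron count, tuning, and noise are
# all fabricated for the example.
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_samples = 96, 2000

true_map = rng.standard_normal((n_neurons, 2))                     # hidden tuning
rates = rng.poisson(5, size=(n_samples, n_neurons)).astype(float)  # spike counts
velocity = rates @ true_map + rng.standard_normal((n_samples, 2))  # noisy target

# Ridge solution: W = (R^T R + lambda * I)^(-1) R^T V
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_neurons), rates.T @ velocity)

decoded = rates @ W
err = np.mean(np.linalg.norm(decoded - velocity, axis=1))
print(f"mean velocity error: {err:.3f}")  # small when the linear map is recovered
```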