How AI-powered brain implants are helping an ALS patient communicate

Nearly a century after German neurologist Hans Berger pioneered the mapping of human brain activity in 1924, researchers at Stanford University have designed two tiny implantable sensors that, paired with a computer algorithm, translate attempted speech into words, helping paralyzed people express themselves. On August 23, a study demonstrating the use of such a device on human patients was published in Nature. (A similar study was also published in Nature on the same day.)

What the researchers created is a brain-computer interface (BCI)—a system that translates neural activity into intended speech—that helps paralyzed individuals, such as those with brainstem strokes or amyotrophic lateral sclerosis (ALS), express their thoughts through a computer screen. Once implanted, the pill-sized sensors relay electrical signals from the cerebral cortex, a part of the brain associated with memory, language, problem-solving, and thought, to a custom-made AI algorithm, which uses them to predict intended speech.

This BCI learns to identify distinct patterns of neural activity associated with each of 39 phonemes, the smallest units of sound in speech. These are sounds within the English language such as “qu” in quill, “ear” in near, or “m” in mat. As a patient attempts speech, the decoded phonemes are fed into a complex autocorrect-like program that assembles them into words and sentences reflecting the intended speech. Through ongoing practice sessions, the AI software progressively improves at interpreting the user’s brain signals and translating their speech intentions.
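The phoneme idea can be made concrete with a toy lexicon. English speech-recognition systems commonly describe words with an ARPAbet-style phoneme inventory (as in the CMU Pronouncing Dictionary); the mini-lexicon below is a hypothetical stand-in, not the study's actual phoneme set or decoder.

```python
# Toy illustration of words as phoneme sequences.
# MINI_LEXICON is hypothetical; real systems use a full
# pronouncing dictionary over the 39-phoneme inventory.
MINI_LEXICON = {
    "mat":   ["M", "AE", "T"],
    "near":  ["N", "IH", "R"],
    "quill": ["K", "W", "IH", "L"],
}

def phonemes(word: str) -> list[str]:
    """Look up a word's phoneme sequence in the mini-lexicon."""
    return MINI_LEXICON[word.lower()]

print(phonemes("mat"))  # ['M', 'AE', 'T']
```

A decoder working at this level only has to distinguish a few dozen sound categories rather than an open-ended vocabulary, which is what makes the approach tractable.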

“The system has two components. The first is a neural network that decodes phonemes, or units of sound, from neural signals in real-time as the participant is attempting to speak,” says the study’s co-author Erin Michelle Kunz, an electrical engineering PhD student at Stanford University, via email. “The output sequence of phonemes from this network is then passed into a language model which turns it into text of words based on statistics in the English language.” 
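The two-stage structure Kunz describes can be sketched in a few lines. Both stages below are stand-ins: the decoder is faked with a fixed output (the real system is a neural network reading cortical signals), and the "language model" is just a unigram-weighted lexicon lookup rather than the large statistical model the study used.

```python
# Sketch of the two-stage pipeline: (1) decode phonemes from neural
# features, (2) a language model turns phoneme groups into words.
LEXICON = {
    "i":   ["AY"],
    "am":  ["AE", "M"],
    "mat": ["M", "AE", "T"],
}
# Assumed word frequencies standing in for "statistics in the
# English language".
UNIGRAM = {"i": 0.5, "am": 0.3, "mat": 0.2}

def decode_phonemes(neural_features):
    # Placeholder for the recurrent-network decoder: in the real
    # system this runs on cortical signals in real time.
    return [["AY"], ["AE", "M"]]

def phonemes_to_words(phoneme_groups):
    words = []
    for group in phoneme_groups:
        # Among lexicon entries matching this phoneme group,
        # prefer the most frequent word.
        candidates = [w for w, p in LEXICON.items() if p == group]
        words.append(max(candidates, key=UNIGRAM.get))
    return words

print(phonemes_to_words(decode_phonemes(None)))  # ['i', 'am']
```

Separating the sound-level decoder from the word-level language model means each part can be trained and improved independently, one reason this architecture is common in speech recognition.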

Over 25 four-hour training sessions, Pat Bennett, who has ALS—a disease that attacks the nervous system, impairing physical movement and function—practiced random sentences chosen from a database. For example, she would try to say: “It’s only been that way in the last five years” or “I left right in the middle of it.” When Bennett, now 68, attempted to read a provided sentence, the implanted sensors registered her brain activity and relayed the signals through attached wires to the AI software, which decoded the attempted speech into phonemes and strung them into words displayed on the computer screen. The algorithm, in essence, acts like a phone’s autocorrect kicking in during texting.

“This system is trained to know what words should come before other ones, and which phonemes make what words,” said Frank Willett, a Stanford research scientist and co-author of the study. “If some phonemes were wrongly interpreted, it can still take a good guess.”

By participating in twice-weekly software training sessions for almost half a year, Bennett was able to have her attempted speech translated at a rate of 62 words a minute, faster than previously recorded machine-based speech technology, say Kunz and her team. Initially, the model's vocabulary was restricted to 50 words for simple sentences, built from words such as “hello,” “I,” “am,” “hungry,” “family,” and “thirsty”—with an error rate under 10 percent. It was then expanded to 125,000 words, with an error rate just under 24 percent.

While Willett explains this is not “an actual device people can use in everyday life,” it is a step toward faster communication, so speech-disabled persons can participate more fully in everyday life.

“For individuals that suffer an injury or have ALS and lose their ability to speak, it can be devastating. This can affect their ability to work and maintain relationships with friends and family in addition to communicating basic care needs,” Kunz says. “Our goal with this work was aimed at improving quality of life for these individuals by giving them a more naturalistic way to communicate, at a rate comparable to typical conversation.” 

