Casey Harrel, a 45-year-old Californian with ALS, is speaking for the first time in four years thanks to a brain-computer interface (BCI). The device reads his neural signals and reconstructs his voice, complete with his intonation and style. To make this possible, 256 electrodes were implanted in the areas of his cerebral cortex that control speech and other vocal tract muscles. The research team worked with Harrel to train the AI model, enabling real-time communication after years in which he struggled to make himself understood.
The research team has developed a BCI that not only interprets what Harrel wants to say in real time but also captures his intonation and speaking style.
A total of 256 electrodes, each 1.5 millimeters long, were implanted in Harrel's cerebral cortex, specifically in the region that controls speech, the jaw and other vocal tract muscles.
"With this data, we train the artificial intelligence model to decode the neural signal and transform it into the sounds he wants to produce," explains Maitreyee Wairagkar.
Before the BCI, Harrel had to repeat what he wanted to say several times to his care team, who pieced it together from context.