Revolutionary brain implant enables speech in paralyzed ALS patient
UC Davis researchers built a brain-computer interface (BCI) that lets an ALS patient talk again by decoding his brain signals into speech with 97% accuracy. The system reads the brain's speech-related activity and produces a synthesized voice that sounds like his own, with a delay of under 10 milliseconds.
How the system works
Tiny sensors implanted in the part of the brain that controls speech pick up signals when he tries to talk. AI models then translate these signals into sentences, adding natural qualities like tone and intonation. To make the output personal, the system was trained on recordings of his voice from before ALS, cloning how he used to sound.
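The pipeline described above can be sketched in simplified form. This is a hypothetical illustration only, not the UC Davis implementation: the phoneme templates, the nearest-template decoder, and the `synthesize` stand-in are all invented for the example, and a real system would use trained neural networks at each stage.

```python
import time

# Hypothetical stand-in for learned neural patterns: a real system
# learns mappings from electrode-array activity to speech units.
PHONEME_TEMPLATES = {
    "HH": [0.9, 0.1, 0.2],
    "AY": [0.1, 0.8, 0.3],
}

def decode_sample(features):
    """Stage 2: map one neural feature vector to the speech unit
    whose stored pattern is closest (nearest-template decoding)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PHONEME_TEMPLATES, key=lambda p: dist(features, PHONEME_TEMPLATES[p]))

def synthesize(phonemes, voice="cloned-from-old-recordings"):
    """Stage 3: stand-in for a voice model trained on the patient's
    pre-ALS recordings, which renders speech units as audio."""
    return f"[{voice}] " + "-".join(phonemes)

# Stage 1 simulated: a short stream of neural feature vectors
# recorded while the user attempts to say "hi" (HH, AY).
stream = [[0.85, 0.12, 0.18], [0.15, 0.78, 0.33]]

start = time.perf_counter()
phonemes = [decode_sample(s) for s in stream]
audio = synthesize(phonemes)
latency_ms = (time.perf_counter() - start) * 1000

print(audio)  # [cloned-from-old-recordings] HH-AY
```

The key design point the article highlights is latency: each incoming sample is decoded and voiced immediately rather than waiting for a full sentence, which is what makes a sub-10-millisecond, conversational response possible.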
Tech's potential and advantages
This tech gives people who've lost their voices, such as those with ALS, a way to converse in real time in their own voice rather than a generic text-to-speech voice. It could make conversations more natural and help users stay independent. Notably, the system worked within minutes of being switched on, suggesting it could be put to use quickly in clinical settings.