A leap in technology at the crossroads of UC San Francisco and UC Berkeley has given Ann, a paralyzed woman, a renewed chance to communicate. This is the story of how Artificial Intelligence (AI) played a crucial part in making it possible.
A New Voice for Ann
After a devastating brainstem stroke, Ann was trapped in silence until a Brain-Computer Interface (BCI) paved the way for her to articulate her thoughts. At roughly 80 words per minute, the BCI leaves her previous device, which managed just 14 words per minute, far behind.
Tiny Wires and Brain Signals: How They Work Together
A fine sheet of 253 electrodes, strategically placed on the surface of Ann’s brain, became her new vocal cords. These electrodes capture the signals her brain sends toward her speech muscles, creating a bridge from thought to text.
How Brain Signals Turn into Words
By breaking speech down into its basic units, phonemes, the researchers simplified the translation from brain signals to text: rather than learning thousands of whole words, the system only needs to recognize a small set of speech sounds, which are then assembled into words. This meticulous decoding is a cornerstone of fluid communication.
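To make the idea concrete, here is a minimal sketch of the final step of such a pipeline: turning a stream of predicted phonemes into words. Everything here is illustrative, not the actual model or vocabulary from the study — the phoneme symbols, the tiny lexicon, the "_" blank marker, and the "|" word boundary are all assumptions for the sake of the example.

```python
# Illustrative sketch only: maps a hypothetical stream of decoded
# phonemes to words. Symbols, lexicon, and markers are assumptions,
# not the actual system used in the UCSF/Berkeley study.

# A tiny phoneme-to-word dictionary using ARPAbet-style symbols.
LEXICON = {
    ("HH", "AH", "L", "OW"): "hello",
    ("W", "ER", "L", "D"): "world",
}

def collapse(stream):
    """Merge adjacent duplicate predictions, then drop '_' blanks,
    as a CTC-style speech decoder does before word lookup."""
    merged = [p for i, p in enumerate(stream) if i == 0 or p != stream[i - 1]]
    return [p for p in merged if p != "_"]

def decode(stream):
    """Split the collapsed phoneme sequence on '|' word boundaries
    and look each phoneme group up in the lexicon."""
    phones = collapse(stream)
    words, current = [], []
    for p in phones + ["|"]:  # trailing '|' flushes the last word
        if p == "|":
            if current:
                words.append(LEXICON.get(tuple(current), "<unk>"))
                current = []
        else:
            current.append(p)
    return " ".join(words)

# Per-frame predictions, with repeats and blanks as a classifier
# running on overlapping windows of neural data might emit them.
frames = ["HH", "HH", "AH", "_", "L", "OW", "|", "W", "ER", "ER", "L", "D"]
print(decode(frames))  # hello world
```

The small-vocabulary trick is the point: a classifier choosing among a few dozen phonemes is far easier to train on limited neural data than one choosing among tens of thousands of words.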
Breathing Emotions into Avatars
Beyond text, the BCI brings a virtual avatar to life, mirroring Ann’s facial expressions. AI-driven software from Speech Graphics, working in concert with the BCI, renders her emotions and spoken words visually, enriching the conversation.
Towards a Future: More Voices Restored
This work is more than a victory for Ann. It is a step toward an FDA-approved system that turns thoughts into words for people who cannot speak. Thanks to this research and to AI, fluent communication could soon be within reach for many more.
Check out our previous blog on AI in modern healthcare.