The human brain has long been too complex to fully decode, but artificial intelligence is beginning to change that.
A 52-year-old woman, paralysed by a stroke 19 years ago, participated in a Stanford University study that allowed her to communicate through thought alone. After she was fitted with a tiny array of electrodes in her frontal lobe, an AI system decoded her internal monologue in real time and displayed it as text on a screen. This breakthrough offers hope for patients with paralysis or neurodegenerative conditions like ALS.
In a parallel study in Japan, researchers developed a “mind captioning” method that translates brain activity into detailed visual descriptions using non-invasive brain scans and AI tools. Together, these advances give scientists an unprecedented view into the human mind and could eventually transform communication for everyone.
Neuroengineer Maitreyee Wairagkar predicts that brain-computer interface technology will be commercialized in the next few years. Companies like Neuralink are already working on chips designed to bring these lab breakthroughs into everyday life.
Brain-computer interfaces have been in development for decades. In 1969, Eberhard Fetz demonstrated that monkeys, rewarded with food, could learn to move a meter needle by modulating the firing of a single neuron. Around the same time, Spanish scientist Jose Delgado remotely stimulated a bull’s brain, stopping it mid-charge. Today, AI has accelerated these decades of research, moving thought-to-text communication from theory to reality.
