
"We're racing towards a future in which devices will be able to read our thoughts. You see signs of it everywhere, from brain-computer interfaces to algorithms that detect emotions from facial scans. And though the tech remains imperfect, it's getting closer all the time: now a team of scientists say they've developed a model that can generate descriptions of what people's brains are seeing by simply analyzing a scan of their brain activity."
"The implications of such technology are a double-edged sword: on the one hand, it could give a voice to people who struggle speaking due to stroke, aphasia, and other medical difficulties, but on the other hand, it may threaten our mental privacy in an age when many other facets of our lives are surveilled and codified. But the team stress the model can't decode your private thoughts. "Nobody has shown you can do that, yet," Huth added."
Mind captioning uses multiple AI models to translate brain activity into textual descriptions of what a person is viewing. First, a deep language model analyzed the captions of more than 2,000 short videos, producing a unique "meaning signature" for each clip. A second model then learned to map MRI scans from six participants who watched those videos onto the corresponding signatures. To decode a new brain scan, the combined decoder predicts its meaning signature, and an AI text generator retrieves and iteratively refines matching sentences into a descriptive caption. Current implementations cannot decode private, unobserved thoughts. The technique could aid people with speech impairments, but it also poses mental-privacy risks.
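To make that pipeline concrete, here is a minimal sketch under stated assumptions: a sentence-transformers embedding model stands in for the study's deep language model, ridge regression for the learned scan-to-signature mapping, and random arrays for the fMRI data. The study's final stage iteratively generates and refines text; this sketch reduces that to nearest-caption retrieval.

```python
# Minimal sketch of the mind-captioning pipeline described above.
# Assumptions (not from the source): sentence-transformers stands in for
# the paper's deep language model, ridge regression for the brain-to-
# signature mapping, and random arrays for the fMRI scans.
import numpy as np
from sklearn.linear_model import Ridge
from sentence_transformers import SentenceTransformer

# 1. Turn each video's caption into a "meaning signature" (an embedding).
captions = [
    "a dog chases a ball across a lawn",
    "a person pours coffee into a mug",
    "waves crash against a rocky shore",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical stand-in
signatures = embedder.encode(captions)              # shape: (n_videos, dim)

# 2. Learn to map brain activity onto those signatures.
#    Real input would be preprocessed fMRI voxel patterns; random here.
rng = np.random.default_rng(0)
brain_scans = rng.normal(size=(len(captions), 5000))  # (n_videos, n_voxels)
decoder = Ridge(alpha=1.0).fit(brain_scans, signatures)

# 3. Decode a new scan: predict its signature, then retrieve the caption
#    whose signature is most similar (cosine similarity).
new_scan = brain_scans[0] + rng.normal(scale=0.1, size=5000)
predicted = decoder.predict(new_scan[None, :])[0]

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

best = max(range(len(captions)), key=lambda i: cosine(predicted, signatures[i]))
print("Decoded description:", captions[best])
```

In the study itself, the retrieval step is only a starting point: a text generator proposes candidate sentences and keeps the ones whose signatures move closer to the predicted one, which is why the output reads as a description rather than a copied caption.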
Read at Futurism