
New Brain Device Is First To Read Out Inner Speech
An anonymous reader quotes a report from Scientific American: After a brain stem stroke left him almost entirely paralyzed in the 1990s, French journalist Jean-Dominique Bauby wrote a book about his experiences -- letter by letter, blinking his left eye in response to a helper who repeatedly recited the alphabet. Today people with similar conditions often have far more communication options. Some devices, for example, track eye movements or other small muscle twitches to let users select words from a screen. And on the cutting edge of this field, neuroscientists have more recently developed brain implants that can turn neural signals directly into whole words. These brain-computer interfaces (BCIs) largely require users to physically attempt to speak, however -- and that can be a slow and tiring process. But now a new development in neural prosthetics changes that, allowing users to communicate by simply thinking what they want to say.
The new system relies on much of the same technology as the more common "attempted speech" devices. Both use sensors implanted in a part of the brain called the motor cortex, which sends motion commands to the vocal tract. The brain activation detected by these sensors is then fed into a machine-learning model to interpret which brain signals correspond to which sounds for an individual user. It then uses those data to predict which word the user is attempting to say. But the motor cortex doesn't only light up when we attempt to speak; it's also involved, to a lesser extent, in imagined speech. The researchers took advantage of this to develop their "inner speech" decoding device and published the results on Thursday in Cell. The team studied three people with amyotrophic lateral sclerosis (ALS) and one with a brain stem stroke, all of whom had previously had the sensors implanted. Using this new "inner speech" system, the participants needed only to think a sentence they wanted to say and it would appear on a screen in real time. While previous inner speech decoders were limited to only a handful of words, the new device allowed participants to draw from a dictionary of 125,000 words. To help keep private thoughts private, the researchers implemented a code phrase "chitty chitty bang bang" that participants could use to prompt the BCI to start or stop transcribing.
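The summary's detail about the "chitty chitty bang bang" code phrase amounts to a gating layer on top of the word decoder: the stream of decoded words is only transcribed between occurrences of the phrase, and the phrase itself is never shown. The paper's actual pipeline is far more involved (neural decoding plus language modeling), but the gating logic alone can be sketched as a toy generator; all names here are illustrative, not from the paper.

```python
# Toy sketch of wake-phrase gating over a stream of already-decoded words.
# Words that could be the start of the code phrase are held back until we
# know whether the phrase completed; a real streaming matcher would need
# proper prefix handling (e.g. KMP) for overlapping repeats.

CODE_PHRASE = ("chitty", "chitty", "bang", "bang")

def gate_transcription(decoded_words):
    """Yield only words decoded while transcription is toggled on.

    The code phrase flips the gate on/off and is never itself transcribed.
    """
    transcribing = False
    pending = []  # held words that might still complete the code phrase
    for word in decoded_words:
        pending.append(word)
        lowered = tuple(w.lower() for w in pending)
        if lowered == CODE_PHRASE:
            transcribing = not transcribing  # phrase completed: flip the gate
            pending.clear()
        elif CODE_PHRASE[:len(pending)] != lowered:
            # No longer a prefix of the code phrase: release held words.
            if transcribing:
                yield from pending
            pending.clear()
    if transcribing:
        yield from pending  # flush anything still held at end of stream
```

With transcription initially off, a stream like `"hello chitty chitty bang bang this is private chitty chitty bang bang hello again"` transcribes only `this is private`: the leading and trailing words fall outside the gated window, and the code phrases themselves are swallowed.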
Job Obsolescence (Score:3)
Re: (Score:3)
I like how anonymous posters know one another personally and recognize each other's posts. Not suspicious at all!
Re: (Score:2)
One more ruble for this AC. How much more obvious can it be?
The first jobs lost to AI, paid hate speech.
1999 cat brain is calling (Score:3)
https://www.wired.com/1999/10/... [wired.com]
Wired Magazine - Oct 7, 1999 12:00 PM
A Cat's Eye View
In a dramatic demonstration of mind reading, neuroscientists have created videos of what a cat sees by using electrodes implanted in the animal's brain. Yang Dan, Fei Li and Garrett Stanley of the University of California, Berkeley, were able to reconstruct in startling detail scenes flashed before a cat's eyes. The reconstructed scenes clearly demonstrate [...]
The researchers attached electrodes to 177 cells in an anesth
Re: (Score:2)
You still need to beat them until they start thinking what you want...
ugh (Score:2)
I remember a time when I would've been really excited and hopeful about this. Where are the aliens who are supposed to come and decide we're not ready for technology?
And so it begins... (Score:5, Insightful)
Whilst this is great for people who cannot communicate by other means, the potential downward spiral of this is frightening. Thought is the last bastion of freedom, you can think whatever you want, no limits. The development of this kind of tech will go from physically connected medical devices, to physically connected interrogation devices (if you don't consent, you're guilty, the whole "what do you have to hide if you're innocent" trope), then the leap to wireless devices and finally just scanners everywhere that scan people's thoughts. Feeling hungry? Here's an ad for your favourite food. Feeling angry, wishing you could just slap whoever cut you off in traffic? Here's an instant fine or arrest. A dystopian nightmare.
Chitty Chitty Bang Bang? (Score:2)
To help keep private thoughts private, the researchers implemented a code phrase "chitty chitty bang bang" that participants could use to prompt the BCI to start or stop transcribing.
What if you want to write an article about that book/movie?
Detrimental Invention (Score:3, Informative)
If this is indeed true and goes ahead, this could be a contender for one of the most detrimental inventions for humankind, next to nuclear weapons.
Think privacy, think interrogations, think political persecution, think manipulation, etc.
I know it's great tech for people who physically need it but all in all I think there is much more potential harm than good from this.
Re: (Score:1)
Real time is not what you think (Score:2)
Altman's brain interface to be known as Unit 731 (Score:1)
If it's not clear why Altman wants his Mengele Labs, this is it. Altman knows just making a transformer model bigger isn't going to work, so now he's looking for another idea to steal, even if it's right out of decades-old sci-fi.
Can't fraud your way into reasoning? Just use a reasoning front end. Where do you get one? Wire your AI up to a human brain. Unfortunately, Musk is ahead of you on that. But hey, at least you're still friends with Trump, he'll turn his back on your medical experimentation and
social media (Score:2)
Social media already gives me too much access to too many people's inner monologues that should have been kept "inner", thanks.