Think And Click 316
cecil36 writes: "Yahoo! has reported that scientists have discovered neural technology that allowed a monkey to control a mouse cursor using the brain. ... Further reading states that this technology can be used with the paralyzed or those with Lou Gehrig's Disease to allow them to use their computers."
nothing new (Score:5, Informative)
The Cyberlink Interface: Hands Free Brain-Body Actuated Control for Augmentation and Enhancement of Human Computer Interaction [csun.edu](produced in 1999), and their website [brainfingers.com]
And an article from last year about a similar device. [google.com]
Re:old news (Score:3, Informative)
The Air Force must not have gotten too far because DARPA is currently requesting proposals for research leading to a Brain-Machine interface.
The problem with most brain-machine interfaces is the skull and the fluid surrounding the brain. Both of these elements act as spatial and temporal filters, degrading the usefulness of electrodes placed outside the skull as control sensors.
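To make the filtering point concrete, here is a minimal sketch (my own illustration, not from any of the cited projects): tissue and bone smear a signal out in time, which behaves roughly like a moving-average low-pass filter, so a sharp spike-like component is attenuated far more than a slow cortical rhythm.

```python
import math

def moving_average(signal, window):
    """Crude temporal smoothing, standing in for the smearing effect
    of skull and cerebrospinal fluid on signals recorded at the scalp."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def amplitude(xs):
    return max(xs) - min(xs)

# Synthetic recording: a slow rhythm (2 cycles over the trace) plus a
# fast spike-like component (80 cycles over the trace).
n = 1000
slow = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]
fast = [0.5 * math.sin(2 * math.pi * 80 * t / n) for t in range(n)]

slow_out = moving_average(slow, window=25)
fast_out = moving_average(fast, window=25)

# The slow rhythm passes almost untouched; the fast component is
# nearly wiped out -- exactly the information an implanted electrode
# keeps and a scalp electrode loses.
print("slow retained:", amplitude(slow_out) / amplitude(slow))
print("fast retained:", amplitude(fast_out) / amplitude(fast))
```

The window length and frequencies are arbitrary; the point is only that the high-frequency content (single-unit activity) is the part that does not survive the trip through the skull, which is why scalp EEG systems work with slow averaged rhythms instead.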
Just a small clarification... (Score:2, Informative)
Commercial applications already exist (Score:2, Informative)
brain actuated mouse cursor gizmo out for years.
http://www.brainfingers.com/cyberlink.htm
Interesting press release (Score:3, Informative)
It's kinda weird when you know a bit about the work behind these press releases, and then see how it is actually presented to the "lay" public.
Personally, I think the project has a low probability of success. A neural prosthetic device should be interfaced with as peripheral a part of the nervous system as possible; this group has chosen as abstract a part of the nervous system as possible. But maybe they'll prove me wrong.
Further Reading (Score:4, Informative)
The actual research described in the Yahoo article, using implanted electrodes, seems a bit strange - though the claim to have identified a few individual neurons is interesting.
Most of the other groups are working with stick-on electrodes. At the moment all they can do is move a mouse around a screen and click, but progress seems to be good - correct recognition is around 70% after 5 one-hour sessions, which sounds impressive to me. The big obstacle to getting this into service for real people with disabilities is that the hardware is currently a bit chunky, especially the EEG machine. But we all know what happens to hardware: it gets smaller and cheaper, very quickly.
Oh - and, yes, the guy I talked to says the thing that secretly drives him is eventually using it to play Quake. (Wonderful thing, altruism.)
Now wouldn't that be cool. (Unfortunately you have to shave your head, I think!)
More information (Score:3, Informative)
Daniella is part of Richard Andersen's lab [caltech.edu] at Caltech, which researches motor planning and spatial orientation. It is a very interesting place.
As pointed out in the article, the area from which they record makes this experiment significantly different from previous ones. Several labs have done similar work, but they were less sure of the origin of their signal. Much of the sensory and motor areas of the cortex are right next to each other, so it was not clear whether the recorded signals were motor signals or sensory signals driven by stretch sensors within muscles or something similar. The area Daniella records from is fairly far away from sensory cortex, so there is much less chance that they are recording feedback from the sensory side. For comparison, examine an older story [slashdot.org] from a team of competitors.
You can buy this off the shelf today (Score:2, Informative)
Re:Integrating protheses in the neural loop (Score:3, Informative)
People without the most important of these channels basically have Parkinson's. They can initiate action, but the action is delayed because they have a difficult time ending their previous action. Similarly, they shake violently the entire time, because of the grossly overestimated signals coming from their motor cortex. This makes me wonder how fine the mouse movement control could have been on the part of the monkey. People with Parkinson's are severely disabled, but might still be able to complete such a large-scale task with 90% accuracy, too.
To really use this for prosthetics, you'd have to not only detect the impulse to commit the action, but you'd also probably have to send signals back up via the remaining somatic nerves [in the case of amputation] or directly into the brain [in the case of degenerative disease.] Mental signals are not a matter of On or Off, they are on a sliding scale from strongly inhibitory to strongly excitatory. Signal regulation is the golden egg.
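As a sketch of what "regulation" means here (my own toy example; the dead-zone and gain values are made up, not from any cited system): a controller has to map a graded drive, running from strongly inhibitory to strongly excitatory, onto a bounded output, ignoring weak noisy drive and saturating strong drive rather than treating the signal as on/off.

```python
def regulate(drive, deadzone=0.1, gain=2.0, limit=1.0):
    """Map a graded neural drive in [-1, 1] to a bounded velocity command.

    drive < 0 is inhibitory, drive > 0 is excitatory. Weak drive inside
    the dead zone is ignored (so noise doesn't twitch the effector);
    strong drive is scaled by a gain and clamped to a safe limit.
    """
    if abs(drive) < deadzone:
        return 0.0
    sign = 1.0 if drive > 0 else -1.0
    v = gain * (drive - sign * deadzone)
    return max(-limit, min(limit, v))

# Weak noise is suppressed, moderate drive maps proportionally,
# strong drive saturates at the limit.
for d in (-1.0, -0.05, 0.0, 0.2, 0.8):
    print(d, regulate(d))
```

A binary on/off decoder would skip the middle of this curve entirely, which is exactly the difference between clicking a virtual button and smoothly steering a prosthetic limb.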
If these are the few neurons responsible for initiating a reaching action, how can the rest of the system determine whether the monkey means to reach normally or reach virtually? I'd like to know whether there was any twitch in the arm while the mental cursor was moving.
There is a fantastic difference between "up, down, left & right" and "reach for and grasp object 3 feet from here." It's nice to see enthusiasm, but it's a little premature.
Done on Humans in March 1998 (Score:2, Informative)