Brain-Computer Interface Makes Learning As Simple As Waving

vinces99 writes "Small electrodes placed on or inside the brain allow patients to interact with computers or control robotic limbs simply by thinking about how to execute those actions. This technology could improve communication and daily life for a person who is paralyzed or has lost the ability to speak from a stroke or neurodegenerative disease. Now researchers have demonstrated that when humans use this brain-computer interface, the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand (abstract). That means learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Wednesday June 12, 2013 @04:04AM (#43982655)

    Title is misleading clickbait

  • by Cenan ( 1892902 ) on Wednesday June 12, 2013 @04:26AM (#43982749)

    The title is very much misleading, as per usual.

    /. Title: Brain-Computer Interface Makes Learning As Simple As Waving
    Article Title: New tasks become as simple as waving a hand with brain-computer interfaces.

    Now, University of Washington researchers have demonstrated that when humans use this technology – called a brain-computer interface – the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand. Learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed.

    It's not about learning; it's about an interface that makes controlling a robotic arm as easy as moving your actual arm. Big difference.

  • by mandginguero ( 1435161 ) on Wednesday June 12, 2013 @12:39PM (#43986095)

    Compared to other BCIs, this does sound like the easiest to learn to control. Some of the fastest alternatives rely on what are called steady-state visual evoked potentials (SSVEPs), which entrain your visual cortex to the same frequency as a light flashing on a computer screen. Once entrained, deviations caused by looking at other parts of the screen can be detected. This, however, leaves you with fairly few options for different commands on the screen at a given time. Another of the faster-acquisition BCIs is based on the P300 brainwave. These typically work by selecting a character from a grid of letters and numbers on a computer screen. The computer cycles through all the characters, flashing each one for a brief moment. The user's task is to focus on the character they want, and when it flashes, the brain produces a slightly stronger response about 300 milliseconds afterwards. While both the SSVEP and P300 systems take only about 10-15 minutes to figure out how to use (compared to 6-8 hours for learning to control a brain rhythm like the mu wave), they leave very few options in terms of commands you can execute.

    On the practical side of controlling artificial limbs by thinking about them, this is missing a crucial piece of the puzzle: feedback! How well could you close your cyborg hand around a fragile plastic cup, without dropping or smashing it, if you can't 'feel' the surface and flexibility of the object in your hand? Executing the movement is indeed fancy, but to do it well you'll need to implant some more electrodes in the parietal cortex, where body sensations are mapped and which connects reciprocally with the motor representations of the same body parts. Also, this would likely only work for someone who becomes paralyzed or loses a limb. Those with congenital problems, unless turned into a cyborg during infancy, would never develop the proper representations of limbs they don't own or can't use.
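
The P300 scheme described in the thread boils down to epoch averaging: record the response to each flash, average the repeated trials for each character, and pick the character whose averaged response is largest around 300 ms after the flash. A minimal sketch with synthetic data (the function names, sampling rate, and window values here are illustrative assumptions, not from any real BCI package):

```python
import random

def p300_select(epochs, fs=250, window=(0.25, 0.4)):
    """Pick the flashed item whose trial-averaged epoch has the
    largest mean amplitude in the post-stimulus window (~300 ms)."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    best_item, best_score = None, float("-inf")
    for item, trials in epochs.items():
        n = len(trials)
        # average the repeated flash responses for this item
        avg = [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]
        score = sum(avg[lo:hi]) / (hi - lo)
        if score > best_score:
            best_item, best_score = item, score
    return best_item

# synthetic demo: only the attended character gets a P300-like bump
random.seed(0)
fs = 250  # samples per second; one 1-second epoch per flash
def make_trial(attended):
    trial = [random.gauss(0, 0.1) for _ in range(fs)]
    if attended:
        for i in range(int(0.25 * fs), int(0.4 * fs)):
            trial[i] += 1.0  # stronger response ~300 ms after the flash
    return trial

epochs = {c: [make_trial(c == "A") for _ in range(10)] for c in "ABC"}
print(p300_select(epochs, fs))  # prints "A"
```

Averaging across the ten repeats is what makes this work at all: a single trial's P300 is buried in noise, but the noise cancels across repeats while the locked response does not, which is also why P300 spellers flash each character many times before committing to a selection.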
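
The feedback point can be made concrete with a toy closed-loop grip controller: the force reading is what drives the correction term, so without it the hand either keeps tightening or stops blind. A hedged sketch, where the cup model, gains, and tolerances are all made up for illustration:

```python
def grip(target_force, read_force, step=0.1, tol=0.05, max_iters=200):
    """Tighten a simulated hand until the sensed contact force
    reaches the target. read_force is the feedback channel; remove
    it and there is nothing to stop at the right pressure."""
    closure = 0.0
    for _ in range(max_iters):
        force = read_force(closure)
        error = target_force - force
        if abs(error) < tol:
            break
        closure += step * error  # proportional correction toward target
    return closure

# toy cup: no force until the fingers reach the surface at closure 0.5,
# then force grows linearly with further squeezing
cup = lambda c: max(0.0, 2.0 * (c - 0.5))
closure = grip(0.3, cup)
print(round(cup(closure), 2))  # settles near the 0.3 target force
```

The proportional loop is the simplest possible stand-in for what sensory cortex feedback would provide; the comment's point is that with motor electrodes alone you only get the `closure +=` half of this loop, not the `read_force` half.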
