
Brain Implants Can Detect What Patients Hear 75

kkleiner writes "A group of 15 patients suffering from either epileptic seizures or brain tumors volunteered to allow scientists to insert electrodes into their brains. After neurosurgeons cut a hole in their skulls, the research team placed 256 electrodes over the temporal lobe, the part of the brain that processes auditory signals. The scientists then played words, one at a time, to the patients while recording brain activity in the temporal lobe. A computer was able to reconstruct the original word 80 to 90 percent of the time."
  • by Foxhoundz ( 2015516 ) on Wednesday February 15, 2012 @06:08PM (#39052761)
    We need more articles like this. It's encouraging progress in neuroscience.
  • by GodfatherofSoul ( 174979 ) on Wednesday February 15, 2012 @06:11PM (#39052805)

    My dreams of a womanspeak translator implant are on the horizon. Now, if they can just develop the technology to translate "Leave me the fuck alone until the game is over." into a more palatable sentence...

    • Hunny, if you wish to talk to me about X during the game, I'll be more then happy to talk to you Y when is on. There you go...
      • Re: (Score:3, Funny)

        by dreamchaser ( 49529 )

        Hunny, if you wish to talk to me about X during the game, I'll be more then happy to talk to you Y when is on.

        There you go...

        You obviously haven't studied enough Chaos Theory to think womanspeak breaks down into anything logical.

        • Hunny, if you wish to talk to me about X during the game, I'll be more then happy to talk to you Y when is on.

          There you go...

          You obviously haven't studied enough Chaos Theory to think womanspeak breaks down into anything logical.

          He sounds fairly illogical to me, but that could just be the bad syntax on his part.

    • by Anonymous Coward

      "Leave me the fuck alone until the game is over."

      That's what I tell my girlfriend during our weekly Pathfinder game!

      Then she kills my character, 'cause she's the GM. :-(

    • dude, you're meant to be doing man things, like reaching high objects, changing tyres, and opening stuck jar lids.

  • Tech Support (Score:5, Insightful)

    by swsuehr ( 612400 ) on Wednesday February 15, 2012 @06:12PM (#39052823) Homepage
    Earlier in my career, when I had to do level 1 tech support, I might have liked the opportunity to cut holes in skulls to make sure people heard what was being said. However, *hearing* what's being said and actually processing that into meaningful and actionable instructions are two different things.
    • mod parent up. When you speak sense to certain folks, they hear crazy talk. And, conversely, crazy talk (e.g., anything said by pRick Santorum) makes sense to them.

      • Indeed. Such as listening to their brains fry when you ask them this:

        So if you believe that the government shouldn't be involved in religion at all...you obviously believe that polygamy should be legal right Mr. Romney?
  • by idontgno ( 624372 ) on Wednesday February 15, 2012 @06:19PM (#39052949) Journal

    but the first thing I thought of when I read that scientists can now detect what is being heard is: "I wonder how long before the copyright police make this implant technology mandatory in order to catch unlicensed listening?"

  • Soon everyone will be forced to have such an implant, so that people can be properly billed for every piece of music they hear.

  • That the electrical signals received by the brain from the ear would actually directly correspond to the actual soundwaves received by the ear...

    I'm sorry... but in what way is this any more revolutionary in discovery than the telephone?

    • by PCM2 ( 4486 ) on Wednesday February 15, 2012 @06:46PM (#39053253) Homepage

      That the electrical signals received by the brain from the ear would actually directly correspond to the actual soundwaves received by the ear...

      I'm sorry... but in what way is this any more revolutionary in discovery than the telephone?

      It's brain research. Plain and simple.

      They already have devices that can translate the sound waves received by the ear into electrical impulses that are sent directly to the auditory nerves to be interpreted by the brain. They're called cochlear implants.

      This, on the other hand, is reading how the other end of the line interprets the impulses -- what happens within the brain when the electrical impulses are received. We still don't know all that much about how the brain really works. But when you can read changes in the brain with sufficient fidelity to be able to deduce what word the brain is thinking about, you can be pretty sure your hunch about how the brain works is correct.

  • by Anonymous Coward

    From the PLoS paper: "To assess decoding accuracy, the reconstructed spectrogram is compared to the spectrogram of the original acoustic waveform". The number of words used in the experiment was 47 and the reconstructed spectrogram was compared to the 47 spectrograms of acoustic waveforms used in the experiment. That could be done with 90% accuracy. If the input waveform would have been unknown, the reconstruction would not aid at all in knowing what word the patient listened to.

    • If the input waveform would have been unknown, the reconstruction would not aid at all in knowing what word the patient listened to.

      Why do you say "not aid at all"? Audio can be reconstructed from spectrograms, and computer matching algorithms can match spectrograms to words (that's basically how they usually do speech recognition). So while accuracy might not have been quite 90% on an open set of unknown words, the procedure would still aid at least a little.
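
      The matching procedure the two posters are debating can be sketched in a few lines. This is a hypothetical illustration of closed-set identification, not the paper's actual pipeline: a reconstructed spectrogram is scored against each of the 47 candidate word spectrograms, and the highest correlation wins. The shapes, the noise level, and the use of Pearson correlation as the similarity metric are all assumptions for the sake of the sketch.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      def correlate(a, b):
          """Pearson correlation between two flattened spectrograms."""
          a, b = a.ravel(), b.ravel()
          a = a - a.mean()
          b = b - b.mean()
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      def identify(reconstruction, candidates):
          """Index of the candidate spectrogram best matching the reconstruction."""
          scores = [correlate(reconstruction, c) for c in candidates]
          return int(np.argmax(scores))

      # 47 stand-in candidate spectrograms (frequency bins x time frames)
      candidates = [rng.random((32, 50)) for _ in range(47)]
      # a noisy "reconstruction" of candidate word 12
      reconstruction = candidates[12] + 0.3 * rng.random((32, 50))
      print(identify(reconstruction, candidates))
      ```

      The point of the AC's objection is visible here: the classifier only ever answers with one of the 47 known candidates, so "90% accuracy" on this closed set says nothing directly about open-vocabulary recognition.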

  • Because a microphone that is on a person's body is going to pick up everything that person hears as well.

    And for that matter, it will probably be loads more reliable than trying to decode electrical signals that we are only just beginning to comprehend.

    • by alphamax ( 1176593 ) on Wednesday February 15, 2012 @06:36PM (#39053161)

      Because a microphone that is on a person's body is going to pick up everything that person hears as well.

      And for that matter, it will probably be loads more reliable than trying to decode electrical signals that we are only just beginning to comprehend.

      Experiments such as this one are the reason we are beginning to comprehend the electrical signals in the brain. The goal of the experiment isn't to understand WHAT the patients are hearing, but HOW the patients are hearing.

      • by mark-t ( 151149 )
        Hey, I'm not dissing finding out more about how the brain works.... but being able to detect what a person hears is really no more complicated than putting a microphone near that person.
        • Being able to reconstruct the sound from analyzing electrical signals in the brain, however?

          This could be groundbreaking research. Especially for hearing implants.

          • by mark-t ( 151149 )
            Well, if they are hearing the sound at the time, then no... I'm afraid I don't think that's that big a deal. The electrical signals transmitted from the eardrum to the brain would naturally have a pretty tight correspondence with the sound waves received, and I would naturally expect that electrical activity in the brain corresponding to regions associated with hearing would be similarly correspondent. The breakthrough will happen when they can construct the sound directly from what they are thinking, and n
            • by Z34107 ( 925136 )

              A pity the authors neglected to cite your previous neurological research. Clearly you've got this all figured out.

            • The electrical signals transmitted from the eardrum to the brain would naturally have a pretty tight correspondence with the sound waves received

              Near the ear (i.e., auditory nerve), that would be mostly correct. For low frequencies (<2 kHz), the summed impulses on the auditory nerve look very much like the acoustic signal. For high frequencies, they look only like the overall envelope of the acoustic signal, but that's still pretty good and good enough to figure out most speech.

              and I would naturally expect that electrical activity in the brain corresponding to regions associated with hearing would be similarly correspondent.

              Mostly incorrect. Neurons in cortex fire very slowly (on the order of Hz), often even once per "signal" (loosely defined, discrete chunk of audio). The pathway goes audito
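
              The envelope-versus-fine-structure distinction in the post above can be demonstrated with a toy signal. This is an illustrative sketch, not a model of the auditory nerve: a simple rectify-and-smooth envelope estimator is applied to a high-frequency carrier whose amplitude is modulated at a speech-like rate. The 8 kHz sample rate, 3 kHz carrier, and 10 ms smoothing window are assumptions chosen for the demo.

              ```python
              import numpy as np

              fs = 8000  # sample rate (Hz)
              t = np.arange(0, 0.5, 1 / fs)

              # 3 kHz carrier, amplitude-modulated at 4 Hz (roughly a syllable rate)
              envelope_true = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))
              signal = envelope_true * np.sin(2 * np.pi * 3000 * t)

              # half-wave rectify, then smooth with a 10 ms moving average
              rectified = np.maximum(signal, 0.0)
              n = int(0.010 * fs)
              envelope_est = np.convolve(rectified, np.ones(n) / n, mode="same")

              # the estimate tracks the true envelope (up to a scale factor),
              # even though the 3 kHz fine structure has been smoothed away
              corr = np.corrcoef(envelope_est, envelope_true)[0, 1]
              print(round(corr, 2))
              ```

              The fine structure of the 3 kHz carrier is lost after smoothing, yet the slow envelope survives almost perfectly, which is the sense in which an envelope code is "still pretty good and good enough to figure out most speech."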

        • Oh, I follow you now..."Brain implants can detect what patients hear"...so do microphones, clever :)
        • by Anonymous Coward

          Can't see the forest for the trees.

          Summary seems to be hung up on the reconstructed words aspect, so not entirely your fault. Still, go back and reread what alphamax posted.

          "The goal of the experiment isn't to understand WHAT the patients are hearing, but HOW the patients are hearing."

          The experimenters already knew what the people were hearing, they were playing back prerecorded words. More formally, the question is "How does the human auditory system extract perceptually relevant acoustic features of speec

        • Hey, I'm not dissing finding out more about how the brain works....

          Really? After reading your other two posts it sounds an awful lot like you are.

    • Ugh, no. For one thing, your brain is similarly activated when you THINK about words as when you hear them, so this kind of work demonstrates the possibility of a brain/computer interface.

      It should also be possible to prompt you (or force you) to think about the word (either its meaning, or to hallucinate the sound, depending on where you link in) by injecting a current, enabling a more direct link to an external memory, such as the Internet.

      The HUGE catch in all this is it requires an inter-cranial el

      • Sorry, "intra-cranial," i.e. inside your skull. Otherwise the sensing is much more limited (EEG).
      • by mark-t ( 151149 )

        For one thing, your brain is similarly activated when you THINK about words as when you hear them

        While this has been postulated before, to the best of my knowledge, this premise has yet to be conclusively proven.

        Of course, being able to determine what words people are thinking of is a *HUGE* deal... and one that has almost frightening consequences.

  • Why? They can't afford a 16-bit microcontroller?

  • by Crypto Gnome ( 651401 ) on Wednesday February 15, 2012 @08:28PM (#39054413) Homepage Journal
    Breast Implants Can hear What you're Thinking!
    • Breast Implants Can hear What you're Thinking!

      Utter (udder?) nonsense. "Nice tits" was just a lucky guess.

  • by Un pobre guey ( 593801 ) on Wednesday February 15, 2012 @08:40PM (#39054525) Homepage
    Eventually, holes will not be necessary [wikipedia.org]. Better SQUIDs [wikipedia.org] + CUDA [nvidia.com] = mind reading from a distance. By 2020 or 2030 at the latest, I would conjecture.
  • I was contemplating following this line of research, but it seems that these guys got there before me. Must have read my mind!

  • Cowboys didn't get into simstim, he thought, because it was basically a meat toy. He knew that ... the cyberspace matrix was actually a drastic simplification of the human sensorium, at least in terms of presentation, but simstim itself struck him as a gratuitous multiplication of flesh input.
