
Neuroscientists At MIT Developing DNI

Posted by Zonk
from the jack-in-and-tune-out dept.
coolphysco1010 wrote to discuss the possible development of a direct neural interface, à la 'The Matrix', that could eventually allow for instant object recognition. From the article: "Now, neuroscientists in the McGovern Institute at MIT have been able to decipher a part of the code involved in recognizing visual objects. Practically speaking, computer algorithms used in artificial vision systems might benefit from mimicking these newly uncovered codes ... In a fraction of a second, visual input about an object runs from the retina through increasingly higher levels of the visual stream, continuously reformatting the information until it reaches the highest purely visual level, the inferotemporal (IT) cortex. The IT cortex identifies and categorizes the object and sends that information to other brain regions."
  • by theufo (575732) on Saturday November 12, 2005 @06:31AM (#14014706) Homepage
    for adult entertainment.
  • by Mecdemort (930017) on Saturday November 12, 2005 @06:37AM (#14014716)
    I'm going to be first in line for the new computer interface brain implants. Hopefully they don't run windows.
    • Consider me second in line. I can't tell you how much of an improvement for the species this will be. Better yet, the blind may finally have a hope of receiving ACTUAL replacement vision, not a poor substitute. :D
    • As a nerd I am often on the cusp of technology and I am generally not afraid of picking up first generation products. However I think I can confidently say that when it comes to brain implants I'm going to wait a bit for the dust to settle before taking the plunge.
      • I'm thinking of a Walter Jon Williams novel, 'Hardwired'. The protagonist has cyber-eyes, and bought them just when the company had gotten most of the bugs out, but while they were still trying unusual features like sepia-tone overlays, film noir settings, and the like to see where consumer interest lay. They then dropped all these features to make cheaper, more basic designs they could sell to the broadest possible market. I'm with Alexander Pope on this one:

        "Be not the first by whom the new are tried -
        Nor yet the last to lay the old aside."
    • In brain-implanted america, windows runs YOU!
      =\
    • by Anonymous Coward
      They don't work well. I was involved in artificial vision implant work a few years ago, and the current spread from the electrodes is too large to stimulate the small numbers of neurons necessary to give anything other than a "light blob". The lab put a grid of 5x5 electrodes in a human eye and the subject reported what they could perceive from current applied to it; the surgery was only done on people about to have their eye removed anyway for medical reasons, so there was little risk of hurting a good eye.

      If
      • My girlfriend has a brain implant to treat her dystonia: a timing device in her chest with a wire running up her neck and into her skull and brain, delivering direct stimulation to certain neurons and reducing involuntary muscle contractions.
      • Surprisingly, the implants for artificial hearing work very well: having the auditory nerve laid out, low frequency to high frequency, along the bony tube of the cochlea helps localize the current to just the nerves you want to hit with each electrode.

        Does that work if the nerve has died?

    • Imagine the possibilities: now even the slashdot crowd can get girls, just hack her brain ;-)
  • Poor Monkeys (Score:3, Insightful)

    by el americano (799629) on Saturday November 12, 2005 @06:44AM (#14014725) Homepage
    I don't want to see the results when they start trying to recreate those neural patterns in the monkeys' brains. Honestly, to say that observing these kinds of patterns brings us any closer to injecting images directly into the brain, when we have so little technology to do that (knives and chemicals, basically), is ludicrous. I suppose the writer, rather than the scientists, can probably take all the credit for that exaggeration.
    • Well, I have first hand experience from experimenting on a device that makes you feel motion [slashdot.org] that had me seeing flashes of light when I turned the levels all the way up. Admittedly that's not injecting images into my visual cortex, but other than being hit in the head REALLY hard, I've never seen such a thing before.

      Considering that they've been injecting audio into our brains [google.com](yes, I have one of those too) for ages now I don't see that they have much choice but to finish developing the technologies.

        • Not into your brain. It says here [neurophone.com] that it stimulates "a tiny organ in the inner ear" that is sensitive to ultrasonic sound. I think it would be interesting to try to stimulate nerve endings directly, whether for sight or sound, but that was not this experiment.

        I also find it interesting that they call it an ultrasonic neural stimulation instrument for brain entrainment(?). Was there some reason they can't call it a hearing aid or artificial hearing? That would be the best way to sell it, so I can only
    • Remarkably, the classifier found that just a split second's worth of the neural signal contained specific enough information to identify and categorize the object, even at positions and sizes the classifier had not previously "seen."

      So unknown objects and known objects produce similar neural-signal patterns in monkey brains. Finding repeatable neural patterns, I think, is the key to marketable products.

    • Why wouldn't they use electricity? Electricity can stimulate neurons...
  • Chair (Score:3, Funny)

    by GloomE (695185) on Saturday November 12, 2005 @06:45AM (#14014727) Homepage
    I'm looking to purchase a dentist chair.
    Hole in the headrest preferable.
  • It's about time! (Score:3, Insightful)

    by Anonymous Coward on Saturday November 12, 2005 @06:54AM (#14014746)
    The implications for using this technology to cure blindness (one day, obviously not immediately) are wonderful! This is the kind of thing science was really meant for - helping humans live better lives. Kudos to MIT!
    • Re:It's about time! (Score:2, Informative)

      by Yvanhoe (564877)
      Immediately, a large range of blindness can be cured by implants, either by putting a CCD array inside the retina or, in case of damage to the optic nerve, by wiring a camera to the visual cortex. Right now, some blind people see (with a low-res, b&w image, but see nonetheless) thanks to implants.

      link [seeingwithsound.com]
      other link [brown.edu]

      But yes, with the technology presented in the article, I suppose one could even cure blind people who have a damaged visual cortex.
    • yeah, may be after we finally create those damn nanobots :P
    • Re:It's about time! (Score:5, Informative)

      by Anonymous Coward on Saturday November 12, 2005 @11:13AM (#14015291)
      Kudos to MIT!

      Every visual neuroscientist, ever, has been working on "deciphering part of the code involved in recognizing visual objects." Poggio and DiCarlo's contribution is mostly that they were able to record from a large number of neurons simultaneously in the inferotemporal cortex (IT). It's a logical (but interesting, to be sure) progression of work that has been done for decades in IT -- most of that work done elsewhere.

      Neural prosthetics and DNI are the bullshit that people trot out to make neuroscience interesting to the public. It's worth pointing out that neither of the actual named scientists in this work raise the possibility, and in fact, other than the abstract, there's nothing that even hints at the idea. These guys aren't working on a DNI. They're doing basic science. Years, decades down the road, some engineers might take the work that built on Poggio and DiCarlo's work and turn it into a DNI. Or at least, we can so hope.

      Name a university, and I can guarantee that the odds are that they'll have some basic science research underway with as much potential for the betterment of society as this stuff. So when you say "kudos to MIT" like this, remember that you're praising their PR department, not their scientists.
      • Mod parent up, since its parent is rather misleading for the uninformed. I would like to add that most of the knowledge on visual pathways comes from live monkey research, and is only partially known to translate to human vision.
    • That is, of course, until the sex industry comes along... Using it too much might *cause* blindness! (Disclaimer: May or may not be scientifically accurate.)
  • Matrix? (Score:4, Interesting)

    by Auckerman (223266) on Saturday November 12, 2005 @07:07AM (#14014762)
    The article reads more like they are reverse engineering pattern recognition systems as the brain sees and interprets objects, which sounds closer to the movie Brainstorm [imdb.com].
    • Re:Matrix? (Score:2, Informative)

      by Tune (17738)
      Or Strange Days [imdb.com], which could be considered a (lose) remake of Brainstorm.

      Well, actually the article focuses on intercepting the sensory data and making sense of it. I believe scientists have for some time been able to make sense of the basic sensory data; stuff like using a cat's eye to produce webcam-quality images. This research seems directed at interpreting the signals at a much deeper level.

      Though very interesting, it's still a one-way extraction process (ie. *not* synthesis) which is just completel
      • Re:Matrix? (Score:2, Offtopic)

        by bsartist (550317)
        Or Strange Days, which could be considered a (lose) remake of Brainstorm.

        Sigh. Loose is the opposite of tight. Lose is the opposite of win. What's so damn difficult about this?
        • Re:Matrix? (Score:3, Funny)

          by Haydn Fenton (752330)
          Sigh.
          Looser is what grammar and spelling nazi's should be. Loser is what grammar and spelling nazi's are.
          What's so damn bad about someone making a mistake so minor that anybody with an IQ higher than a banana can still understand?
          ;-)
          • What's so damn bad about someone making a mistake so minor that anybody with an IQ higher than a banana can still understand?

            It's jarring and shows a lack of care, that's what. If you can't be bothered to spell right, what does that say about the content of your words?

  • No 12 monkeys (Score:4, Informative)

    by noc_man (917321) on Saturday November 12, 2005 @07:10AM (#14014766)
    I read an article many years ago about them doing this to live human patients. Via a fiber-cable brain wetware implant, a blind man was able to discern colors and rudimentary objects. He did have a short seizure during the interview; however, once the subject got past that he immediately requested that the researchers continue.
    Unfortunately this was so long ago I cannot remember the magazine or relocate the article. But googling artificial vision shows a few parts of history and HOWSTUFFWORKS has a full set of details

    http://health.howstuffworks.com/artificial-vision.htm [howstuffworks.com]
    • Re:No 12 monkeys (Score:3, Interesting)

      by Timeburn (19302)
      IIRC, it was in Wired, circa 1999 or 2000. The article covered research in South America (banned in the US), on a patient who had lost his vision, but whose optic nerve was intact. They interfaced directly into the nerve, stimulating it manually at first (this is when the seizure occurred).

      The project was apparently quite successful, as the patient was able to move about the facility, pick up a phone from a desk, and even drive a car around the parking lot. Fairly low-res input, but enough to see s
    • Re:No 12 monkeys (Score:4, Informative)

      by groomed (202061) on Saturday November 12, 2005 @08:12AM (#14014855)
      The Vision Quest [wired.com] in Wired 10.09 of September 2002.

      The article, as well as the feasibility of the research of Dr. Dobelle (who died in 2004), is sketchy at best. Apply a truckload of salt.
  • Fabulous (Score:2, Funny)

    by Centurix (249778)
    Maybe they could simulate the feeling of taking a really great dump.
    • Re:Fabulous (Score:1, Funny)

      by Anonymous Coward
      Looks like some fool marked you as troll, but I understand. I took a dump tonight that made my fucking legs ache. Now that's reality!
    • There's nothing as over-rated as bad sex,
      And there's nothing as under-rated as a good dump.
  • Just recordings (Score:3, Interesting)

    by venicebeach (702856) on Saturday November 12, 2005 @07:15AM (#14014772) Homepage Journal
    Seems to me they are just recording from IT neurons. There's no input to the cortex. I haven't read the science paper (is it out yet?) but it really seems like they are just analyzing the firing patterns of IT neurons while the monkey looks at objects. Nothing new here technology-wise.
    • Re:Just recordings (Score:3, Insightful)

      by venicebeach (702856)
      OK I just read the Science article. What's interesting about it is that they got recordings from a large population of neurons in IT during object recognition and have some cool analyses of the kinds of information that can be extracted from the capture, e.g. how large a population of neurons you need to accurately identify the object, how well the neurons discriminated among the categories and generalized across the same image at different sizes and positions, etc.

      Important to remember that these mon
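The population-readout idea in this comment (how many IT neurons you need before a classifier can identify the object) can be sketched in a few lines. This is strictly a toy illustration with synthetic firing rates and a nearest-centroid decoder; the category count, neuron count, noise levels, and decoder choice are my own assumptions, not the methods of the Science paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 object categories, 50 "neurons", 20 trials each.
# Each category evokes a characteristic mean firing-rate pattern; trials
# are noisy samples around it -- a stand-in for an IT population recording.
n_neurons, n_trials, categories = 50, 20, 3
means = rng.normal(10, 3, size=(categories, n_neurons))  # tuning per category

def simulate(cat):
    """One noisy population firing-rate vector for category `cat`."""
    return means[cat] + rng.normal(0, 1, n_neurons)

# "Train" a nearest-centroid decoder on one batch of trials per category...
centroids = {c: np.mean([simulate(c) for _ in range(n_trials)], axis=0)
             for c in range(categories)}

def decode(rates):
    """Assign a population response to the closest training centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(rates - centroids[c]))

# ...then test it on fresh noisy trials.
correct = sum(decode(simulate(c)) == c
              for c in range(categories) for _ in range(n_trials))
accuracy = correct / (categories * n_trials)
print(f"decoding accuracy: {accuracy:.2f}")
```

With well-separated tuning and modest noise the readout is essentially perfect, which is the same qualitative point the comment makes: a surprisingly small population already carries enough information to categorize the object.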
      • Re:Just recordings (Score:5, Informative)

        by FleaPlus (6935) on Saturday November 12, 2005 @07:47AM (#14014816) Journal
        This reminded me of the research by Quian Quiroga et al in which they performed single-neuron recordings from MTL (upstream of IT, if I recall correctly) in humans. In that study they found neurons which would respond selectively to particular objects, such as Jennifer Aniston, Halle Berry, and the Sydney Opera House. Here's the abstract:

        R. Quian Quiroga, L. Reddy, G. Kreiman, C. Koch & I. Fried, Invariant visual representation by single neurons in the human brain. [caltech.edu] Nature (2005) 435, 1102-1107

        It takes a fraction of a second to recognize a person or an object even when seen under strikingly different conditions. How such a robust, high-level representation is achieved by neurons in the human brain is still unclear. In monkeys, neurons in the upper stages of the ventral visual pathway respond to complex images such as faces and objects and show some degree of invariance to metric properties such as the stimulus size, position and viewing angle. We have previously shown that neurons in the human medial temporal lobe (MTL) fire selectively to images of faces, animals, objects or scenes. Here we report on a remarkable subset of MTL neurons that are selectively activated by strikingly different pictures of given individuals, landmarks or objects and in some cases even by letter strings with their names. These results suggest an invariant, sparse and explicit code, which might be important in the transformation of complex visual percepts into long-term and more abstract memories.
    • At the end of the article:

      It was quite surprising that so few IT neurons (several hundred out of millions) for such a short period of time contained so much precise information.

      That *is* an interesting result, since (computer) neural net research generally tends to favour designs with complete overkill in the number of neurons.

      "If we could record a larger population of neurons simultaneously, we might find even more robust codes hidden in the neural patterns and extract even fuller information," Poggio


    • We already have something called transcranial magnetic stimulation. See:

      http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1300793 [ieee.org]
      http://groups.csail.mit.edu/vision/medical-vision/surgery/tms.html [mit.edu] -- most relevant to discussion, has section on visual signal injection
      http://www.biomag.hus.fi/tms/ [biomag.hus.fi]
      http://www.mp.uni-tuebingen.de/mp/index.php?id=94 [uni-tuebingen.de]
      http://en.wikipedia.org/wiki/Transcranial_magnetic_stimulation [wikipedia.org]
      http://pni.unibe.ch/TMS.htm [unibe.ch]
  • I just finished reading "Neuromancer" (for about the 10,000th time) and so this seems pretty cool. But I've always wondered how they deal with the potential for allergies... from what I understand, in theory you can become allergic to basically anything at any time without warning, so you wouldn't want to get a fancy new implant only to die of anaphylactic shock while watching porn....
    • Let's hope that I don't become allergic to stupid comments, because then I would die of anaphylactic shock reading slashdot.
    • from what I understand, in theory you can become allergic to basically anything at anytime without warning

      You understand wrong. Medical implants are generally constructed of inert material that the body doesn't care about. I myself have a piece of titanium in my neck - it won't be causing a reaction anytime soon.

  • by Xochi77 (629021) on Saturday November 12, 2005 @07:43AM (#14014812) Homepage
    i am not at MIT, but I can tell you this ain't about to happen any time soon.

    i am working on optical neuron-computer interfaces, and this is probably the most efficient and direct route for reading neurons. I know of researchers who can also stimulate neurons to fire via light, so in principle, we could build complete neuro-optical computers tomorrow... if neurons were not complete bastards to work with.

    you see, they just don't like to stay in place. where i research, they often build tiny fences to keep them in place, but even then, they go shooting their axons anywhere they feel, with no concern for the feelings of the researcher.

    we also grow neurons on microchip surfaces, which allows for high-speed, high-resolution stimulation and reading of single-neuron activity, but only in two dimensions, which is excellent for retinas etc.

    but the neuron-chip or old-fashioned neuron-electrode interfaces are hard to place, and optical reading of neurons still has bugs to sort out (I'd guess 4-10 years more basic research). whenever you see those cool brainscan pics from MRI etc., remember their resolution is on the order of millimeters, and that's a lot of complexity lost.

    http://www.biochem.mpg.de/mnphys/ [biochem.mpg.de] has a nice review of the problems involved, if you like hardcore solid-state chemistry, silicon physics, and neurobiology
  • OMFG (Score:3, Funny)

    by russint (793669) on Saturday November 12, 2005 @07:57AM (#14014830) Homepage
    this could make LSD obsolete!
    • I know you're joking, but in most dystopian sci-fi I've seen or read, the future "drug" of choice is some kind of technological device. Especially in the anime (nerd alert!) Serial Experiments Lain.
      • Ahhh, and the device he is referring to is of course... Accela!

        According to wikipedia
        Not a chemical drug, but rather a nanomachine delivered in pill form. It increases the processing speed of the brain, making its user feel that time has slowed down.
    • At first I read this and was nodding in agreement, "Yes, I can see that if they develop this, we'll have no more need for monitors."

      Then I remembered my college days. (It's early this morning...)

    • by rabel (531545)
      Yeah, but have you used your direct neural interface, ON WEED?
  • in the next few days. The SFN just started in Washington.
  • I remember reading a long time (at least 10 years, perhaps 20?) ago about direct stimulation of the visual cortex;
    now at the time they were just doing a few blobs intended to help the blind.

    This looks like it is moving a bit further up the chain, which should be interesting;
    in the end it is just vision, however.

  • What if... (Score:3, Interesting)

    by nsasch (827844) on Saturday November 12, 2005 @08:52AM (#14014918)
    What if the IT cortex were bypassed: the computer would get or simulate the input, recognize and categorize the object, and send that data directly to the other parts of the brain. Now the human doesn't see the ball, but knows there's a ball in front of them, and that it's red, about the size of their head, etc. (all the details), but doesn't see it, just has a "feeling" that a ball is there.
    • That's a really interesting idea - bypass the senses and go directly to the meaning portions of the brain. I wonder if you'd be able to tell your own conclusions from what was being fed to you?

      This kind of stuff both excites and scares me. Whenever I hear about electronics/brain interaction (like the story about monkeys moving robotic arms using their brains [bbc.co.uk]), I think of the cyborg possibilities. The most interesting one to me is the ability to supplement your own faulty memory with a hard drive and your
      • A few other possibilities:

        - Sonar-vision; I think seeing in 3D would help us understand the world better.
        - Telepathy, if you get the input/output working and hook up an antenna. Watch out for spam, though.
        - Computer-aided augmentation, helping the brain with functions it's not very good at, like calculating with large numbers.
        - Image streaming from your eyes: photography, movies, like in one of William Gibson's books, I think it was Mona Lisa Overdrive; you wouldn't need to be an artist to recreate an image you have
      • by Kjella (173770)
        The most interesting one to me is the ability to supplement your own faulty memory with a hard drive and your own thinking power with a processor. You'd take a little snapshot of every person you met and file it away with their name, never to be forgotten. Think about what school would do for you! If I could remember all the science, history and literature I've been taught and forgotten, I'd be a much more educated guy than I am now.

        Not to go all Trinity on you, but why limit it to your own experiences? Bas
      • I wonder if you'd be able to tell your own conclusions from what was being fed to you?

        Given the number of people today who can't do this, I fail to see your point.

    • Re:What if... (Score:2, Insightful)

      by kko (472548)
      Yeah, cool, well, actually getting any sort of input to the brain seems to be a big part of, if not THE actual problem.

      Jeez.
  • But what if we're already in the matrix.. that would be a direct neural interface inside a direct neural interface... Talk about a mind bender.. or should I say 'spoon bender'?
    • But what if we're already in the matrix.. that would be a direct neural interface inside a direct neural interface... Talk about a mind bender.. or should I say 'spoon bender'?

      Or more likely just Bender: "We're boned!"

  • I discussed this with Peter Donaldson of the Neurological Prosthesis Unit in South London. It's like packet switching: the British invent it but don't fund it, and the Americans take it on, fund it, and get all the money and glory. Oh well.
  • One of the original Memory Stick ads had a guy with a card slot in the back of his head and a Stick about to be placed in it. At the time, I thought Sony was subconsciously telegraphing where they'd like all of this to go. These days, I'm certain. What's really scary is that the sheeple will go for "Neural Rights Management" if it means they get to watch Survivor 15.
  • Disclaimer: I am currently drunk, so the following comments may seem a little more disjointed than usual.

    I remember when I was playing Shadowrun, and delved into Cyberpunk 2020, and loving the idea of having a character who could directly interface to a computer - in Shadowrun it was via a "datajack", located directly behind the ear and mounted in the hard skull tissue for maximum anchorage.

    The idea is not new. I remember reading about a guy called "Jerry" who'd had a special series of wires - I th

  • Boon for Camouflage (Score:2, Interesting)

    by schwit1 (797399)
    Could you test potential camouflage patterns with this and find which cause the most difficulty in visually deciphering? Or one day have computers generate camouflage on the fly based upon the surroundings.
  • Code Talkers (Score:3, Interesting)

    by Doc Ruby (173196) on Saturday November 12, 2005 @10:55AM (#14015225) Homepage Journal
    It's not a "code". There's no objective reality that the brain is decoding for mere "referential integrity". The brain is organizing its responses to incoming sensory info, in a feedback loop with itself, including resonating "memory" response signals. Sure, object representations are recognized as repeats of previous object representations, and dispatched to brain areas sensitized to those representations. But it's not like objects outside the body have standard codes, the same from person to person, like say insulin has in our DNA. That would be way too static for us to survive in this changeable world. We're making it up as we go along, and living in the reality we generate. The closer our mind's model matches the world we encounter, the smarter we are.
  • is not my friend.

    The world is already a giant hologram where you can do or undo whatever you feel like.

    Plugging your head into an artificial world is like wanting to play space-invaders on a simulated computer interface inside a game of Quake. No thank-you. We already have the perfect interface out here where the graphics and sound are of the highest quality and there is no chumpy, 'Save' button to make things boring. And there are plenty of cheats keys in the structure of reality if you have the courage
  • by Anonymous Coward
    I'm getting a bit tired of MIT getting press for research that has already been done years ago. In this case in particular, see the Dobelle Institute: here [cbsnews.com], here [erc-assoc.org], and here [artificialvision.com], for instance.

    Seriously. Don't exacerbate the inflated delusions of these guys by pretending that their research is unique or "cutting-edge". Expect more of them.
  • it's not DNI at all (Score:2, Informative)

    by darkeye (199616)
    as they don't actually connect to the neurons, but read the neuron activity patterns, probably through fast MRI scanners. And there's no feedback either - they don't send any data to the neurons (other than through the natural eye of the monkey in the tests)
  • I'm not really as concerned with the implications of being able to inject images into the human brain, while that may be somewhat useful. It's likely the visual cortex may have many subtle differences as well between human and chimpanzee brains--- so this is likely to be a much more difficult set of technology to translate for human use. What's interesting about this is the fact they're claiming that an incredibly complicated set of algorithms, that have been evolved over billions of years in our brains, ca
  • Wow this reminds me of a book I read called Nanotime. Pretty killer read, too. I just hope we don't have to be fully awake when they insert this into our brains, like in the book!
  • Almost identical information is located in one of my college psychology text books from 2002. The book is called Sensation and Perception, but I'm sure that just about any textbook on the subject of neurophysiology would cover this. The research involved is probably a few years older than the book.

    Recording the activity of hundreds of IT neurons produced a large database of IT neural patterns generated in response to each object under many different conditions.

    This translates to a cortical probe(maybe sub

  • Could parallel research and development enable recognition of audio 'objects' as well? It might help open the stubborn door to voice recognition.

    In the short term, I suspect there would be more immediate applications for voice recognition than for visual object recognition, though I am still pulling for these guys if it leads to cars driving themselves.

  • ...but on human subjects using fMRI. This research really isn't related to the matrix or DNI's directly, it's about seeing whether or not electrical signals from the brain contain enough information for a classifier (ironically, in our case, artificial neural networks) to distinguish between some subjective cognitive state.

    Considering the progress we've made in distinguishing cognitive states (is this person looking at a face, a house, a squirrel, etc?) in human subjects using fMRI (an extremely noisy da
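The classifier approach this commenter describes (deciding "face or house?" from a noisy multivoxel pattern) can be sketched with synthetic data. Everything below is an illustrative assumption, not the actual fMRI analysis pipeline: made-up "voxel" activations, labels generated from a hidden weight vector, and a plain logistic-regression decoder trained by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: classify "face" vs "house" trials from noisy voxel
# activation patterns -- a toy stand-in for multivoxel pattern analysis.
n_voxels, n_trials = 100, 200
w_true = rng.normal(size=n_voxels)           # which voxels carry the signal
X = rng.normal(size=(n_trials, n_voxels))    # per-trial activation patterns
y = (X @ w_true + rng.normal(0, 2, n_trials) > 0).astype(float)  # noisy labels

# Logistic-regression decoder fitted by plain gradient ascent on the
# log-likelihood (no regularization, fixed step size -- deliberately minimal).
w = np.zeros(n_voxels)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))           # predicted P(label = "face")
    w += 0.1 * X.T @ (y - p) / n_trials      # log-likelihood gradient step

train_acc = np.mean(((X @ w) > 0) == y)
print(f"training accuracy: {train_acc:.2f}")
```

The point mirrors the comment: even with substantial noise, a simple linear decoder recovers the hidden state well above chance, which is why the "can we read cognitive states?" question reduces to how much information the measured signal actually contains.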

  • Holy crap! Nerddom just took a giant step towards the Sci-Fi fantasy of experiencing actual sex by playing back some factory worker dude's recorded memories of banging that cheerleader you were always staring at!
  • having a direct neural interface connector implanted in the back of our heads will bring new meaning to the term "jack off".
  • "Possible development"
    "could eventually allow"
    "computer algorithms"

    A computer cannot possibly process information the way the brain does.
    Any DNI is a transducer and translator, because the two things operate very differently. Why the hell try to copy how the brain does it on a machine that can't? Try to improve on the process by making the computer do what it does best instead.

    Object recognition via similarity calculation has been available in holographic storage devices. It is inherent in their operation
