Biotech Medicine Science

The Blind Shall See Again, But When? 226

An anonymous reader writes "Restoring hearing with cochlear implants that replace the inner ear with an electronic version has become standard procedure for many types of deafness. Now it looks like the same thing might happen for many types of blindness. Developed with five national labs funded by the Department of Energy, this third-generation artificial retina promises to enable the blind to see again soon. It has already been successful in over a dozen test patients, but at resolutions too low for doing much more than proving the concept. However, if the DoE can perfect this larger version of an artificial retina, then the company Second Sight promises to commercialize the implant, aiming for VGA resolution within the decade."
This discussion has been archived. No new comments can be posted.

  • by caffeinemessiah ( 918089 ) on Friday February 19, 2010 @04:37PM (#31203684) Journal
    If they achieve VGA resolution, it's a steady road to full vision for the blind. At this point, I'm more interested in exceeding human abilities. Think of the case of HDR imaging [wikipedia.org] -- most of us don't have monitors that are themselves high dynamic range, so images have to be "tone-mapped" down to the dynamic range of our monitors, which often results in those ridiculously sharp but somewhat "unrealistic" [flickr.com] pictures you see on Flickr.

    It would be cool if, say, the IR spectrum or just more dynamic range in the visible spectrum could be tone-mapped to human perception in this way, resulting in perceptually sharper images by way of a direct retinal implant.
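    For the curious, a minimal Python sketch of the kind of global tone mapping being described (the Reinhard operator; the function name, the "key" constant, and the assumption that hdr is a NumPy array of linear luminance values are illustrative, not anything from TFA):

      import numpy as np

      def tone_map_reinhard(hdr, key=0.18):
          # Log-average luminance of the scene (epsilon avoids log(0)).
          log_avg = np.exp(np.mean(np.log(hdr + 1e-6)))
          # Scale so the scene's log-average sits at the chosen "key" value.
          scaled = key * hdr / log_avg
          # Compress: bright values approach 1.0, dark values stay near-linear.
          return scaled / (1.0 + scaled)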

  • by Tubal-Cain ( 1289912 ) on Friday February 19, 2010 @04:37PM (#31203686) Journal
    Try working on a VGA/DVI/HDMI/DisplayPort/whatever input, too. Bypass monitors altogether.
  • Blindness Sucks (Score:5, Interesting)

    by techsoldaten ( 309296 ) on Friday February 19, 2010 @04:40PM (#31203732) Journal

    My Dad just had a stroke and has no perception on the left side of his body.

    All I have been thinking about for the last month is how to do something like this: set up something that can do motion detection and help him avoid collisions.

    You know, I would go for low resolution versus no resolution right now.

    M
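    (A rough Python sketch of the kind of motion detection described above, using simple frame differencing with OpenCV; the camera index and the threshold values are made-up assumptions, not anything tested for this purpose:)

      import cv2

      cap = cv2.VideoCapture(0)                      # assumed webcam index
      _, prev = cap.read()
      prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          # Pixels that changed a lot between frames indicate motion.
          diff = cv2.absdiff(gray, prev_gray)
          _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
          if cv2.countNonZero(mask) > 5000:          # crude "something is moving" test
              print("motion detected")               # could trigger a beep instead
          prev_gray = gray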

  • by Anonymous Coward on Friday February 19, 2010 @04:40PM (#31203740)

    Why stop at the IR spectrum? Why not go full spectrum? Maybe with a remote control. Make Geordi's visor seem like a toy. How much information can we cram into the visual cortex?

  • by Anonymous Coward on Friday February 19, 2010 @04:40PM (#31203744)

    How often does everybody else stop and say to themselves, "Holy crap. We're living in the future!" I've been doing that at least once a week since the beginning of the year.

  • Re:DoE? (Score:3, Interesting)

    by binarylarry ( 1338699 ) on Friday February 19, 2010 @04:43PM (#31203776)

    It'll be interesting when they start offering bonuses to any military staff who opt in to a "Predator Vision" program.

  • by St.Creed ( 853824 ) on Friday February 19, 2010 @04:44PM (#31203798)

    This reminds me of a small girl we met at the swimming pool (lessons), who had one visible cochlear implant. This girl turned out to have been deaf from birth in both ears. I remarked to her mother that she could actually hear and talk amazingly well - I hadn't noticed anything unusual in her speech. According to the doctors this was nigh impossible, but she had enough input from the 16 nerves to get perfect speech and reasonable hearing. She probably got very lucky with the connections on the nerves. So even with 16 nerves stimulated, this could make a huge difference for someone who's blind, if they happen to hit the right connections.

    Yeah I know - anecdotal evidence and such. Still, I'm happy they get this far already.

    Oh, and I won't be upgrading my retina unless it matches the resolution of my computer display and comes with infrared, zoom and millimeter-wave vision options. Preferably with scrolling 6502 assembly code on the left side as well :P

  • by Anonymous Coward on Friday February 19, 2010 @04:44PM (#31203800)

    It's interesting that these visual implants directly stimulate the retina to send signals to the nervous system, while even the advanced cybernetic limbs such as DARPA's "Proto 2" are still using the kludge of reading electrical signals from muscles. As I understand it, the arm research is meant to eventually hook the limbs up directly to nerves (as has been done successfully, to some extent, with biological hand transplants), but the tech isn't quite there yet.

  • Re:DoE? (Score:1, Interesting)

    by Anonymous Coward on Friday February 19, 2010 @04:49PM (#31203876)

    DoE national labs do quite a bit of basic materials science research. Since the first and second generations were basically implanted solar cells (a silicon substrate that reacted to light), it makes sense.

  • At some point, we should be able to modify perception via EM, so no need for implants. Disrupt the optic nerve and feed it artificial stimulation via a headband or similar, and provide a full immersive view. Ditto the other nerves, and you have immersive, convincing VR complete with non-tactile sensation....

  • by abbynormal brain ( 1637419 ) on Friday February 19, 2010 @04:55PM (#31203976)

    ... in military applications? Robo-cops, emergency responders, and others in similar future roles will most definitely benefit from advanced imaging.
    HUD capabilities as well -- non-disruptive arrows near the periphery of your vision guiding you to the nearest McDonalds when you ask for it. It won't stop there: "apps" for your new vision capabilities will spring up -- a virtual retinal compass, a retinal level (yes, you only need two hands to make sure that picture frame is straight), and the list goes on. Oh, and don't forget the ever-popular pop-ups.

  • by TiggertheMad ( 556308 ) on Friday February 19, 2010 @04:57PM (#31204002) Journal
    My Dad just had a stroke and has no perception on the left side of his body.

    Hmmm, but this isn't really blindness resulting from eye damage is it? It sounds to me like his problem is that the signals coming out of his left eye are being mapped into damaged brain tissue. It sounds like he just needs a new 'optical data input port' installed in his brain.

    It sounds so trivial, doesn't it? Just rerouting a few electrical impulses around a damaged network node...
  • by natehoy ( 1608657 ) on Friday February 19, 2010 @05:16PM (#31204248) Journal

    Actually, quite a lot, as long as we are willing to give up accurate color perception in the spectrum we see now. The human visual system can differentiate, say, ten million colors (a guesstimate). That's across a very small band of the spectrum we could make visible if we chose to. Index the new frequencies to perceived colors and we might only be able to differentiate a few hundred thousand colors in our currently visible spectrum, but we'd also be able to differentiate various frequencies of ultraviolet and infrared light. So, for example, anything in shades of blue represents UV light, anything in shades of red represents IR, and the colors we see today are perceived as little more than shades of grey with a blue or red tint.

    I, for one, would gladly give up the ability to differentiate eggshell from ecru if it meant I could see in the UV and IR spectra, though I strongly suspect the transition would be best done slowly. That much new unfamiliar input introduced all at once might have profoundly unfortunate effects on the human psyche...
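    A toy Python sketch of that remapping idea, squeezing a wider band of the spectrum into the hues we already perceive (the band edges in nanometres are arbitrary choices for illustration, not real sensor limits):

      def remap_wavelength(nm, lo=200.0, hi=1400.0):
          """Map a wavelength in [lo, hi] nm onto the visible band (380-700 nm)."""
          frac = (nm - lo) / (hi - lo)
          return 380.0 + frac * (700.0 - 380.0)

      # 300 nm UV maps to ~407 nm (violet); 1000 nm IR maps to ~593 nm (orange-red).
      for wl in (300, 550, 1000):
          print(wl, "->", round(remap_wavelength(wl), 1), "nm perceived")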

  • by je ne sais quoi ( 987177 ) on Friday February 19, 2010 @05:16PM (#31204250)
    That was my gut reaction too, but I learned something reading the Wikipedia page on cochlear implants [wikipedia.org]:

    If a child is placed into a mainstream setting it makes it difficult for them because they feel like they do not fit in with their peers and cannot fully identify with the Deaf community. One interviewee in the Christiansen and Leigh study states "In high school it was the worst time for me with the cochlear implant because I was really trying to find my identity with the cochlear implant...I never accepted my deafness. And the cochlear implant in some ways showed me that no matter what, the moment I take it off I'm deaf. I'll never be hearing 24 hours." [37]

    I'm not deaf, but I think that there is enough of a community for deaf people that they have a cultural identity of being deaf. By implanting children with the device, they are no longer in that culture, but neither are they a "normal" fully hearing person, even when they have the device plugged in. This may actually lead to lower self-esteem for the child than if they were surrounded by people like them (i.e. deaf). But then again, teenagers or children who don't fit in or feel inadequate for any reason are as common as grass, since schools and children tend to enforce sociological homogeneity; it doesn't matter if you wear thick glasses, are socially maladjusted, or have any other issue that makes you different from the "average" kid.

    As for black people, I think the GP needs to learn a bit about skin tone [google.com] discrimination amongst African Americans and Asians before he starts shooting off about skin lighteners and their evilness. Even Americans of European descent do it; ever hear the term "redneck"? It immediately conjures up a picture of someone who is poorly educated, financially poor, and often overweight.

  • by Anonymous Coward on Friday February 19, 2010 @05:19PM (#31204284)

    What I'm surprised at is that this doesn't appear to have come from military research.
    You'd think something like this would have been high on the priority list of enhancements.
    I guess IR / NV / UV headsets were just a more efficient use of money and time.

    There is that other one that sounded quite promising and wouldn't require surgery, the device that just gets placed on the tongue.
    I believe that one was actually being looked into by the military.

  • by Estanislao Martínez ( 203477 ) on Friday February 19, 2010 @05:30PM (#31204410) Homepage

    The premise of this submission is that cochlear implants are uncontroversially good, but that just ain't so; a lot of people have objections to cochlear implants themselves or to the way they're pushed onto deaf children.

    The National Association of the Deaf's statement on the implants [nad.org] makes pretty good reading on this topic. They don't come out against the implants themselves, but they do point out a number of problems that they perceive in their use:

    1. The implants are pushed onto parents of deaf children as a "cure" for deafness, when they are at best a tool for deaf people to navigate a hearing world.
    2. The promotion of the implants often comes along with a negative image of deafness, which portrays deaf people as deficient and unable to communicate. The NAD would prefer that deaf people be represented by positive role models of successful deaf people.
    3. The implants require years of very frustrating training for many deaf children to learn to use, and a lot of that time might be better spent on sign-language based education.

    I don't know to what extent this would be a factor for blindness, however. It might well be completely different, because blind people can speak and understand spoken language, so they don't have the same developmental risks that pre-lingual deaf children are subject to if they don't have the chance to learn a full language.

  • by ZuchinniOne ( 1617763 ) on Friday February 19, 2010 @05:47PM (#31204692)

    During World War II some soldiers were given a form of vitamin A that slightly changed the structure of the opsin molecule which the eye uses to detect light.

    This resulted in soldiers being able to see further into the red end of the spectrum, and there are some reports that a few soldiers could even see into the top of the infrared spectrum.

  • by natehoy ( 1608657 ) on Friday February 19, 2010 @05:54PM (#31204818) Journal

    It depends on how it's implemented. Remember, we're taking the actual eye and replacing it. You've got to first reproduce the existing signal, then you've got to figure out how to map any new signals we don't already send.

    First, we have to set aside the practical limitations of current technology and say we've reached a point where we can accurately reproduce the full resolution of human sight AND we have a sensor that can detect ten million discrete colors per pixel, and that we've found a way to tap into the brain's cortex and send all that data in real time.

    So let's say a specific rod or cone received values we mapped from -5,000,000 to +5,000,000 (I realize color is a lot more complex than that, because we have multiple hues and intensities, but a linear scale is a lot easier to deal with for discussion).

    Now we have to figure out how to make that receptor detect -20,000,000 to +20,000,000 to handle limited sections of the UV and IR spectra. The easiest way is to map 0-5 as 1, 6-10 as 2, etc. So we have the same actual number of possible color inputs; we're just making them less precise and spreading them out over a larger range of possible values. You'll "see" the same number of colors, but spread out over a much larger band.

    Maybe our brains can handle more colors than our visual cortex can currently send. If so, there are a lot of dormant brain cells that are going to come online when this data arrives, because there's a lot more data to process to make an image. Either that, or our brains will do what they do anyway - interpolate.

    More likely, we'll have to map the new data to a somewhat-similar-to-current range so our brains can handle it.
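    A tiny Python sketch of that bucketing scheme, using the made-up ranges from this comment (a factor-of-four compression so the extended +/-20,000,000 range folds back into the original +/-5,000,000 one):

      def compress(signal, factor=4):
          """Quantize an extended-range value back into the original range."""
          return signal // factor   # integer bucketing: 0-3 -> 0, 4-7 -> 1, ...

      print(compress(20_000_000))    # far-IR end lands on the old +5,000,000 limit
      print(compress(-20_000_000))   # far-UV end lands on the old -5,000,000 limit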

  • by NotBornYesterday ( 1093817 ) on Friday February 19, 2010 @06:09PM (#31205016) Journal
    Does lack of IR/UV vision stem from a lack of proper optical reception (cones), or lack of neural ability? My guess is that the brain would try and interpret what it is shown, regardless of what our eyes have evolved to do.
  • by Hal_Porter ( 817932 ) on Friday February 19, 2010 @06:15PM (#31205084)

    Oh crap, the upgrade makes the world roll on my fixed frequency NHS eyes.

  • by dido ( 9125 ) <dido&imperium,ph> on Friday February 19, 2010 @06:19PM (#31205134)

    Well, one could also use the imaginary colors [wikipedia.org] that correspond to those particular combinations of cone cell responses in the human eye which cannot be produced by any physical source of light. The human eye has three types of color-sensitive cone cells: short-wavelength (blue), medium-wavelength (green), and long-wavelength (red). The trouble is, the spectral sensitivities of these three types of cone cells overlap, so any physical source of light will excite at least two, most likely all three, types of cones at once, to greater or lesser degrees. The upshot of this is that there are some combinations of cone cell responses that cannot be produced by any physical source of light. For example, a hypothetical light source that excited only the medium-wavelength cones would correspond to a shade of very deep green, but such a light source would require a spectral power distribution with positive power in the green area of the spectrum and impossible negative power in the red and blue areas.

    In short, with artificial color receptors it may be possible to use the cone cell responses that would have corresponded to imaginary colors to represent wavelengths outside the normal human visual spectrum. Or, alternatively, change color perception entirely with non-overlapping spectral-sensitivity curves that cover a much wider band of the electromagnetic spectrum. You'd perceive color quite differently from normal people in that case, though, and that might cause trouble.

    We tend to forget that color doesn't really have a physical reality. It's an ongoing philosophical debate whether color is actually a feature of the world we perceive or a feature of our perception of the world. Replacing our sense organs in this fashion is sure to add more fuel to this debate.
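    A small Python sketch of the cone-response model behind this: each cone's response is the light's spectral power distribution weighted by that cone's sensitivity curve and summed over wavelength. The Gaussian curves below are crude stand-ins for the real (overlapping) LMS sensitivities, not measured data:

      import numpy as np

      wavelengths = np.arange(380, 701, 5)                    # nm, visible band

      def sensitivity(peak, width=40.0):
          # Crude Gaussian stand-in for a cone sensitivity curve.
          return np.exp(-((wavelengths - peak) ** 2) / (2 * width ** 2))

      L, M, S = sensitivity(565), sensitivity(535), sensitivity(445)

      def cone_response(spd):
          """Integrate a spectral power distribution against each cone curve."""
          return tuple(np.trapz(curve * spd, wavelengths) for curve in (L, M, S))

      # Even a narrowband green light excites L strongly (and S a little) along
      # with M, which is why an "M only" response is an imaginary color for real light.
      green_line = np.where(np.abs(wavelengths - 535) < 10, 1.0, 0.0)
      print(cone_response(green_line))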

  • Re:DoE? (Score:3, Interesting)

    by Surt ( 22457 ) on Friday February 19, 2010 @06:19PM (#31205138) Homepage Journal

    But by eliminating the need for artificial lighting with superior eyes, they could get rid of 38% of the US energy usage. Frankly, nothing they can possibly do will have any substantially better yield than that.

  • by Anonymous Coward on Friday February 19, 2010 @07:13PM (#31205718)

    I, for one, would gladly give up the ability to differentiate eggshell from ecru if it meant I could see in the UV and IR spectra, though I strongly suspect the transition would be best done slowly. That much new unfamiliar input introduced all at once might have profoundly unfortunate effects on the human psyche...

    You actually can see, very faintly, in IR. If you wear glasses that are opaque to the visible spectrum but transparent to IR, you can maneuver through the environment just by its heat output. It's dark, but doable.

  • by evilWurst ( 96042 ) on Friday February 19, 2010 @07:35PM (#31205968) Journal

    If you read the linked article, though, they don't see more of the spectrum: their extra receptors are in between red and green. In other words, they see the difference between certain shades of color more accurately than the rest of us, but they don't see any "new" colors that the rest of us can't see.

  • by poopdeville ( 841677 ) on Friday February 19, 2010 @07:43PM (#31206036)

    It depends on how it's implemented. Remember, we're taking the actual eye and replacing it. You've got to first reproduce the existing signal, then you've got to figure out how to map any new signals we don't already send.

    Eh, no. You just send them, and let the brain learn how to decode the signals, just like your brain did when you were an infant. You LEARNED to see in the first place.

  • by Anonymous Coward on Saturday February 20, 2010 @12:16AM (#31207714)

    This is not how color vision works. It works by differential analysis between sections of the eye, by some very fascinating positive and negative feedback loops in the processing cells of the retina. Unfortunately, these retinal electrodes are so large, and the current spread so much, that the signals *swamp* all the local processing cells. Also unfortunately, if you make the electrodes smaller, you increase the resistance and the current density, until you start electrolyzing the vitreous humor trying to localize signals.

    We won't get good visual prostheses until and unless we can get the electrodes onto the optic nerve itself, with far more resolution. For examples of approaches that could conceivably work, take a good look at David Edell's work at http://www.mrs.org/s_mrs/sec_subscribe.asp?CID=2603&DID=110240&action=detail [mrs.org]. But his work would involve avoiding the retina and stimulating the optic nerve; so far, it's really only suitable for muscle nerves.
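    For a sense of what that "differential analysis" looks like in a model, here is a minimal Python sketch of a centre-surround (difference-of-Gaussians) receptive field, the classic textbook approximation of retinal ganglion cell processing; the sigma values are illustrative, not physiological:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def center_surround(image, sigma_center=1.0, sigma_surround=3.0):
          """Respond strongly to local contrast, weakly to uniform illumination."""
          img = image.astype(float)
          return gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround)

      # A flat patch gives ~0 response; an edge or bright spot stands out.
      flat = np.ones((32, 32))
      print(np.abs(center_surround(flat)).max())   # ~0: no local contrast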

  • Re:Blindness Sucks (Score:3, Interesting)

    by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Saturday February 20, 2010 @12:36AM (#31207828)

    There's actually a direct spatial mapping from steradians in the visual field to particular areas on the surface of the visual cortex. Under each "pixel" on the surface, if you will, there are several physical layers that each have a specialized function: one detects lines; another, circles; another, changes in perspective; and another compensates for white-point balancing. These layers then send the processed signals to another portion of the brain for interpretation. (It's not a bad architecture, actually.)

    Many blind people can still "see" with their memories and their imaginations. What happens in this context is that recorded (or synthesized) sensory inputs are fed back into the same areas that process the higher-level processed signals from the eye's "live" feed. Memory, really, is a process of re-perceiving.

    It seems plausible that computers could take over the function of not only the retina, but also the visual cortex and send high-level processed signals directly to the area of the brain responsible for interpreting them.

    Hell, that might be better than normal vision. Imagine knowing more colors [guardian.co.uk] than we are able to naturally perceive, or being able to "see" arbitrarily fine details, as if in a dream. Augmented reality [wikipedia.org] would be trivial.

    All that and more might be possible if we bypass the visual cortex.
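    As an illustration of what a "layer that detects lines" does computationally, here is a minimal Python sketch that convolves an image with two hand-made oriented kernels (Sobel operators) and derives per-pixel edge strength and orientation; this is a textbook approximation for illustration, not the actual cortical wiring:

      import numpy as np
      from scipy.signal import convolve2d

      sobel_h = np.array([[-1, -2, -1],
                          [ 0,  0,  0],
                          [ 1,  2,  1]], dtype=float)
      sobel_v = sobel_h.T

      def line_responses(image):
          h = convolve2d(image, sobel_h, mode="same")   # horizontal edges
          v = convolve2d(image, sobel_v, mode="same")   # vertical edges
          # Per-pixel edge strength and a crude orientation estimate.
          return np.hypot(h, v), np.arctan2(v, h)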
