
The Blind Shall See Again, But When?

Posted by timothy
from the better-repaint-their-room dept.
An anonymous reader writes "Restoring hearing with cochlear implants that replace the inner ear with an electronic version has become standard procedure for many types of deafness. Now it looks like the same thing might happen for many types of blindness. With five national labs funded by the Department of Energy, this third-generation artificial retina promises to enable the blind to see again soon. It has already been successful in over a dozen test patients, but at resolutions too low to do much more than prove the concept. However, if the DoE can perfect this larger version of the artificial retina, the company Second Sight promises to commercialize the implant, aiming for VGA resolution within the decade."
  • Re:Blindness Sucks (Score:1, Informative)

    by Anonymous Coward on Friday February 19, 2010 @04:47PM (#31203834)

    My sympathies for your father's condition (I have had family members who have had strokes, and it's really quite sad and humbling to witness the consequences), but let's be realistic. I'm no doctor, but I'd imagine blindness after a stroke is neurological. Making modifications to the retina isn't going to fix issues in the brain.

    So with that in mind... Blindness doesn't have a single cause, and even if we take some steps in the right direction, it won't be a silver bullet.

  • Re:Blindness Sucks (Score:3, Informative)

    by hamburgler007 (1420537) on Friday February 19, 2010 @04:50PM (#31203890)
    Unfortunately, stroke-induced blindness is more likely the result of brain damage than of damage to the retina.
  • Re:Whoa (Score:3, Informative)

    by binarylarry (1338699) on Friday February 19, 2010 @04:57PM (#31203998)

    [Johnny and Jane have just broken into the computer warehouse]
    Johnny Mnemonic: [swipes a pile of circuit boards and components off the desk and says to no one in particular] I need a Sino-logic 16.
    Jane: [runs around the computer warehouse finding everything he calls for]
    Johnny Mnemonic: Sogo 7 Data Gloves, a GPL stealth module, one Burdine intelligent translator... Thompson iPhone.

  • by SOdhner (1619761) on Friday February 19, 2010 @05:03PM (#31204064) Homepage Journal
    I don't think so - I've met people with this opinion in person. One of them felt so strongly about it that she flat out said that if she had a child who was born deaf, and she knew it could be immediately fixed, she would decline - even though the child would never have been part of deaf culture to begin with.
  • by Fyzzler (1058716) on Friday February 19, 2010 @06:18PM (#31205116)
    There are already people out there who can see more than the normal human spectrum.
    Tetrachromacy [wikipedia.org]

    So I think the potential is already there.
  • by timeOday (582209) on Friday February 19, 2010 @07:08PM (#31205668)
    I am really curious what resolution you would need to simulate human vision. Probably not that much. Our vision is really terrible outside a tiny area (the fovea). We only have 6 or 7 million cones, and each carries well under a pixel's worth of information (they're monochromatic, for one thing, and several might have to fire together to be perceptible - I don't know).

    I'm pretty sure you don't need, for example, the 15 megapixels that a modern SLR gives you; the reason you need so many in an image or a monitor is because you can look anywhere in it, so it has to match your maximum resolution everywhere, even though you can only see a tiny bit of it at once. (This is massively wasteful, so you can achieve great compression if you know where people are looking [nyu.edu].)
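    The parent's back-of-the-envelope argument can be sketched numerically. The following is a rough illustration only: the peak acuity of ~120 samples/degree, the falloff constant E2 of ~2.3 degrees, and the 60-degree field radius are textbook-style assumptions chosen for the sketch, not figures from the article or from Second Sight.

```python
import math

# Hypothetical constants for a back-of-envelope estimate (not from the article):
#   - peak foveal acuity ~60 cycles/degree -> ~120 samples/degree (Nyquist)
#   - acuity falls off with eccentricity e roughly as 1/(1 + e/E2)
#   - a circular visual field of radius 60 degrees
PEAK_SAMPLES_PER_DEG = 120.0
E2 = 2.3            # eccentricity (deg) at which acuity roughly halves
FIELD_RADIUS = 60.0  # deg

def uniform_pixels():
    """Pixels needed if the whole field matched peak foveal acuity,
    as a conventional display must."""
    area = math.pi * FIELD_RADIUS ** 2       # field area in deg^2
    return area * PEAK_SAMPLES_PER_DEG ** 2  # samples/deg^2 * area

def foveated_pixels(rings=6000):
    """Pixels needed if sample density follows the acuity falloff,
    integrated over concentric rings of eccentricity."""
    total = 0.0
    dr = FIELD_RADIUS / rings
    for i in range(rings):
        e = (i + 0.5) * dr                                  # ring midpoint (deg)
        density = (PEAK_SAMPLES_PER_DEG / (1 + e / E2)) ** 2  # samples/deg^2
        total += 2 * math.pi * e * dr * density              # ring area * density
    return total

print(f"uniform display: {uniform_pixels() / 1e6:6.1f} Mpx")
print(f"foveated vision: {foveated_pixels() / 1e6:6.1f} Mpx")
```

    Under these assumptions the uniform display needs on the order of a hundred megapixels, while the foveated estimate comes out around a megapixel - two orders of magnitude less, which is the compression opportunity the linked gaze-contingent work exploits.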

  • Re:DoE? (Score:4, Informative)

    by the gnat (153162) on Friday February 19, 2010 @08:24PM (#31206378)

    Yeah, but shouldn't they pass on their research work to another, more appropriate Department?

    Senior academic scientists don't "pass on their research" unless they're exceptionally well-paid for it, or retiring. To do otherwise would be career suicide.

    To answer the original question: there are a variety of reasons why the DoE maintains other research programs that don't appear at first glance to be related to energy. One is that it's useful to have a sustainable and adaptable academic culture - for instance, the DoE is now putting a great deal of effort (and money) into biofuels, which is both directly related to the core mission of the Department, and dependent on biologists of every kind. If the DoE were strictly limited to physicists, synthetic chemists, and engineers, no one in the organization would have a clue about how to go about starting up a biological research program. You can always hire outsiders, but it is nice to have in-house expertise.

    Another reason is that the very nature both of science and of the DoE labs inherently introduces some mission creep. Because they have always done defense-related work as part of the nuclear weapons program, ever since the Manhattan Project, they have branched into other defense-related areas. The DoE is also probably the world's largest operator of particle accelerators, which have a variety of uses. At some point in the last century, someone figured out that a particular type of electron accelerator called a synchrotron (which the DoE has several of) was most useful as an X-ray generator. As a result, protein crystallographers - biochemists - are some of the most active users of DoE facilities. (This was my background, and I now work for the DoE.) More recently, they've started to work on X-ray lasers, starting with the old Stanford LINAC, and the hope is that these will make possible many new experiments in multiple fields.

    (Keep in mind, the time span over which new methodologies develop is typically multiple decades. The first protein crystallography experiment was in 1937; the first cyclotron was invented in 1929. No one actually solved a protein structure with X-rays until the early 1960s, by which time synchrotrons had been invented. It took another 20-30 years to realize the application of synchrotron X-rays to biology, and another 20 years for their use to become standard. It isn't simply a case of government bureaucrats searching for new fields to move into - although that happens occasionally too. Basic research is often inherently undirected, and you don't necessarily know where you're going to end up when you start.)

    Finally, don't assume that the funding comes entirely from the DoE. The research group that I work for is mostly based at a national lab, but our funding comes almost entirely from the NIH and sponsoring companies.
