Medicine Biotech

Artificial Brain '10 Years Away'

SpuriousLogic writes "A detailed, functional artificial human brain can be built within the next 10 years, a leading scientist has claimed. Henry Markram, director of the Blue Brain Project, has already built elements of a rat brain. He told the TED global conference in Oxford that a synthetic human brain would be of particular use finding treatments for mental illnesses. Around two billion people are thought to suffer some kind of brain impairment, he said. 'It is not impossible to build a human brain and we can do it in 10 years,' he said."
This discussion has been archived. No new comments can be posted.

  • by basementman ( 1475159 ) on Thursday July 23, 2009 @01:05AM (#28791733) Homepage

    I'm still waiting on the scores of cancer cures that have been promised over the past decade. Talk is cheap.

  • Re:don't believe it (Score:5, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Thursday July 23, 2009 @01:05AM (#28791735) Journal
    I assume that we'd basically adopt a strategy of "enlightened plagiarism": use our (nontrivial) imaging and structural analysis technology to get the best idea we can of the structure of a real brain (without necessarily understanding what it does, or why it is structured as it is). Simulate that structure. If it acts like a real brain, break out the party hats. If it doesn't, try to figure out why, tweak, and try again.

    Being able to build very complex models, based on what we do know, would be extremely valuable in telling us whether or not we are looking at the right structural details, and whether or not we are missing something (and, if so, the difference between our simulation and the real thing).
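
    A toy version of that simulate/compare/tweak loop, just to make the shape of the idea concrete. Everything here is a stand-in: the "imaged" connectivity, the neuron dynamics, and the similarity threshold are all assumptions, since obtaining the real structure and a meaningful comparison metric is exactly the hard part.

        # Illustrative sketch of a "copy the structure, simulate, compare, tweak" loop.
        # All inputs are fabricated stand-ins for data we cannot actually obtain yet.
        import numpy as np

        rng = np.random.default_rng(0)

        def simulate(weights, steps=200):
            """Crude rate-model dynamics on a copied connectivity matrix."""
            state = rng.random(weights.shape[0])
            trace = []
            for _ in range(steps):
                state = np.tanh(weights @ state)   # stand-in for real neuron dynamics
                trace.append(state.copy())
            return np.array(trace)

        def similarity(a, b):
            """Compare simulated and recorded activity (here: correlation of mean rates)."""
            return float(np.corrcoef(a.mean(axis=0), b.mean(axis=0))[0, 1])

        n = 100
        imaged_weights = rng.normal(0, 0.1, (n, n))                # pretend this came from imaging
        real_activity = simulate(imaged_weights + rng.normal(0, 0.01, (n, n)))  # "recordings"

        weights = imaged_weights.copy()
        for attempt in range(50):
            if similarity(simulate(weights), real_activity) > 0.95:
                break                                              # acts like the real thing: party hats
            weights += rng.normal(0, 0.005, (n, n))                # tweak and try again
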
  • by Anonymous Coward on Thursday July 23, 2009 @01:09AM (#28791753)

    Saying that there are morals involved with experimenting with an "artificial brain" that's just a computer program - that one state is more moral than another - is superstitious nonsense. It's just bits, FFS. Do you also believe that 13 isn't just a number, that it's "unlucky"?

  • Re:10 years? (Score:5, Interesting)

    by setagllib ( 753300 ) on Thursday July 23, 2009 @01:15AM (#28791801)

    It's very simple to see why this happens. When you start a project, or even just a stage of a project, you have some list of problems and you may even have some idea of the solutions. You can use good judgement to estimate the time it takes (at least to some order of magnitude), and rounding off to 10 years makes for good press.

    But when you actually begin the work, every problem you solve illuminates a whole new set of problems to solve. If each solution opens up more than one new problem, you've "increased" the amount of work left to be done. So either you cut back on some of the goals (to reduce the list of problems) or you admit it wasn't as simple as you thought and announce a new project to tackle some subset of the new set of problems.
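
    The dynamic described above is geometric growth in disguise. A tiny toy model (the branching factor of 1.5 new problems per solved problem is an arbitrary assumption) shows why the backlog grows even as you close items:

        # Toy model of scope creep: solving one problem reveals `branching` new ones on average.
        # With branching > 1, the open list grows no matter how much work you finish.
        def open_problems_after(solved, start=10, branching=1.5):
            open_count = start
            for _ in range(solved):
                open_count = open_count - 1 + branching   # close one, discover `branching` more
            return open_count

        for solved in (0, 10, 50, 100):
            print(f"{solved:3d} solved -> {open_problems_after(solved):5.1f} still open")
        # 0 -> 10.0, 10 -> 15.0, 50 -> 35.0, 100 -> 60.0: the more you finish, the more remains.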

  • by mcrbids ( 148650 ) on Thursday July 23, 2009 @01:24AM (#28791859) Journal

    "Human rights" aren't terribly well grounded, theoretically; but to the degree that they are, mental complexity seems to be a vital factor(given that we don't generally execute retarded people, it isn't the only one, but it is a big one). Being made of meat isn't obviously a salient factor, nor is being born to human parents.

    Our all-seeing, all-wise, all-knowing creator has foreseen this, having created the stars in the heavens, and the children on the earth, and the trees in the mountains, and the birds of the air. He hath foreseen this, and so hath written: "Thou shalt have no other gods before me!" There's your answer. It's somewhere between the "Thou shalt kill people who work on Sundays" part, and the "Hey neighbors, don't mess with my buddies here, have at my virgin daughters..." part, if I remember correctly...

    I have every confidence that, armed with this wisdom bestowed upon us by an angry, merciful, wise, and loving god as this, we shall have no trouble at all dealing with the ethical dilemmas brought upon us by the singularity.

    If you download your brain into a robot and turn it on, then take an axe to it, are you killing yourself? If not, would the robotic copy of you that was seeing the axe come down agree with you?

  • by miggyb ( 1537903 ) on Thursday July 23, 2009 @01:30AM (#28791889) Homepage
    I'd argue the opposite. I don't think being human has anything to do with the outer shell. I, for one, use my body as a way to get my head to important places. A virtualized brain would still be self-aware and capable of having real, human emotions, in exactly the same way you or I do.
  • Re:don't believe it (Score:5, Interesting)

    by Jurily ( 900488 ) <jurily&gmail,com> on Thursday July 23, 2009 @01:42AM (#28791969)

    The brain is a self-modifying learning machine. Until you can build a self-modifying learning machine, you can have all the structure you want; it won't be functionally equivalent to a human brain.

  • by enFi ( 1401137 ) on Thursday July 23, 2009 @01:44AM (#28791983)
    Moreover, if the brain is simulated well enough, it will certainly appear self-aware. Even if there is a difference (such as it not having a soul), that's not something we can (so far) experimentally determine, and therefore any metaphysical postulations are, or should be, beside the point in the question of ethical behavior towards the simulation.
  • Re:don't believe it (Score:1, Interesting)

    by Jurily ( 900488 ) <jurily&gmail,com> on Thursday July 23, 2009 @01:52AM (#28792027)

    Do we know enough to say that with confidence?

    Tell you what: tell me how that thing with the car keys works (you know, the one where you look at the table three times and it isn't there, you search for it for 10 minutes elsewhere, and suddenly you see it right there where you looked before), and I'll believe you.

  • Re:10 years? (Score:3, Interesting)

    by buchner.johannes ( 1139593 ) on Thursday July 23, 2009 @01:59AM (#28792077) Homepage Journal

    "The future is already here - it is just unevenly distributed. " -- William Gibson
    People are still awaiting ubiquitous computing to come, but for some countries (Singapure, Korea), it is already here. G Bell [uci.edu]

  • Humans are different (Score:1, Interesting)

    by JakartaDean ( 834076 ) on Thursday July 23, 2009 @02:03AM (#28792095) Journal

    If you download your brain into a robot and turn it on, then take an axe to it, are you killing yourself? If not, would the robotic copy of you that was seeing the axe come down agree with you

    I'm old school, I guess, but I think there is an unbridgeable chasm between computer software and human intelligence. We have no problem killing (relatively intelligent) pigs, for example, for food. I would put bits and bytes much lower on my own personal "value scale" than pigs. I simply cannot believe that a computer program is worthy of respect as a life form. I know this idea has been popular on sci-fi shows for the last few decades, but I don't get it at all. Although I'm a die-hard atheist, I distinguish between a living being and a program, and don't believe a computer program can feel pain. This is all bullshit, as far as I can tell. Dean

  • Re:don't believe it (Score:2, Interesting)

    by AmigaHeretic ( 991368 ) on Thursday July 23, 2009 @02:06AM (#28792121) Journal

    Maybe we can build the *equivalent* of a human brain (number of neural connections in software, silicon or combination), but we don't even know how the thing functionally works as it is. How are we going to model it?

    Hi-Resolution MRI. Just scan someone's real brain and then load it onto the computer. We don't even need to know how a 'real' brain works.

  • Re:Awesome (Score:1, Interesting)

    by Anonymous Coward on Thursday July 23, 2009 @02:10AM (#28792143)

    All we have to do is wait for the "brain" computer to blue screen... They're your zombie horde in the making.

  • Re:$10,000 (Score:5, Interesting)

    by Tubal-Cain ( 1289912 ) on Thursday July 23, 2009 @02:14AM (#28792165) Journal
    Put it here [longbets.org] instead.
  • by JuzzFunky ( 796384 ) on Thursday July 23, 2009 @02:19AM (#28792189)
    Consciousness is more than just a mental state. It is a state of being. A key component of emotions is that they emerge as a direct result of physical embodiment. For example, the emotional state of fear [wikipedia.org] feels the way it does because of how our bodies react when we are frightened.

    Fear is often preceded by astonishment, and is so far akin to it, that both lead to the senses of sight and hearing being instantly aroused. In both cases the eyes and mouth are widely opened, and the eyebrows raised. The frightened man at first stands like a statue motionless and breathless, or crouches down as if instinctively to escape observation. The heart beats quickly and violently, so that it palpitates or knocks against the ribs... That the skin is much affected under the sense of great fear, we see in the marvellous manner in which perspiration immediately exudes from it... The hairs also on the skin stand erect; and the superficial muscles shiver. In connection with the disturbed action of the heart, the breathing is hurried. The salivary glands act imperfectly; the mouth becomes dry, and is often opened and shut.
    - Charles Darwin, The Expression of the Emotions in Man and Animals

    I do not doubt that an artificial brain could become self-aware, but for it to experience real, human emotions it would need to be embodied in an equivalent way.

  • by twostix ( 1277166 ) on Thursday July 23, 2009 @02:27AM (#28792227)

    And if this silicon brain decides that it's had enough of being experimented on?

    And what if they don't turn your "pain receptors" off? What if they specifically want to experiment on you to see how much pain you can endure? If you think that medical scientists don't often do brutally unethical experimentation on "lesser" humans, you'd be very, very wrong (though since the '90s it's gotten much better in the West). As if they're going to care about a brain that they *created*. In fact I can see that as a selling point: "see, we can do these horrid experiments on this artificial brain so that we don't have to do it on orphans, prisoners and the institutionalised - like we used to".

    Then again if you were regarded as a sentient being would they then have to keep you alive for the rest of eternity lest they be charged with murder if they turn you off or delete you?

    If you create a sentient being you have a responsibility to that being and no you can't just kill it if you get bored with it or it just doesn't meet your expectations, otherwise there would be a hell of a lot more infanticide.

  • Re:don't believe it (Score:5, Interesting)

    by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Thursday July 23, 2009 @02:49AM (#28792331) Homepage

    Tell you what: tell me how that thing with the car keys works (you know, the one where you look at the table three times and it isn't there, you search for it for 10 minutes elsewhere, and suddenly you see it right there where you looked before), and I'll believe you.

    What's so special about that? The human eye can only see a very tiny fraction of your field of view in focus; everything else is very blurry and pretty much impossible to recognize unless you already know it's there. On top of that, your eye has a blind spot, and everything in it is completely invisible. Your pattern recognition also doesn't work 100% perfectly: if you see something upside down instead of the way you expect it, you might not recognize it, or not recognize it fast enough, and your eyes might have moved on before the key was recognized.

    Or to sum it up: the brain actively recognizes only a very tiny fraction of the world; everything else is interpolation and guesswork, and if your key hides in the latter part, you won't find it, especially if you don't expect it there. Seen this [youtube.com]? Pretty much the same thing.

  • Re:don't believe it (Score:3, Interesting)

    by ikkonoishi ( 674762 ) on Thursday July 23, 2009 @02:57AM (#28792371) Journal

    MRI can't get high enough resolution. You need to be able to image it on a molecular level. MRI would just tell you the structure of the brain. It's like saying you could play a copy of a video game if you had an accurate listing of the files in the game directory.

  • what? (Score:4, Interesting)

    by Anonymous Coward on Thursday July 23, 2009 @03:15AM (#28792485)

    So, I have all these 10000x10000 TIFFs I just took of a real brain. Now what?

    Guess what I mean is, the brain is not the same from one minute to the next. It modifies itself constantly. We may be able to copy the parts (although I'm pretty sure we're more than 10 years away from that), but until we can make it "run", all we have is a stopped engine. What good would that do?

    Unless what we want is a brain _model_, which is what I think is meant by the article.

  • by TheLink ( 130905 ) on Thursday July 23, 2009 @03:20AM (#28792519) Journal

    The transplant thing has been observed, but so far I think it's only anecdotal evidence (maybe a bunch of people made stuff up, but for now I'll accept the reports at face value). I'm not aware of any big research effort on it.

    But I won't be surprised if scientists finally find out that your organs (or transplanted organs) can influence what sort of foods/drinks you'd want to consume[1], or even who you want to mate with. It does make some sense from an evolutionary advantage point of view.

    [1] Like fried chicken and beer: http://linkinghub.elsevier.com/retrieve/pii/S1096219000000135 [elsevier.com]

    And if your entire immune system can change after a liver transplant, it means you're not just getting a liver - it's not quite so "neat and clean" as that.

    http://www.dailytelegraph.com.au/news/teen-changes-immune-system/story-e6frf00r-1111115390103 [dailytelegraph.com.au]

    So if the donor's stem cells manage to leak out and help form neurons in the recipient's brain or "stomach brain"[2], why shouldn't there be changes?

    [2] The Enteric Nervous System:

    http://www.psychologytoday.com/articles/199905/our-second-brain-the-stomach [psychologytoday.com]
    http://en.wikipedia.org/wiki/Enteric_nervous_system [wikipedia.org]

    Who is the boss? From the point of view of the ENS, the "central nervous system" (aka brain/CNS) might just be a means to keeping the ENS satisfied.

    ENS to CNS: "Hey CNS, go eat a double cheeseburger!"
    CNS: "Hmm, I feel like eating a double cheeseburger, let's do a lot of complicated stuff like driving, walking etc. so that I can eat that".

    Of course the CNS could say, "Must resist, have to stick to diet".

  • Re:don't believe it (Score:5, Interesting)

    by kdemetter ( 965669 ) on Thursday July 23, 2009 @03:23AM (#28792527)
    There is another explanation (related more to not finding something, whether or not it's close by):

    Often, in stressful situations, the mind will think the same thing over and over, rather than thinking about something else.
    It's the reason you keep opening that same closet, even though you've looked there a hundred times. Then, when you finally give up, your mind is free to think again, and you can remember where it is.

    This is because the brain makes various connections to areas in the brain, depending on past experience.
    For instance, I might have gotten a drink and then accidentally put my keys on top of the fridge. You might not remember this until you give up your search and pour out a drink, which may activate that part of the brain, making you remember.
  • by StripedCow ( 776465 ) on Thursday July 23, 2009 @04:11AM (#28792719)

    But I'm sure you would object to having your brain replaced by a (small) supercomputer, even if I guaranteed that the "observable you" would not change.

  • Depends (Score:3, Interesting)

    by Namarrgon ( 105036 ) on Thursday July 23, 2009 @04:23AM (#28792795) Homepage

    Depends on what you mean by "functionally equivalent". A neural net is a simple self-modifying learning machine, and any detailed simulation of a network of actual neurons like the one TFA describes would certainly qualify.
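
    To make that definitional point concrete: even a one-neuron perceptron "modifies itself" by rewriting its own weights from experience. A minimal sketch (nothing brain-like about it, purely to illustrate that the bar set by "self-modifying learning machine" is, in the literal sense, low):

        # Minimal self-modifying learning machine: a perceptron that rewrites its own
        # parameters from experience, here learning the AND function from examples.
        samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
        w = [0.0, 0.0]
        bias = 0.0
        lr = 0.1

        for _ in range(50):                                   # a few passes over the data
            for (x1, x2), target in samples:
                out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
                err = target - out
                # the "self-modification": the machine adjusts its own parameters
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                bias += lr * err

        print([1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0 for (x1, x2), _ in samples])
        # should settle on [0, 0, 0, 1] once the weights converge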

  • by Dr. Spork ( 142693 ) on Thursday July 23, 2009 @04:37AM (#28792849)
    No, an even sillier fallacy, committed by people who vaguely remember reading about Searle's Chinese room thought experiment in Phil 101, is exactly the one you made. Nobody has thought of a good reason why a perfect functional duplicate of your brain wouldn't have the same thought-content as your brain. Of course, if it's made of different stuff, it will need different anesthetics - and you can't get it drunk with ethyl alcohol either. But that's your argument for why it's not intelligent? I hate to break it to you, but your brain is processing signals. That's all it does. I assume you're conscious, but if so, it's because of the signal-processing in your brain.
  • Re:don't believe it (Score:3, Interesting)

    by crazybit ( 918023 ) on Thursday July 23, 2009 @04:45AM (#28792897)
    "Formal scientists" don't even consider Psychology a science, but "an academic and applied discipline involving the systematic, and often scientific, study of human/animal mental functions and behavior". [wikipedia.org]

    Since psychology doesn't qualify as "real science", how can "scientists" duplicate the machine that controls most of human behaviour?

    The brain itself operates on the edge of chaos [newscientist.com]; it is also the organ that controls the minds of philosophers, musicians, painters, and artists. Computers only emulate "left-side" brain functions - they "take pieces, line them up, and arrange them in a logical order; then it draws conclusions" [web-us.com] - they can beat Einstein at calculus, but they can't create art and inventions like Mozart or Da Vinci.

    I believe and embrace science, but I am also aware of its own limitations. Science, like computers, is merely a tool - good for some jobs but not for all.
  • Re:don't believe it (Score:1, Interesting)

    by Anonymous Coward on Thursday July 23, 2009 @06:45AM (#28793357)

    Hi-Resolution MRI. Just scan someones real brain and then load it onto the computer. We don't even need to know how a 'real' brain works.

    I'm sorry, but "hi-resolution MRI" has a resolution of 1.5mm to 2mm. MILLIMETERS. Axons are measured in microns. You need to use electron microscopy in order to obtain a map of neurons.

    Even assuming it was somehow possible to magically get enough hi-resolution images to show individual neurons, we can't just "load it into a computer" -- we couldn't possibly STORE all the images it would require to fully map someone's brain with current technology (some rough numbers are sketched below).

    This IS being worked on, but considering it took many years to map out the entire circuit of C. elegans... it's not going to happen without some radical new imaging technology. Plus, C. elegans was mapped by HAND. People are working on using machine learning to automate the process, but it's still early work...

    The point is, the IBM guys are always claiming things like this, and the neuroscience community has always laughed at them. We will certainly be able to build complex artificial brains in 10 years, but they won't be "human" in any meaningful sense. They will just be experimental platforms to study different functional models of neural networks.
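
    The rough numbers behind the resolution point above, using only the figures cited in the comment (so treat them as illustration, not measurement):

        # Back-of-the-envelope resolution gap, using the figures cited above.
        mri_voxel_m  = 1.5e-3       # ~1.5 mm voxel for "high resolution" structural MRI
        axon_scale_m = 1e-6         # axon diameters are on the order of a micron

        linear_gap = mri_voxel_m / axon_scale_m    # how many times too coarse, per axis
        volume_gap = linear_gap ** 3               # and per voxel of volume

        print(f"linear gap: ~{linear_gap:,.0f}x, volumetric gap: ~{volume_gap:.1e}x")
        # ~1,500x per axis, ~3.4e9x by volume: one MRI voxel contains billions of
        # micron-scale voxels, hence the need for electron microscopy.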

  • by spage ( 73271 ) <`moc.egapreiks' `ta' `egaps'> on Thursday July 23, 2009 @07:08AM (#28793449)

    Go re-read Neuromancer to see how all this turns out. Every time you turn the damn artificial brain on it's the same deadpan backseat driver.

    It was disturbing to think of the Flatline as a construct, a hardwired ROM cassette replicating a dead man's skills, obsessions, kneejerk responses. ...

    He slotted some ice, connected the construct, and jacked in.
    It was exactly the sensation of someone reading over his shoulder.
    He coughed. "Dix? McCoy? That you man?" His throat was tight.
    "Hey, bro," said a directionless voice.
    "It's Case, man. Remember?"
    "Miami, joeboy, quick study."
    "What's the last thing you remember before I spoke to you, Dix?"
    "Nothin'."
    "Hang on." He disconnected the construct. The presence was gone. He reconnected it. "Dix? Who am I?"
    "You got me hung, Jack. Who the fuck are you?"

  • Re:10 years? (Score:2, Interesting)

    by Wagoo ( 260866 ) <wagooNO@SPAMdal.net> on Thursday July 23, 2009 @07:21AM (#28793521)

    It's interesting to apply these kind of calculations to the human brain, to understand the scale of the thing.

    From TFA, to simulate a single cortical column:

    "You need one laptop to do all the calculations for one neuron," he said. "So you need ten thousand laptops." Instead, he uses an IBM Blue Gene machine with 10,000 processors.

    Okay, so there are about 20 billion neurons in the neocortex alone, so about 2 million cortical columns, assuming 10,000 neurons in each. Even the mighty Moore's law from 2005 (Blue Brain's construction) to 2019 isn't going to cover an increase of 2 million times for the kind of supercomputer you can construct at that time (see the back-of-the-envelope sketch below). So it's already relying on things like GPGPU to supersede Moore's law.

    Storage is another problem. Even simple representations of a neuron, its position, state, and all of its connections get huge when you multiply them by 20 billion.

    Blue Brain is trying to do a chemically accurate simulation of the brain, which as stated could very well be useful for testing new drugs and so on. But I don't expect this kind of heavy simulation to be the first thing to gain consciousness-like properties. We need to use the data generated by Blue Brain to build simpler models of neurons and cortical columns that behave in comparable ways, and then construct our artificial brain from those. Of course, connectivity is another issue...
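
    The back-of-the-envelope arithmetic behind the comment above, using only the numbers quoted in it (plus one assumed per-neuron storage figure, flagged in the code), so it is an illustration rather than a forecast:

        # Rough scaling check for the numbers quoted above (illustrative only).
        neurons_neocortex  = 20e9          # ~20 billion neurons in the neocortex
        neurons_per_column = 10_000        # one Blue Brain cortical column
        columns_needed = neurons_neocortex / neurons_per_column      # ~2 million columns

        processors_per_column = 10_000     # the Blue Gene figure cited in TFA
        processors_needed = columns_needed * processors_per_column   # ~2e10, naively

        doublings = (2019 - 2005) / 2      # Moore's law: a doubling every ~2 years
        moore_factor = 2 ** doublings      # ~128x over 14 years, vs ~2,000,000x needed

        print(f"columns: {columns_needed:.0e}, naive processor count: {processors_needed:.0e}")
        print(f"Moore's-law gain 2005-2019: ~{moore_factor:.0f}x")

        # Storage: even ~1 KB per neuron (an assumed figure) for position, state and a
        # connection list is ~20 TB before storing anything per synapse.
        per_neuron_bytes = 1_000
        print(f"minimal per-neuron state: ~{neurons_neocortex * per_neuron_bytes / 1e12:.0f} TB")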

  • Re:don't believe it (Score:3, Interesting)

    by Ichoran ( 106539 ) on Thursday July 23, 2009 @08:20AM (#28793835)

    It is theoretically possible, but the ~1000 cubic centimeter mass of the brain requires approximately 8*10^21 voxels of (5 nm)^3 imaging data just to get the structure -- and that misses all of the proteins that are essential to get it to work, and we don't know how to turn those 80000 exabytes into anything useful for computation without going through it by hand (the arithmetic is sketched below).

    For the time being, it is "theoretically possible, practically impossible" to do it that way. And it will remain so for longer than ten years.
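
    For the curious, the parent's voxel and storage figures do reproduce from the numbers given (the bytes-per-voxel value is an assumption chosen to match the stated ~80000 exabytes):

        # Reproducing the estimate above: ~1000 cm^3 of brain imaged at (5 nm)^3 voxels.
        brain_volume_m3 = 1000e-6          # 1000 cubic centimetres, in m^3
        voxel_edge_m    = 5e-9             # 5 nm
        voxels = brain_volume_m3 / voxel_edge_m ** 3
        print(f"voxels: {voxels:.0e}")                          # ~8e21

        bytes_per_voxel = 10               # assumed, to match the ~80000 exabyte figure
        print(f"raw data: ~{voxels * bytes_per_voxel / 1e18:.0f} exabytes")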

  • by Ichoran ( 106539 ) on Thursday July 23, 2009 @08:25AM (#28793871)

    I wouldn't say Markram's opinion is in the majority. I know the field well, and I think that either he's sitting on a lot of super-ultra-exciting results that he has mysteriously not presented at the last conferences his team went to, or, more likely, he is being hopelessly optimistic (or is confusing "years" and "centuries", or something).

  • by SparkleMotion88 ( 1013083 ) on Thursday July 23, 2009 @10:31AM (#28795113)

    If I was a silicon brain you could just back me up.

    But how does it help you if there happens to be some copy of you somewhere? If you were killed and that copy was restored, would it be you? Or would it just be a copy that resembles you? The scary thing about this question is that to all the observers (including the copy), the copy is you, and no harm has been done, even though the original "you" is dead.

    I often think about this issue in terms of "Star Trek"-style transportation. That is, a person is converted into energy, and that energy is then sent somewhere and reconstructed. But that energy represents information, and you could just as easily scan a person and send that information elsewhere to make a copy while leaving the original person in place. So essentially what would happen with "transporting" is that a person is scanned, destroyed, and then re-constructed somewhere else. The re-constructed person has all the memories of the original person, so to him, he was simply "transported." All observers would also say that the person was transported. However, the original person no longer exists. This sort of transporting could happen over and over and nobody would have any evidence that people are being killed.
