Biotech

Ray Kurzweil Does Not Understand the Brain

Posted by CmdrTaco
from the i-want-futurist-business-cards dept.
jamie writes "There he goes again, making up nonsense and making ridiculous claims that have no relationship to reality. Ray Kurzweil must be able to spin out a good line of bafflegab, because he seems to have the tech media convinced..."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Uh (Score:0, Informative)

    by Anonymous Coward on Tuesday August 17, 2010 @11:20AM (#33277030)

    His actual comments [gizmodo.com]

    Read it. Other than the solid date he predicts, it's pretty plausible.

    1. Technology is growing exponentially
    2. The brain isn't some magical soul-endowed jesus box. It's a function of physics

    Here's how that math works, Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

    About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.

    What's so crazy about that?
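    For what it's worth, the quoted arithmetic can be checked directly. A quick sketch in Python — note that the 800 MB → 50 MB compression figure and the implied bytes-per-line-of-code ratio are Kurzweil's assumptions, not established facts:

```python
# Checking the arithmetic Kurzweil is quoted with above. The compression
# figure and bytes-per-line ratio are his assumptions, not established facts.

base_pairs = 3_000_000_000           # human genome: ~3 billion base pairs
bits = base_pairs * 2                # 2 bits per base (4 possibilities)
bytes_uncompressed = bits // 8       # 750 million bytes ("about 800 million")

bytes_compressed = 50_000_000        # his claimed size after lossless compression
brain_bytes = bytes_compressed // 2  # "about half of that is the brain"

BYTES_PER_LINE = 25                  # implied by his "million lines of code"
lines_of_code = brain_bytes // BYTES_PER_LINE
print(bytes_uncompressed, brain_bytes, lines_of_code)
```

    So the numbers are internally consistent; whether the compression step is meaningful is the whole dispute below.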

  • It would be nice.. (Score:5, Informative)

    by djlemma (1053860) on Tuesday August 17, 2010 @11:20AM (#33277034)
    Would be nice if the summary even hinted at what the ridiculous claim actually WAS...

    Namely, that we'll be able to reverse engineer the human brain in the next 10 years.
  • Re:Uh (Score:2, Informative)

    by haydensdaddy (1719524) on Tuesday August 17, 2010 @11:22AM (#33277062)
    Try reading the rest of the article, not just the pretty quoted stuff...
  • by TheCycoONE (913189) on Tuesday August 17, 2010 @11:35AM (#33277246)

    I haven't read the article yet, but Ray Kurzweil is a technology speculator - like a sci-fi writer, except that he doesn't make up a story to go with his ideas and he tries harder to convince people they're actually going to happen. He wrote "The age of intelligent machines" and "The age of spiritual machines", where he takes a hard AI stance that computer thought can become indistinguishable from human thought. He is also a proponent of the technological singularity.

    Generally his ideas aren't taken very seriously in academic computer science, or at least that has been my experience. The philosophy department at my university sometimes enjoyed going over his ideas; then again, that department was also very fond of pseudoscience.

  • Re:Uh (Score:5, Informative)

    by spiffmastercow (1001386) on Tuesday August 17, 2010 @11:46AM (#33277414)
    He actually has one.. And he's a dick, too.
  • If you can program with any programming language without understanding every sublayer beneath it, I don't see why you couldn't do the same with DNA without understanding all the physics and chemistry that makes it work.

    I'm going to go out on a limb here and guess that you're one of the computer scientists with little or no background in neurobiology that Kurzweil has convinced we will magically live forever starting ... now! Listen, unless you're writing science fiction, you should probably stop drawing analogies between two unrelated fields and start reading about our limitations in understanding the human brain.

    Besides, if you read Kurzweil's statement, he's saying we'll reverse engineer the various inputs that can be given to the brain, not the brain in its entirety.

    We can't even do a brain transplant [wikipedia.org] and you're telling me we just need to reverse engineer the 'various inputs' of the human brain? Are you serious?

    PZ Myers has got it all wrong and jumped to ridiculous conclusions.

    PZ Myers [wikipedia.org] has got it all wrong? Well, he's a professor of biology at the University of Minnesota and has a PhD from the University of Oregon so what credentials do you (or even Kurzweil) hold to be commenting in this manner on the indefinite preservation of the human brain?

  • Re:Sounds reasonable (Score:2, Informative)

    by golden age villain (1607173) on Tuesday August 17, 2010 @11:58AM (#33277580)

    I've never read anything hinting that the way to simulate a human brain would be to simulate how the molecules in the brain behave. [...] From the number of neurons in the human brain, considering how many interconnections there are and how fast the neurons can fire, I think a machine with one million processing cores at 1 GHz would have approximately the same data handling capacity as a human brain. The rest is software. Neural network software is pretty much routine stuff; the tricky part is learning what the interconnections between the neurons are.

    Neurons constitute about half of the brain. About the rest we know pretty much nothing at all (compared to neurons). What we do know is that this "rest" plays a role in several aspects of information processing and learning, and in all major diseases of the CNS, and that it cannot be reduced to spiking neuron models. If it were possible to model the brain as a relatively simple neural network, we would have done it 15 years ago; our inability to do so has a lot to do with modeling the behavior of even a small number of molecules. I am not an expert programmer, but I know enough to say that the complexity of the brain is way beyond anything any human being has ever designed.

  • by spongman (182339) on Tuesday August 17, 2010 @12:08PM (#33277692)

    why does the platypus always need explaining?

    it is the sole remaining species in the Genus Ornithorhynchus and the Family Ornithorhynchidae. along with the echidnas (do they need explaining, too?) they make up the Order Monotremata, the egg-laying, web-footed, electrolocating mammals. they evolved, just like the rest of us.

    if there had only been one remaining species of marsupial, would they need explaining?

  • by jonored (862908) on Tuesday August 17, 2010 @12:08PM (#33277694)

    The "3 billion base pairs are 6 billion bits" isn't because each pair has two parts; it's because each pair has four possibilities. 3 billion digits in base 4 is equivalent to 6 billion digits in base 2.

    For instance, decimal 15 is "33" in base 4 and "1111" in base 2. You could think of it as one bit for which base pair is at this point in the chain, and one bit for which orientation it's in.
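    The parent's point can be sketched in a few lines of Python; the particular bit assignment per base is arbitrary, any fixed 2-bit mapping of the four bases works:

```python
# Any fixed 2-bit code for the four bases works; this mapping is arbitrary.
BASE_BITS = {'A': '00', 'C': '01', 'G': '10', 'T': '11'}

def encode(seq):
    """Encode a DNA sequence as a bit string: exactly 2 bits per base."""
    return ''.join(BASE_BITS[b] for b in seq)

print(encode('GATTACA'))  # 14 bits for 7 bases
```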

  • Re:Uh (Score:2, Informative)

    by Khazunga (176423) on Tuesday August 17, 2010 @12:09PM (#33277706)

    As a result, the genome alone cannot possibly tell you how to "make" an organism...

    Untrue. The genome gives you the instructions for a self-modifying program that eventually produces a human brain. All biology experiments so far seem to point to the fact that the DNA chain indeed contains all the information needed to build an organism.

  • Re:Uh (Score:1, Informative)

    by Anonymous Coward on Tuesday August 17, 2010 @12:15PM (#33277792)

    Posting in an epic thread!

  • by Animaether (411575) on Tuesday August 17, 2010 @12:36PM (#33278100) Journal

    why does the platypus always need explaining?

    because it's an odd-looking creature that seemingly has 'random' bits and pieces from various other animals... the full question being "what specific selections in 'natural selection' led to this particular evolutionary path?"

    So the question isn't to 'explain' the platypus.. that would be like asking to explain the number 5 or explain the color red.. the question itself doesn't make any sense without being more specific.

    It's also not a question of 'why does the platypus exist?' - natural selection was already the answer, and can even be thought up by people on their own; clearly if it exists, it had some benefit being exactly the way it is within the environment it is in.. if it weren't well-adapted to that environment, it would have died out a long time ago (presuming the environment didn't radically change).

    To be honest, the fully expanded question is actually an interesting one - and one which biologists and others continue to try to answer to more detail to this day. I hadn't actually looked into Platypus info since I was a kid (a school project on its venom, along with other animals' venom), and wiki tells me it was only discovered in 2004 that the Platypus has -10- sex chromosomes, and its genome mapped fully only as recently as 2008. Seems to me there's plenty of questions left.

  • Re:Sounds reasonable (Score:5, Informative)

    by Eponymous Bastard (1143615) on Tuesday August 17, 2010 @12:39PM (#33278158)

    PZ Myers threw a red herring there. What Kurzweil says is pretty reasonable, he used the total amount of information in the genome to get an upper limit estimate of the amount of library code needed to simulate a brain. I say "library" to differentiate from data, since a lot of our brain information comes from our experiences, i.e. library == instincts.

    Actually he's right. The statement is pure bullshit.

    Or maybe that's too much. Kurzweil just doesn't understand how Kolmogorov complexity works.

    Let's say the brain as a machine is the output of a process. How complicated is that process? The Kolmogorov complexity of a string (or whatever) is the minimum size of the data that you have to give to a machine in order to produce the string. E.g. a string of 100 0s is simpler than a string of alternating 0s and 1s, and simpler than encoding the first 100 digits of pi. Write code for each of those and you'll see the measure works (and it's actually a lower limit, but it's the closest concept...)
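    You can get a feel for the parent's examples with a general-purpose compressor; a quick sketch, keeping in mind that zlib output size is only a crude upper-bound stand-in for Kolmogorov complexity (which is uncomputable):

```python
import zlib

# zlib is a crude stand-in for Kolmogorov complexity, but it shows the
# ordering the parent describes: highly regular 100-character strings
# compress to far fewer than 100 bytes.
zeros = '0' * 100
alternating = '01' * 50

def compressed_len(s):
    return len(zlib.compress(s.encode()))

print(compressed_len(zeros), compressed_len(alternating))
```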

    But the crucial point is that the size of this string depends on the kind of machine. The size of the input (program) for a Turing machine is very different than that for an actual computer.

    So, yes. 800MB of code. But that's not the library code. The library that interprets that program is the egg that grows those 800MB of data into a human, together with all the laws of physics and chemistry involved in the process.

    Take all the chromosomes encoding a whole human genome and drop it into a test tube of distilled water. Does it grow a brain? What if you put it into a chicken egg. What grows out? Putting those 800 MB into a computer doesn't do anything if you don't provide the equivalent of the egg. The bootstrap structure and the underlying architecture are as important as the code in understanding the whole system.

    Myers is right. In order to understand the human brain directly from the genes, you have to understand all the chemistry that interacts with it and all the self-replicating machinery provided by the mother, and simulate that at a molecular level.

    So the upper bound is NOT 800 MB. It's 800 MB plus the size of a codebase good enough to simulate every interaction at an atomic level plus a full 3D scan at an atomic level of the egg provided by the mother. Or simplified models of all those things, provided by the chemists and biologists out there, as Myers points out. (Plus data equivalent to a few years of training like we do with children)

    Not saying that simulating the brain is necessarily that hard; it's just that Kurzweil's pseudo-scientific measurement is bullshit.

  • by Khazunga (176423) on Tuesday August 17, 2010 @12:45PM (#33278238)

    Moore's law, the basis of his argument that technology is evolving exponentially, is pretty much on schedule. We are now in the petaflop (10^15) range, with the transistor count following the predicted exponential [readwriteweb.com].

    Cost of DNA sequencing, another of his examples, is today at 0.000008(USD) per base pair [scienceblogs.com]. Fits the curve.

    RAM cost is now at 28,000 kB/USD, also fitting the curve.

    GDP per capita is also within schedule [www.bit.ly] (note that the scale is logarithmic), even with the wealth transfer east (which is likely limited to another ten years, give or take).

    And, lastly, life expectancy [www.bit.ly], the core of all attacks on Kurzweil, is also on track.

    You may still believe these exponentials will hit some kind of ceiling. That might be true. The numbers, however, support Kurzweil's theory. And judging from the number of times Moore's law's demise has been announced in the last twenty years, I'd put my money on Kurzweil.
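    The Moore's-law part of the parent's argument is just a doubling model. A minimal sketch, where the 1971 Intel 4004 baseline (~2,300 transistors) and the 2-year doubling period are commonly cited figures used here as assumptions:

```python
# Moore's law as a bare doubling model. The 1971 Intel 4004 baseline
# (~2,300 transistors) and the 2-year doubling period are commonly
# cited figures, used here as assumptions.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{transistors(2010):.2e}")  # ~1.7e9: the right order for 2010-era CPUs
```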

  • Actually, I'd disagree with the "roughly what a newborn can do". By the time a baby is born, it has a non-zero number of neural connections. These are not coded for anywhere in the DNA and the exact dynamics of how they do form isn't clear to me (if it's known at all). A newborn has roughly twice the number of connections than an adult brain, according to some estimates I've seen. Some will die off as new ones form, but the net result is a die-back. There is then a massive construction phase in the brain between the ages of around 7-18, with a second die-back at about 22-24. The adult brain then forms connections at a slower pace. The net result is a massively structured brain, not a loose collection of cells.

    What you'd get if you followed this approach of Kurzweil's is the equivalent of stuffing a few billion zygotes together. Not a terribly useful approach. Well, actually it's even worse than that. Kurzweil's approach involves only the genome. It ignores everything we know about mechanisms external to the genome that control how any specific codon is expressed - or if it even is. We already know that this metadata is capable of selecting which protein any given sequence codes to, or whether the sequence is used at all, but Kurzweil has assumed that coding is absolutely 1:1 and that all sequences are always used. Arguably, this metadata is a form of data compression, since getting the data into a form for which a 1:1 mapping always holds would involve a substantially longer strand of DNA. I think we can assume that the sum total of all information sources for DNA is a Turing Machine, but it is a Turing Machine with parametrized functions, self-modifying code and highly obscure flow control. And given how many strands of DNA there are in any given cell, it is also a Turing Machine that is part of a colossal Beowulf Cluster where the node count is comparable to anything in the Top 500 list. Further, nucleic DNA is not the only DNA in cells. Human cells have only nucleic DNA and mitochondrial DNA, but there are other organelles and other cell mechanisms which play a role.

    So, this is not quite the same as using zygotes. Zygotes are more complex than his description, as they DO use the metadata and are running massively in parallel at the protein level. Maybe it's closer to throwing a few billion prokaryotes together. Even there, there's still far more complexity than allowed for in Kurzweil's model, but damnit, I can't find any simpler self-replicating form of life! How the hell am I supposed to come up with a perfect analogy when there is nothing organic remotely as primitive and ultra-simplistic as Kurzweil? (I was going to say Kurzweil's description, but then realized that this would be more accurate phrasing.)

  • by phantomfive (622387) on Tuesday August 17, 2010 @01:33PM (#33278968) Journal
    Dude, accepting as you do that the equations of quantum mechanics and general relativity are precise models of the universe (they aren't, but let's leave that for now), it actually would be that simple to simulate the entire universe. The thing that's really going to kill you is the data. You have a universe full of it to simulate. Where are you going to get that kind of RAM? See also [xkcd.com].
  • by WWWWolf (2428) <wwwwolf@iki.fi> on Tuesday August 17, 2010 @01:35PM (#33279006) Homepage

    wouldn't stupid and silly ideas like hard AI and the singularity count as "IT woo"?

    IT is not immune to woo, and indeed, it's good that we actively point out how ridiculous IT woo would be [zapatopi.net]. That link is parody, but on a far more serious note, some years ago, I heard of an actual, real software product that claimed to be able to cast spells. (I was, like, "duhhhhhhh, Perl is free, this product is obviously an attempt to take your money and run.") And don't even get me started about all those shady companies that rebrand free and open source software and sell them as new, awesome software products. (I saw one company that rebranded GIMP as "Real Estate Photo Editor for Realtors" or something silly like that. I wish I was kidding.)

    But seriously...

    Information technology is a tool to facilitate solving real-world problems. It's a mysterious tool in that most people have no idea how it works. As long as mysteries are involved, people can be scammed. It doesn't even have to be a particularly clever ruse, as the rebranded FOSS shows; information technology has a lot of layers, starting from hardware and going all the way up to organisational/societal matters, and there are always plenty of mysteries for laymen to figure out.

    Things like hard AI and singularity are more in the "plausible but highly impractical - read: bloody impossible in practice, and any claims of success should be scrutinised heavily" category. On one end of the woo scale we have ridiculous crap like the famous cell phone battery-life extension stickers [slashdot.org]*, on the other end we have heavy-duty stuff like hard AI. It's like a whole spectrum of weirdnesses ranging from funky crap that promises free energy to heavy-duty stuff like cold fusion.

    In short: Yes, IT woo exists. Yes, it's ridiculous.

    * waiiiit... Slashdot discussion that mentions stickers fixing antenna reception issues - in 2005? This is fascinating.

  • by Khazunga (176423) on Tuesday August 17, 2010 @02:12PM (#33279502)

    Test your theory against US data. The infant mortality rate in '80 was 13 per thousand births; in '06 it was 6 per thousand births. In a set of a thousand people, you had seven data points that lived zero years (worst-case scenario) and now show up as living 80 years (again, worst case). The effect of better infant mortality rates comes down to 80*7/1000 years = 7 months (average scenarios produce 5 months). Meanwhile, in the same period, life expectancy went from 74 to 78 years.

    Better infant mortality rates explain 14% of the increase in life expectancy. Where does the rest come from? Better car safety? Perhaps, but certainly with lower effect than infant mortality. The rest? Medical technology.
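    The parent's arithmetic checks out; here it is spelled out in Python (the 80-year lifespan credited to each "saved" infant is the parent's worst-case assumption):

```python
# Reproducing the parent's estimate. The 80-year lifespan for each
# "saved" infant is the parent's worst-case assumption.
imr_1980, imr_2006 = 13 / 1000, 6 / 1000   # US infant deaths per live birth
life_exp_1980, life_exp_2006 = 74, 78      # US life expectancy, years

gain_from_infants = (imr_1980 - imr_2006) * 80   # years added per person
share = gain_from_infants / (life_exp_2006 - life_exp_1980)
print(f"{gain_from_infants * 12:.0f} months, {share:.0%} of the total gain")
```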

  • by PopeRatzo (965947) * on Tuesday August 17, 2010 @04:07PM (#33281070) Homepage Journal

    Acupuncture is both too poorly understood to have any real body of scientific knowledge

    Actually, in human trials spanning hundreds of years and hundreds of thousands (if not millions) of subjects, it's been shown to be effective.

    If that's not science, I don't know what is. The mechanism of acupuncture is well understood by its practitioners, but the terms in which that mechanism is described are different from the terms that western medicine uses. Further, acupuncture has been proven effective enough for it to be used (and taught) at the best teaching hospitals in the US, including Mass Gen. If you attend a lecture on acupuncture at, say, Northwestern Medical School here in Chicago, you'll find that there is no hesitation to use the terms by which the Chinese describe those mechanisms: yin and yang.

    The problem comes in the language that's used. Systemic language - meta-terms like "yin" and "yang" - is not the usual language of medical science. It describes entire systems, entire sets of properties, instead of discrete measurements such as "viral load". Because the overlay of western medical terms onto these meta-states is not familiar to many people, those people just assume that it "can't be scientific".

    If particle physicists can use "charm" and "strangeness" I don't see why there's such a problem with "yin" and "yang".

  • by HiThere (15173) <{ten.knilhtrae} {ta} {nsxihselrahc}> on Tuesday August 17, 2010 @04:21PM (#33281256)

    It's one of about four or five conventional mechanisms. Generally each of the versions leads into the others, however, so there's no real conflict between them.

    E.g., neural interfaces connected via a development of the Internet could yield an "overmind" capable of addressing problems that cannot now be addressed.

    Or, those same neural interfaces connected to a computer via an advanced programming interface could enable the development of programs not currently possible by using feedback to stabilize one's thoughts while editing. And this could be used to...

    Or molecular biology & genetic engineering could yield more intelligent people who...etc.

    Or ...

    Plug in whatever your favorite technological improvement is, and see how it advances the rate at which things change.

    The singularity happens when the rate of change becomes too fast for us to deny that we can't keep up with it. I'd say it's well underway, but I don't expect it to arrive in full-blown form until sometime after 2020, and I project 2030 as when it runs totally out of control (i.e., under its own control ... or just uncontrolled, depending on exactly what form it takes).

    If we had a government that was ethical and gave serious concern to the quality of the lives of its citizens, then I'd be doing everything I could to slow down the arrival of the singularity. (Bar forcing civilization to collapse.) As it is, however, if we can manage to live through the transition phase, the singularity may be the only hope of survival for humanity. (My estimate of the odds isn't cheery, hovering around 40%, but my estimate if people stay in control of civilization heads towards zero within a century or two.)

  • by khallow (566160) on Tuesday August 17, 2010 @06:26PM (#33282658)

    Life expectancy continues to go up only because infant mortality goes down. Among those who reach adulthood, life expectancy has barely moved in the past 50 years. Among those who reach elderly age (70+) life expectancy has been nearly constant for all of human history.

    According to this site [infoplease.com], for white males, it's gone up from 10 years life expectancy at age 70 in 1949-1951 to 13 years life expectancy in 2004. In absolute terms, that's not a big difference, but it's a 30% increase in life expectancy. The improvement in life expectancy at age 10 for the same demographic group has gone up from 59 more years to 66 years, an improvement of 7 years. Either figure indicates your assertion is incorrect since the overall life expectancy from birth has gone up from 66 to almost 76 years in that time. So we're seeing a 10 year increase in total life expectancy along with a three year increase in life expectancy for a 70 year old.
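    The parent's percentages can be verified directly from the figures quoted (numbers taken from the comment itself, not re-checked against the linked table):

```python
# Figures as quoted in the comment (white US males), not re-checked
# against the linked table.
at70_1950, at70_2004 = 10, 13        # remaining years at age 70
at_birth_1950, at_birth_2004 = 66, 76

pct_increase_at70 = (at70_2004 - at70_1950) / at70_1950
print(f"{pct_increase_at70:.0%} increase at 70, "
      f"{at_birth_2004 - at_birth_1950} years at birth")
```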

  • by Bigjeff5 (1143585) on Tuesday August 17, 2010 @06:29PM (#33282694)

    Just for reference, the GCC compiler is pushing 1.5 million lines of code. Windows XP supposedly had 40 million lines of code.

    Kurzweil is literally saying that the human brain is 2/3 as complex as a C compiler, and 1/40th the complexity of Windows XP.

    Complete lunacy.

  • Except he didn't (Score:3, Informative)

    by Namarrgon (105036) on Tuesday August 17, 2010 @09:18PM (#33284014) Homepage

    Kurzweil didn't make that ridiculous claim in the first place, despite Myers' third-hand assumptions.

    It was just an aside pointing out that the brain's overwhelming complexity all stems from a few million bytes' worth of DNA, implying a significant level of replicated structure, and certainly not a suggestion that we could derive a whole working brain from it.
