
Ray Kurzweil's "The Singularity is Near" 970

popo writes "The Wall Street Journal has a (publicly accessible) review of "The Singularity is Near" -- a new book by futurist, Ray Kurzweil. By "Singularity", Kurzweil refers not to a collapsed supernova, but instead to an extraordinarily bright future in which technological progress has leapt by such exponentially large bounds that it will be... well, for lack of a better word: 'utopian'. "Mr. Kurzweil... thinking exponentially, imagines a plausible future, not so far away, with extended life-spans (living to 300 will not be unusual), vastly more powerful computers (imagine more computing power in a head-sized device than exists in all the human brains alive today), other miraculous machines (nanotechnology assemblers that can make most anything out of sunlight and dirt) and, thanks to these technologies, enormous increases in wealth (the average person will be capable of feats, like traveling in space, only available to nation-states today)." On one hand its fantastically (even ridiculously) optimistic, but on the other hand, I sure as hell hope he's right." Got mailed a review copy; I'm not finished yet, but I agree - optimistic perhaps, but the future does look pretty interesting.
  • by Associate ( 317603 ) on Monday October 03, 2005 @10:22AM (#13704310) Homepage
    Things have pretty much sucked up to this point.
    • by MindStalker ( 22827 ) <mindstalker AT gmail DOT com> on Monday October 03, 2005 @10:29AM (#13704401) Journal
      Yes, I'd much rather be plowing fields daily and walking through the snow to the store on foot. Oh yeah, things may seem sucky, but only in comparison to your wishes. Trust me, utopia will be sucky too, such is the vastness of human desire.
      • by Itchy Rich ( 818896 ) on Monday October 03, 2005 @10:48AM (#13704644)

        Trust me, utopia will be sucky too, such is the vastness of human desire.

        Exactly. Without a significant shift in human cultures, no utopia can happen, whether we have the technology or not. We could never agree on what utopia should be like, and would fight about it.

        • by vertinox ( 846076 ) on Monday October 03, 2005 @01:20PM (#13706146)
          We could never agree on what utopia should be like, and would fight about it.

          Not unless that Utopia involves virtualization so that everyone could simulate their own Utopia based on whatever they felt should be the case... Then let the robots deal with the problems of the real world. You know that would make a good movie... Oh wait...
        • You know, it used to be that people who wrote books like these were called science fiction writers. Only they added an interesting story, and usually used concepts they themselves had thought of. And if they did a near-future extrapolation, they usually thought of the good, the bad and the ambiguous (i.e. everything).

          This guy has just ripped a few ideas from popular sci-fi, penned them down in a 'this will happen' fashion, and is now raking in the bucks. But then again, he is a futurist.

          I'll have to explain t
          • by LnxAddct ( 679316 ) <sgk25@drexel.edu> on Monday October 03, 2005 @02:55PM (#13706944)
            Ray Kurzweil is a renowned "futurist" who has accurately predicted the future literally hundreds of times. He sometimes is even responsible for it happening, e.g. he created the first synthetic instruments, the first electronic book reader for the blind, the first robot that creates truly original art, and a robot that writes poems inspired by other poems (from what I understand, he really just uses an elegant Markov chain; a minimal sketch of that idea follows this comment), and he is currently one of the industry leaders in Artificial Intelligence research. He owns something like 12 corporations and is a millionaire not because he is a crazy lunatic, but because he is often accurate and good at what he does. In addition to the above, he is often paid hefty sums of money to do consulting for Lockheed Martin and some other major companies. This guy is no joke; take what he says seriously.
            Regards,
            Steve
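
            A minimal word-level Markov chain text generator, for the curious. This is only a sketch of the general technique in Python, under the assumption that "Markov chain" here means sampling each next word from the words observed to follow the current one; it is not Kurzweil's actual Cybernetic Poet.

            # Minimal word-level Markov chain text generator -- a sketch of the
            # general technique only, not Kurzweil's Cybernetic Poet.
            import random
            from collections import defaultdict

            def build_chain(text):
                """Map each word to the list of words observed to follow it."""
                words = text.split()
                chain = defaultdict(list)
                for current, following in zip(words, words[1:]):
                    chain[current].append(following)
                return chain

            def generate(chain, start, length=20):
                """Random-walk the chain to produce a new word sequence."""
                word, output = start, [start]
                for _ in range(length - 1):
                    followers = chain.get(word)
                    if not followers:
                        break
                    word = random.choice(followers)
                    output.append(word)
                return " ".join(output)

            corpus = "the rose is red the violet is blue the rose is sweet and so are you"
            print(generate(build_chain(corpus), "the"))

            Trained on a corpus of poems instead of one line, the same random walk produces new lines that locally resemble the source material, which is presumably the "inspired by other poems" effect.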
      • Johnny Cash: "Satisfied Mind":

        How many times have
        You heard someone say
        If I had his money
        I could do things my way

        But little they know
        That it's so hard to find
        One rich man in ten
        With a satisfied mind


        (this is what Bud was listening to in his trailer in Kill Bill 2)

        WTF do people WANT? There are people who do nothing but go to parties all day, being chauffeured around and catered to at every turn, who are MISERABLE. Conversely, there are people who literally shovel shit all day who are happy as clams. Jesus H. C
        • Well, it's just my bullshit theory, but I'd say that the shit-shoveler is happier because he has simple, physical challenges often met and bested. The modern well-to-do life gains a sense of ennui and purposelessness because the inadequately-evolved human animal is still freaking out over needing food, shelter, and crushing competition, even if everything is better than fine. Without real challenges to stimulate and satiate the hunting urge, petty trifles fill in the space with just as much gravity.
      • by Poeir ( 637508 ) <poeir.geoNO@SPAMyahoo.com> on Monday October 03, 2005 @12:25PM (#13705650) Journal
        You mean something like this [partiallyclips.com]?
      • Time for a quick review:

        The Technology: Nuclear Power

        The Promise: Cheap, clean, safe, plentiful electric power.

        The Reality: Expensive power with waste we don't know how to deal with, but it does have the added bonus of creating by-products that can be turned into horrible weapons of mass destruction.

        The Technology: Robots

        The Promise: Sit back in your easy chair and let Robby the Robot mow the lawn and take out the trash while you relax and have a beer.

        The Reality: Sit back in the unemployment li

      • by hernyo ( 770695 ) <laszlo.hermann@gmail.com> on Monday October 03, 2005 @01:08PM (#13706029)
        You are damn right. So much technological progress, yet we know almost nothing about our brains. We understand how 20 million transistors work together to form a computer, but we have no idea what makes us love or hate each other.

        It is only the environment that has changed in the past 100 years, not people's lives. Just take the following basic stuff: love, work, power, friendship, kids, getting old, etc. People now have the same problems, or similar problems in a different context. Technology does not really change our lives, it changes only the circumstances.

        It's just like hoping to get your favourite pancake in nicer packaging next year. Let's say an easier-to-open box, instant delivery... is there anyone out there who believes this would mean a significant change in his life? Yes, unfortunately...
    • by LWATCDR ( 28044 ) on Monday October 03, 2005 @10:36AM (#13704508) Homepage Journal
      " Things have pretty much sucked up to this point."
      Yeah, we still have thousands of children with polio in iron lungs...
      Actually the world is a pretty good place in most developed countries. It is even a lot better than it was 50 years ago in the developing countries.
      The correct way to look at it is not that the present sucks, but how can we make the future better.
      • by isomeme ( 177414 ) <cdberry@gmail.com> on Monday October 03, 2005 @11:29AM (#13705073) Journal
        As William Gibson remarked (quoting from memory), "The future is here. It's just not evenly distributed."
    • Grim Meathook Future (Score:4, Informative)

      by spoonyfork ( 23307 ) <<moc.liamg> <ta> <krofynoops>> on Monday October 03, 2005 @03:24PM (#13707176) Journal
      For 99% of the world the future is only going to get worse. Read about the Grim Meathook Future [zenarchery.com] foretold by yet another "futurist".

  • by Hulkster ( 722642 ) on Monday October 03, 2005 @10:22AM (#13704315) Homepage
    The writer of the WSJ piece was Glenn Reynolds, who is identified as "a professor of law at the University of Tennessee" but is probably better known for his InstaPundit.Com blog. [instapundit.com] Interesting piece - Glenn has been published numerous times in the WSJ and (staying out of politics, because people get overly zealous about this) writes some darn good stuff IMHO.

    HULK's Halloween decorations webcam is up! [komar.org]

  • Optimism sells... (Score:5, Insightful)

    by ankarbass ( 882629 ) on Monday October 03, 2005 @10:22AM (#13704323)
    You can sell more copies of a book that talks about how we will all be rich and immortal than you can of one that predicts more of the same.
    • by isomeme ( 177414 ) <cdberry@gmail.com> on Monday October 03, 2005 @10:38AM (#13704532) Journal
      I don't know; Jared Diamond seems to be selling a lot of copies of Collapse.
    • by archen ( 447353 )
      Which brings up the point: do you really WANT to live 300 years? We already tend to go downhill after our 20's, and each decade after is compounded by more health problems. Now some people will claim that uber-nanotechnology and some franken-science will keep us in great shape, but simply put, every part of our body wears out with time.

      We seriously can live pretty long as it is. If you can't live it up in the first ~70 years, you're probably not going to get more out of the next 230. Not to mention
      • We already tend to go downhill after our 20's

        More like your teens, buddy. GH (growth hormone) levels peak and flatten during your teens, and the decline in GH levels in general corresponds to how hard and fast you live. Vigorous athletes tend to have a later, longer peak and flattening period. But by high school commencement, it's all downhill, buddy.

        Evolution only protects you until you can make babies, then you're on your own.
      • by DrLex ( 811382 ) on Monday October 03, 2005 @11:32AM (#13705113) Homepage
        Apparently, many people want to reach or approach a state of immortality. I can understand why, but if it's in the sense of extending human life as it is now to an unlimited life span, I'll pass on it. I bet that this desire of becoming some immortal human being is mostly rooted in egoism, which causes most people to assume that the rest of the world will stay mortal when they become immortal. Which will, of course, be true to some extent since none of the less developed countries will be able to profit from whatever technology makes immortality possible.
        But eventually, the world (be it earth or all planets we might make habitable) will be filled with immortal people, unable to procreate because there is no more room nor resources for more people. They will be doomed to either continue living with the same people eternally, kill each other, or commit suicide. No thanks.
        • by Golias ( 176380 ) on Monday October 03, 2005 @12:20PM (#13705602)
          Apparently, many people want to reach or approach a state of immortality. I can understand why, but if it's in the sense of extending human life as it is now to an unlimited life span, I'll pass on it. I bet that this desire of becoming some immortal human being is mostly rooted in egoism, which causes most people to assume that the rest of the world will stay mortal when they become immortal. Which will, of course, be true to some extent since none of the less developed countries will be able to profit from whatever technology makes immortality possible.
          But eventually, the world (be it earth or all planets we might make habitable) will be filled with immortal people, unable to procreate because there is no more room nor resources for more people. They will be doomed to either continue living with the same people eternally, kill each other, or commit suicide. No thanks.


          We will miss you.

          What's wrong with having the same people around "eternally"?

          There's... what? Six billion of us? Even if you figure that more than half of those people are assholes, that's still almost three billion people worth having as friends. It would take a long time to get acquainted with them all (and sift them out from said assholes.) Just learning all the languages we would need to learn to all talk to each other fluently would take one or two of what we used to consider lifetimes.
      • by The Lynxpro ( 657990 ) <lynxpro@NOsPam.gmail.com> on Monday October 03, 2005 @11:40AM (#13705191)
        "And as a side point, the world progresses by generations. The additude and bias of the last generation is replaced by the fresh more adapted views of the next generation. As a whole, humanity grows by death of the old, and birth of the new. Think your government representitives are bad now, then think of what would happen if a guy who was born in 1750 was making the decisions on stuff like the Internet"

        Would you rather live in a *Logan's Run* civilization where you have to be "renewed" at the ripe age of 30? (Yes, I realize the age was lower in the book.)

        And oh my... the tyranny of living under the rule of someone who has lived a long time. Seems like that's what we tolerate today here in the U.S. under the Constitution.

        I also think there are several figures from the 18th Century that could easily function in the 21st (and later) and our society would be better if they still lived. I'm thinking about Ben Franklin and Voltaire in particular.

        Militarily, just imagine if the military minds of Julius Caesar, Alexander and Cromwell commanded on today's battlefields.

        Your post really discredits people from the past and cheapens their individual contributions.

        • by sedyn ( 880034 ) on Monday October 03, 2005 @12:29PM (#13705675)
          "Militarily, just imagine if the military minds of Julius Caesar, Alexander and Cromwell held commanded in today's battlefields."

          Code-wise, picture if the old COBOL programmers of today were kept in the workforce for another dozen decades. I think it's a shame that a language as old as my father is still being used by him at his age. Likewise if I'm still using C++ when I'm nearing 50.

          Old -> legacy -> entrenchment... The only escape is when cost(refactoring_to_new) < cost(maintaining_old)... which is starting to happen in the case of COBOL due to the aging of that generation...

          Not to say that old things are bad; it's just that they were typically solutions for their day. Picture this: one day (probably within our lifetimes), people might look at Java as an efficient language. It sounds kinda funny to us, but go back 30 years and tell an assembly programmer that C is efficient.
  • Semi-topical link. (Score:3, Insightful)

    by TripMaster Monkey ( 862126 ) * on Monday October 03, 2005 @10:24AM (#13704341)

    For those of you who enjoy fiction, Accelerando [accelerando.org] by Charles Stross [antipope.org] is one of the best fictional treatments of the Singularity I've had the pleasure of reading. In Accelerando one of the characters refers to the Singularity as the 'rapture of the nerds'. Great stuff.

    Seriously, though, will we be able to actually pinpoint a time and say 'this is when the Singularity occurred'? I'm sure that a person from the 19th century, when confronted with the complexity of life today, would contend that the Singularity has already happened, but this time is still (largely) comprehensible to us. As time marches on, and things become steadily more complex, won't humans, augmented by increasing levels of technology, maintain at least a cursory connection?
    • There are several different types of Singularities postulated by the various SF authors who have been involved in popularizing the term over the last few decades. In Vinge's original Singularity, in Marooned in Realtime, the entire human race (minus a few people in stasis bubbles) simply vanished--uploaded, transcended, no one knew. In Stross' novels, the main marker is usually the awakening of a superhuman AI.
    • by meringuoid ( 568297 ) on Monday October 03, 2005 @10:41AM (#13704568)
      Seriously, though, will we be able to actually pinpoint a time and say 'this is when the Singularity occurred'?

      I shouldn't think so. Whenever singularities appear in any model of the real world, it generally means a breakdown of the model. So this singularity means an acceleration of technological advance to a point where our ability to forecast breaks down and we really can't say what will happen.

      A singularity would have it that we get ever-accelerating advance, heading skyward to infinity at some finite time. I dislike, therefore, forecasts that the singularity will bring utopia. It need not. The singularity could very easily bring extinction. It could bring hell on earth. It could bring a tyranny beyond the dreams of 1984, in which no proletarian revolt could ever succeed because we've all got Seven Minute Specials waiting to go off inside us. To be quite honest, I think our best hope is extinction, but leaving successors - which is, let's face it, the best hope of any species that there ever was. In addition, I don't mind whether this means our genetically enhanced, cybernetic, hyperevolved biological descendants, or our superintelligent quantum-computing AI offspring. What do I care about DNA, after all? A sentient robot I might build is as much my offspring as a human child I might father.

      I agree with the concept of the singularity - there are advances coming whose impact on society we won't be able to predict until it happens - but not that it will necessarily be good.

    • by braindead ( 33893 )
      Actually my favorite singularity fiction comes from Vernor Vinge. I think he actually came up with the singularity idea [caltech.edu] - the link goes to a 1993 talk in which he presents the idea.

      I don't know whether we'll reach that singularity he talks about, but I really enjoy his books, for example the early True Names [earth.li], or more recent books such as A Deepness in the Sky [caltech.edu] or A Fire Upon the Deep [caltech.edu]. These last two are my two favorite science fiction books.

      And, no, I'm not affiliated with V. Vinge.
  • Technology (Score:3, Insightful)

    by mysqlrocks ( 783488 ) on Monday October 03, 2005 @10:24AM (#13704344) Homepage Journal
    Yes, but all of his wonderful technology could be used by people that want to preserve their own power and wealth. Why does he assume that it will be used for "good" purposes? Look at nuclear energy, for example. It's a powerful source of energy but the same technology is used to make nuclear weapons.
    • Let them try! (Score:3, Informative)

      by mangu ( 126918 )
      this wonderful technology could be used by people that want to preserve their own power and wealth

      Hmmm, like digital recording technology could be used by people who want to preserve their "intellectual property"? Just wait for the nanotech napsters and emules. When there's a "Mr. Nanoassembler" in every kitchen the concept of wealth itself will be changed to something we cannot understand today.

      Why does he assume that it will be used for "good" purposes?

      I haven't read Kurzweil's book, but from TFA it se

  • by dancingmad ( 128588 ) on Monday October 03, 2005 @10:25AM (#13704353)
    Where is my flying car?

    Get on it. I was promised one more than 50 years ago.
  • by bigtallmofo ( 695287 ) on Monday October 03, 2005 @10:26AM (#13704367)
    extraordinarily bright future in which technological progress has leapt

    This really sounds like one of those "In the year 2000, people will be..." predictions. If this type of thing were remotely true, I'd be driving a hover car to work right now. And yes, I know they exist, but I don't know a single person who has even the remotest possibility of owning one.

    I guess you have to come up with this kind of thing to sell books or articles. I would imagine nobody would be buying a book envisioning the year 2025 as pretty much the same as today with more hard disk space and faster CPUs.

    • Rewind your brain 15 years and imagine what you'd think if I told you (a quick sketch of the compounding involved follows at the end of this comment):

      Your computer will be roughly 1,000 times faster than what you're using today. You will probably have more than 4,000 times the memory, and a fast hard drive that stores over 100,000 times as much as that floppy you're using. You can buy these supercomputers for less than $500 at Wal-Mart.

      That computer will be hooked into a self-directed network that was designed by the Department of Defense and various universities - along with nearly 400,000,000 other machines. Your connection to this network will be 10,000 times faster than the 300 baud modem you're using. In fact, it will be fast enough to download high-quality sound and video files in better than realtime.

      There will be a good chance that your computer's operating system will have been written by a global team of volunteers, some of them paid by their employers to implement specific parts. Free copies of this system will be available for download over the hyperfast network. You will have free access to the tools required to make your own changes, should you want to.

      You will use this mind-bendingly powerful system to view corporate-sponsored, community-driven message boards where people will bitch about having to drive cars that are almost unimaginably luxurious compared to what you have today.

      Remember: in some fields, the singularity has already happened.
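
      A quick sketch of that compounding, assuming (purely as the textbook rule of thumb) a fixed 18-month doubling time; the doubling time is an assumption for illustration, not a measured figure:

          # Compounding a fixed doubling time over 15 years.  The assumed 18-month
          # doubling is the classic Moore's-law rule of thumb, not measured data.
          years = 15
          doubling_time = 1.5                      # years per doubling (assumption)
          growth = 2 ** (years / doubling_time)
          print(f"~{growth:.0f}x over {years} years")   # ~1024x, i.e. roughly 1,000x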

      • The Intellectual Property/information singularity is happening right now, if anyone cares. At this exact moment. From now on, there will be pre-2000 knowledge, and post-2015 knowledge, or whenever this ends.

        Not just copyrights, either. Patents are getting a shake-down, and remember when people had trademarks instead of google rankings?

        Remember when there were corporations dedicated to providing 'news'? Remember when people who uncovered some secret, globe-spanning government conspiracy would race to mail it to a trusted person, or a newspaper reporter, and hope they didn't end up dead, instead of just posting it on the net and everyone knowing about it one hundred and twenty seconds later when their RSS feeds updated?

        Remember when there was a lot of information out there, like mapping phone numbers to addresses or the location of secret government installations in the middle of nowhere, and it was hard to find? Remember that? When we knew information existed, yet couldn't immediately find it?

        There used to be buildings you could go to to find out who was the king of England in 1293, and what the capital of Chad is, and who pitched the first recorded no-hitter in MLB. (Edward I, N'Djamena, and Nixey Callahan, which I looked up in less than one minute.) I think they were called 'liberbies' or something. Remember when you used to have to go to them?

        If any industry starts spinning wildly for no apparent reason, with pieces flying off left and right, it's probably in the middle of a singularity.

      • by JohnPM ( 163131 ) on Monday October 03, 2005 @12:33PM (#13705705) Homepage
        I agreed with most of your post, except:

        Remember: in some fields, the singularity has already happened.

        The point of the singularity idea is that advancement is going to get so fast that we can't keep track of it, control it or predict what life will be like afterwards. None of that has been true about computing yet.
        • I'd have to disagree: I don't think anyone really knows what is going on in all the various fields of computing; the field has gotten too big. Likewise, 10-year predictions about computing made 10 years ago were drastically off. If you can't predict 10 years out, I call that pretty unpredictable. And control it? Not even China, the most authoritarian regime around, can control what people are doing with computers in their country.
  • by manonthemoon ( 537690 ) on Monday October 03, 2005 @10:27AM (#13704371) Homepage
    The problems aren't with the technology. We already have "utopian"-level technology compared to 80 years ago. The problem is with the people.

    Look at Russia. Rampant alcoholism, suicide, murder, gangsterism, etc. Yet it is perfectly capable of sending off spaceships and creating high-level technology.

    I appreciate and welcome all the anticipated advances - but unless we create a worldwide civil society that is robust, honest, and representative, it won't make a dime's worth of difference.
    • by cowscows ( 103644 ) on Monday October 03, 2005 @10:45AM (#13704612) Journal
      There are certainly social and political problems keeping a lot of people from improving their quality of life, but I think the whole point of the singularity is that technology will eventually reach a point where there's just no "good" reason for everyone not to be involved.

      We've got lots of really cool stuff now, but much of our economy is still based on scarcity. Energy is not free, and the people who control the methods of production have a lot of influence. And so they want to keep it that way. The same thing is true of many raw materials.

      But even more than that, there's the labor issue. I don't think anybody's personal utopia involves spending all day out in the sun building roads, but we require that a whole lot of people do that, and other crappy jobs, because it's the only way we have to get it done. The fact that some jobs are crappier than others creates some weird social layering. If there comes a point in the future where we could have machinery efficiently do all those jobs, then things can probably change.

      But yeah, it won't be easy, it won't just magically happen because of any particular invention. But technology will continue to make it more likely.
  • Huh? (Score:3, Insightful)

    by Otter ( 3800 ) on Monday October 03, 2005 @10:27AM (#13704379) Journal
    Naturally, Mr. Kurzweil has little time for techno-skeptics like the Nobel Prize-winning chemist Richard Smalley, who in September 2001 published a notorious piece in Scientific American debunking the claims of nanotechnologists, in particular the possibility of nano-robots (nanobots) capable of assembling molecules and substances to order. Mr. Kurzweil's arguments countering Dr. Smalley and his allies are a pleasure to read -- Mr. Kurzweil clearly thinks that nanobots are possible -- but in truth he is fighting a battle that is already won.

    The battle has been "won" in that "nanotechnology" has been repackaged to refer to "really small stuff", rather than to Drexlerian nano-assemblers. I'd be interested in reading what Kurzweil says (although I give the benefit of the doubt to chemists with empirical data over "futurists") but it's not like anyone has successfully demonstrated anything approaching Diamond Age proportions.

  • by imsabbel ( 611519 ) on Monday October 03, 2005 @10:27AM (#13704380)
    "singularity" says nothing about "bright future" or "utopia" per sé, but more descripes a point where the ever increasing innovation rate makes predictions impossible.
  • by L. VeGas ( 580015 ) on Monday October 03, 2005 @10:28AM (#13704381) Homepage Journal
    Pessimist: "That glass is half empty."
    Optimist: "That glass is half full."
    Kurzweil: "The self-cloning milk in that glass will replicate thanks to nanobots and end world hunger."
  • by MosesJones ( 55544 ) on Monday October 03, 2005 @10:28AM (#13704387) Homepage
    Iain M Banks (to be confused with the non-sci-fi writer Iain Banks) has written a lot of books about "The Culture", a man/machine symbiosis that has created a utopian society in which people get what they need.

    Actually it also sounds like Robert Heinlein, Asimov and most other sci-fi writers I've ever read. But mostly like Iain M Banks, whose books are a cracking read.

    Living to 300... of course we will, we'll have to work till we are 280 though.
    • Side note (Score:3, Informative)

      by edremy ( 36408 )
      You do know that Iain Banks and Iain M. Banks are the same person, right?

      He uses the M for his SF stuff and drops it for his more mainstream fiction.

    • Iain M Banks (to be confused with the non-sci-fi writer Iain Banks)

      I presume that:

      1. You meant "not to be confused with ..."
      2. You understand that Iain Banks and Iain M. Banks are the same writer, and that he uses the different names to differentiate his works.

      I agree 100% with your assessment of his writing, both SF and 'social'. Try out "Whit" or "The Business" for some really well-told tales that don't feature exploding planets.

  • by Anonymous Coward on Monday October 03, 2005 @10:29AM (#13704410)
    From what I hear, the "peak oil" [peakoil.org] crisis stands a decent chance of obliterating human society as we know it before any of this wonderful stuff can happen. I would love it if someone would make a good argument why this isn't the case, but I've yet to hear one.
    • Most of the people who disagree with the peak oil thesis tend to rely on "the market" being the holy savior. While it may be true that "the market" will help out, the thesis states that by the time the market does self-correct, many, many people will be thrust out of the middle class due to increasing debt (based, in part, on the housing bubble) with some serious inflation due to higher energy costs.

      Others cling to fission as a way to generate energy until the renewables (including fusion) are up to s
    • We will never run out of oil because we'll switch to something else first. What will that be? I have no idea but I know that it will be better than oil, just as oil has been better than horsepower. Why am I so confident? Because we managed to make the transition from horses to cars without the end of the world happening. I'll bet that if you go do the research, you'll find predictions of how human society would crash because there simply wasn't enough space to grow the hay to feed all the horses needed
      • You're right that we probably won't run out of oil... but supply is no longer meeting demand and it'll never again catch up, which is what Hubbert's Peak is all about. There's a lot of oil left in the ground, but developing it keeps getting more expensive and the rate of development can't keep pace with the demand. The result is that prices are going to keep going higher /until/ we are sufficiently far along in the process of "switch[ing] to something else". And that process has barely started and thus i
    • by Surt ( 22457 ) on Monday October 03, 2005 @12:26PM (#13705654) Homepage Journal
      Here's a short argument: gas prices tripled in the last 5 years, but society didn't collapse. As prices rise higher and higher, people will push and invest more and more in oil alternatives. Already there are at least 4 major oil alternatives that could power our society within 5 years if we were sufficiently desperate: solar, wind, fission, fusion. We also aren't yet making a lot of one-time investments at a rapid rate, which we could if we got desperate enough, such as replacing all of our lighting with LEDs and replacing older energy-gobbling computers.

      The bottom line is that we're working on efficiency and cost improvements to all of these technologies and making a gradual transition over to using them. If the oil situation gets serious, we'll accelerate our conversion.
  • by jamie ( 78724 ) <jamie@slashdot.org> on Monday October 03, 2005 @10:32AM (#13704453) Journal
    Ray Kurzweil is dead wrong. I respect his work but his impossibly optimistic projections are misleading. Here's one numerical example. Kurzweil has claimed [kurzweilai.net] "human life expectancy" was increasing by "150 days, every year," and that shortly, increases in life expectancy would be beating Nature in the footrace:

    with the revolutions coming in genomics, proteomics, therapeutic cloning, rational drug design, and the other biotechnology revolutions, within 10 years we'll be adding more than a year, every year, to human life expectancy. So, if you can hang in there for another 10 years, (don't spend all of your time in the French Quarter!), this will be the increase in human life expectancy. We'll get ahead of the power curve and be adding more than a year every year, within a decade.

    The accompanying graph is staggering but only shows five points of data. Its top point shows a life expectancy of 77 years in 1999 or so, which of course is not human life expectancy. Human life expectancy is about 65, ranging from about 43 in poor countries to 79 in the richest country. Kurzweil's statement only applies to the wealthy; in much of Africa, life expectancy fell dramatically during the 1990s.

    And since he's clearly talking about life extension, the reader should be aware that there is no exponential curve at the top of the lifespan. His gains come mostly from improvements in child nutrition and antibiotics, and there aren't any continued improvements to be made there (quite the opposite, actually). If we look at the average remaining life expectancy for Americans aged 75, between 1980 and 1985 they gained 0.2 years; 1985-1990, 0.3 years; 1990-1995, 0.1 years; 1995-2000, 0.4 years; 1997-2002, 0.3 years. This is good, but it's not exponential lengthening of lifespan (a quick arithmetic sketch follows at the end of this comment).

    Oh, and the "decade" within which he promised we'd be ahead of the curve is now half over. The above quote is from 2000.

    The main logical error Kurzweil makes is simply that he thinks computers will get smarter because they get faster. Readers who believe the one has anything to do with the other need to go back to Dreyfus' 1972 classic What Computers Can't Do. From there, start reading over the painful history of what is now called "strong A.I.", and what used to be just called "A.I.", to see how necessarily limited our efforts have become. Kurzweil elides this distinction in the worst way. He starts by saying that computers are now as smart as an insect -- which is irrefutable because nobody can quantify what that means -- and proceeds to predict that they will be as smart as people once they get n times faster. No, I'm sorry, all that means is that they will be as smart as n insects. Whatever the hell that means.

    Mostly I wouldn't care. Fantasy is fun. Except that Pollyannaish predictions of paradise-yet-to-come persuade people that the problems we create for ourselves are irrelevant. If you think the Rapture or the Singularity is going to make all currently conceivable problems laughable, little things like massive extinction and global warming turn into somebody else's problem. They're not -- and our grandchildren, with their very fast and non-sentient computers, and their non-300-year lifespans, are going to be kind of ticked that you and I spoiled the planet.
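
    A quick arithmetic sketch of those figures (the last two periods overlap, so this is only a rough per-year average, not a careful fit):

        # Rough average of the life-expectancy gains quoted above for Americans
        # aged 75.  Each period is treated as ~5 calendar years and the overlap
        # between the last two periods is ignored, so this is only a ballpark.
        gains = [0.2, 0.3, 0.1, 0.4, 0.3]        # years gained per ~5-year period
        per_calendar_year = sum(gains) / (5 * len(gains))
        print(f"~{per_calendar_year:.2f} years of life expectancy gained per year")
        # ~0.05 years/year: a slow, roughly linear crawl, nowhere near the promised
        # "more than a year, every year", and nothing here looks exponential.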

    • Two other flaws in Kurzweil's claims:
      First, his view of the socio-technical aspects of technological innovation is entirely one-way. I've not seen him address the problem that such advances in power are only sustainable as long as there is a market for them. In the case of many other technologies, such as the internal combustion engine, cooling systems, and aviation, advances in power and capacity tapered off due to a lack of strong market demand.

      The second flaw, hinted at by the first: he seems to play fas
  • Needs (Score:5, Insightful)

    by cowscows ( 103644 ) on Monday October 03, 2005 @10:33AM (#13704463) Journal
    The whole premise is actually kind of simple, I think. There are three basic components to everything that we use in our lives: raw materials, energy, and design. Stuff needs to be thought up (design), it requires ingredients to build (raw materials), and it takes energy to make/use/operate it. Some things, like digital media, have negligible raw material requirements, but they still fit the mold.

      So if we can make computers that can actually think well enough to do the design, then getting design done faster just requires better computers. I think it's safe to assume that computers will continue to increase in power. Whether or not they'll become "intelligent" is harder to predict, but let's say for the sake of the singularity that they do.

    We also need plentiful energy. If this whole fusion power thing ever pans out, we'll have that.

    Raw materials are a little harder. Making things just out of dirt is a bit simplistic, because there are lots of different minerals and such present in dirt, and they're not all suitable for every purpose. There's lots of stuff available in the earth, but extracting it, even if it becomes easy, will most likely be rather destructive. The solution is to make spaceflight reliable enough that we can mine other places, asteroids and the like.

    Although that seems to me to be a short term solution, because most things in space are pretty far away. Unless there's some sort of major star trek-ish breakthrough in propulsion, it's never going to be all that simple.

    I guess the point is, design and energy are almost like a switch. Either we'll have a couple big breakthroughs that'll bust those two wide open, or we won't. But even if we got cheap brains and cheap energy, the raw materials issue seems like it'd be a harder problem. If you're looking for a long term investment, land would probably be a good one, because it's the hardest thing for us to make more of.
  • Not "Utopia" (Score:3, Informative)

    by magarity ( 164372 ) on Monday October 03, 2005 @10:38AM (#13704525)
    an extraordinarily bright future in which technological progress has leapt by such exponentially large bounds that it will be... well, for lack of a better word: 'utopian'.
     
    More's Utopia was a vision of a place where Marxist Socialism actually worked. It had nothing to do with technological progress.
  • Living to 300 ... (Score:3, Interesting)

    by ta ma de ( 851887 ) <chris.erik.barnes@ g m ail.com> on Monday October 03, 2005 @10:41AM (#13704573)
    Why stop there, fuck 300. How about we don't have to die? Why wouldn't the same chemical modifications that would allow for a 300-year lifespan continue to work forever?
  • by Bob Hearn ( 61879 ) on Monday October 03, 2005 @10:50AM (#13704665) Homepage
    As described, this sounds just like the singularity Vinge always writes about. I hope he gets credit. I do think there's some sort of singularity coming, but I'm less sure than Kurzweil that we can predict much of what will be on *this* side of it, let alone on the other side.

    BTW, for those who (like me) had always pronounced "Vinge" to rhyme with "hinge", according to Vinge himself it rhymes with "dingy".
  • by ausoleil ( 322752 ) on Monday October 03, 2005 @10:58AM (#13704751) Homepage
    One only has to go back through ancient issues of Popular Science or Life magazine to read the promises of Utopia through technology. Flying cars, personal atomic power plants, smart homes, etc., were the rule of the day back then, and they all held the fleetingly brilliant promise of bringing a new "wealth" of leisure.

    It didn't happen.

    Fast forward to the 1970s, at the advent of the personal computer revolution, and read magazines like "Byte" or similar. The coming of age of the PC was to free us from mundane tasks, make work easier, and give us more leisure time because things were simpler.

    That did not happen either, even if Byte and others were correct in saying that the computer revolution was here to stay.

    There is a truism in regards to technology: when something is made easier to do, more of it is expected to be done.

    Or, if you prefer, back to the PC analogy: PCs have made things like spreadsheets, memos, etc. far easier for the average office worker, but instead of workers being rewarded with more leisure time, more spreadsheets and memos are expected. In other words, instead of making life easier, more work has been created, and now we are more or less enslaved to the technology that it is done on.

    History is rife with examples of this: cellphones, for instance. Now you cannot get away, and work goes with you everywhere, all too often 24/7. Enslaved to never-ending communication: instead of better, we got more.

    George Santayana said those who ignore history are condemned to repeat it. True. And history here will repeat itself: technology will make things easier, and when things are easier, it will be expected that more will be done.

    And, as anyone who has sat on a beach with only a cool drink and the waves to contemplate knows, more work, no matter how "easy", is not Utopia.

    • There is a truism in regards to technology: when something is made easier to do, more of it is expected to be done.

      And what happens when it is affordable to just manufacture robots and AIs to do the work? And manufacture robots and AIs to manufacture and design robots and AI? We could get to a point where it is vastly more efficient to manufacture "workers" than to train humans to do the work.

      I don't know what happens then. But it certainly isn't just "more of the same". An observer could not have pre

    • by Bastian ( 66383 ) on Monday October 03, 2005 @11:39AM (#13705180)
      Enslaved is a bit harsh of a term.

      We aren't enslaved by our technology or our employers. We're enslaved by our own shallow, greedy, workaholic culture.

      Our employers call us at home and have us bring our work home on company-provided laptops because we, as a society, let them do it.

      Nay, we ask for it. Our obsessive need to have everything we buy cost less is what pushes companies to force us to do things like working unpaid overtime.

      We're enslaving ourselves for valuing TVs that we don't have the time to watch and luxury cars that we will love for a week and then spend the rest of our lives associating with the two hours' worth of heavy traffic that we use them to experience every day. You're not a victim of the march of technology, you're not even a victim of your boss (remember, you agreed to take the job). You're just a victim of rampant materialism.

      Think I'm just being some sort of hippie idealist? Well, chew on this: lately studies have been consistently showing that, once you get past the poverty line, personal satisfaction and happiness are negatively correlated with income.
    • On the other hand, more work in less time equals greater productivity. When everyone gets more done, there is more to go around. If a road worker can lay road in half the time, and is expected to do twice as much, we all benefit by receiving more roads for a lower price. This frees up more money in our budgets to spend on vacations to tropical island beaches. In all seriousness, look at the fraction of the population which can reasonably afford a tropical island vacation at least once in their lifetimes
  • by Eliezer Yudkowsky ( 124932 ) on Monday October 03, 2005 @11:20AM (#13704984) Homepage
    If you people would RTFB, you'd discover that the Singularity has a history of intellectual discussion going back around two decades. The treatments in science fiction are a part of that, but just reading the SF isn't going to get you much (any more than reading SF will teach you physics, or math, though it might serve to get you interested).

    http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html [caltech.edu]
    http://singinst.org/what-singularity.html [singinst.org]
    http://www.accelerationwatch.com/ [accelerationwatch.com]

    And let's not forget:

    http://justfuckinggoogleit.com/search.pl?query=Singularity [justfuckinggoogleit.com]
    http://en.wikipedia.org/wiki/Technological_singularity [wikipedia.org]

    The first person to use the term "Singularity" as applied to futurism was John von Neumann, and he used it to mean a disruptive change in the future brought about by a high level of technology.

    The first person to postulate that recursive self-improvement in Artificial Intelligence would rapidly produce "ultraintelligent machines" was the Bayesian statistician I. J. Good. Today this is known as the "hard takeoff" scenario.

    The first person to popularize the term "Singularity", referring to the breakdown in our model of the future which occurs subsequent to the (technological) creation of smarter-than-human intelligence, was the mathematician (and sometime SF author, and inventor of cyberspace) Vernor Vinge.

    Kurzweil's "Singularity" belongs to the accelerating change crowd that includes John Smart. Their thesis is, first, that history shows a trend for major transitions to happen in shorter and shorter times, and second, that you can graph this on log charts, get reasonably straight lines, and extend the lines to produce useful quantitative predictions. I agree with the qualitative thesis but not the quantitative thesis.

    In my opinion, Kurzweil could greatly strengthen many of his arguments by giving up on the attempt to predict when these things will occur, and just saying: "They will happen eventually." I think that it is just as important, and a great deal more probable, to say: "Eventually we will be able to create Artificial Intelligence surpassing human intelligence, and then XYZ will happen, so we better get ABC done first," than to say: "And this will all happen on October 15th, 2022, between 7 and 7:30 in the morning."

    Since I don't care particularly about when someone builds a smarter-than-human intelligence, just what happens after that, and what we need to get done beforehand; and since I don't think that this necessarily needs to make life incomprehensible, so long as we do things right; I belong to the I.J. Good "hard takeoff" crowd. With a strong helping of Vernor Vinge, because I think there's a difference in kind associated with a future that contains minds smarter than human, which we do not get just from talking about flying cars, or space travel, or even nanotechnology.

    On Slashdot, someone says "intelligence" and you think of all the computer CEOs with IQs of 120 and the starving professors with IQs of 160, and you think that means intelligence isn't important. But you will not find many excellent CEOs, nor professors, nor soldiers, nor artists, nor musicians, nor rationalists, nor scientists, who are chimpanzees. Intelligence is the foundation of human power, the strength that fuels our other arts. Respect it. When someone talks about enhancing human intelligence or building smarter-than-human AI, pay attention. That is what matters to the future, not political yammering, not our little nation-tribes. In 200 million years nobody's going to give a damn who flew the first flying car or
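
    A toy version of the log-chart extrapolation described above, for readers who haven't seen it. The milestone dates below are hypothetical placeholders, not Kurzweil's or Smart's actual data, and the straight-line behaviour is assumed rather than demonstrated:

        # Toy "accelerating change" fit: intervals between major transitions are
        # assumed to shrink geometrically, so log(interval) vs. index is a line.
        # The milestone years are hypothetical placeholders.
        import numpy as np

        milestones = np.array([1800, 1900, 1950, 1975, 1990, 2000])
        gaps = np.diff(milestones)                    # years between transitions
        slope, intercept = np.polyfit(np.arange(len(gaps)), np.log(gaps), 1)
        next_gap = np.exp(intercept + slope * len(gaps))
        print(f"extrapolated wait for the next transition: ~{next_gap:.0f} years")
        # The quantitative thesis is that such lines stay straight far into the
        # future; the objection is that nothing forces them to.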
  • by Dr. Zowie ( 109983 ) <slashdot.deforest@org> on Monday October 03, 2005 @11:48AM (#13705262)
    It will take at least 250 years before lifespans of 300 years are commonplace. By that time, it will seem, well, commonplace to live a long time -- it will be no big deal. Considering what life was like 250 years ago, and 250 years before that, it seems that we've already passed the singularity.

    The people of 1505 might have been rather impressed by societal change through 1755 (development of stock companies, the scientific method, the Reformation) -- but the people of 1755 would be absolutely floored by the world of 2005.

  • by smackdotcom ( 136408 ) on Monday October 03, 2005 @12:32PM (#13705698)
    So here we go with another round of "The future's going to rock"/"The future's going to suck" debate. The utopian idealists versus the eco-depressive fetishists. If there's one thing I'm sure of, it's that if we do someday go through a Singularity-type event, someone somewhere will be whining about how the benefits of it aren't distributed with perfect equality.

    Let me speak as someone who actually has read the book, which I would assume sets me somewhat apart from most of the 'reviews' in this thread. Kurzweil's good and well worth reading if you want any idea at all as to where things will probably eventually go. I say probably because, of course, there are no guarantees (we could all get smacked by a massive comet tomorrow--this is not a forgiving universe). And I say eventually, because like so many others, I think Kurzweil's timeline is a bit optimistic. But when I say a bit optimistic, I mean by perhaps a decade or two, not centuries or millennia (Kurzweil addresses this all in depth in the book, and many of the comments on this thread make the very mistake he's trying to educate people out of--thinking in terms of linear progression when we're actually seeing exponential growth across a massive number of fronts). I think Kurzweil is being optimistic on a personal level due to his own age--the man's in his fifties, and no doubt worries about the odds of personally surviving to see the radical shift that he is prognosticating and anticipating.

    What intrigues me most is the prospect for human enhancement. I consider this to be the most desirable, and perhaps even most inevitable, course towards the Singularity. We already have implants to allow deaf people to hear by tying directly into the auditory nerve (cochlear implants). We will follow that eventually with similar implants for vision, and eventually for other aspects of the brain itself. What will start as a humane effort to return normal function to those deprived of it will eventually permit us to merge with powerful computer systems, and gain the advantages that will come with that (imagine that your very imagination is augmented to include a high-powered CAD system, along with perfect memory recall, should you wish to use it). If we're smart, we can work to hone the best aspects of our humanity (our imagination, our sense of wonder, our empathy) while minimizing the worst of our nature (the primitive bloodlust that we carry as a result of our mammalian nature). Yes, yes, it could all go very wrong, but to those who point fingers towards nuclear weapons as evidence of our incorrigibly beastly nature, I'd point out that they have been used only twice, and since the horror of their consequences has sunk in, they have not been used in anger since. Most people are good, decent folk. The eco-depressives strive to convince you otherwise, though the lack of mass suicide among the green folks is perhaps the best evidence that even they don't believe things are as utterly hopeless as they say. Yes, we have problems. No, they are not insurmountable, even with the technology that we have today, to say nothing of the technology we will have tomorrow.

    Enhancement of human intelligence also allows us to avoid most of the whole "Is Strong A.I. possible?" debate. By working to increase both the scope and scale of human intelligence, we're already working with a source of 'I', and are layering in the 'A', seeing what works and what doesn't. An evolutionary approach, if you will. Ultimately, I don't really know if it will be possible to transfer my thought processes from biological neurons to nanocircuitry, but besides the notion of a 'soul', I really don't see why it couldn't happen. As thinkers on the subject have pointed out, you lose brain cells all the time (even if you don't consume as much beer as the average engineering student), and yet you retain a sense of continuity with your past self. If you were to imagine a process that replaced your existing brain cells one at a time with artificial neurons that were functionally identical to the cells
  • by duckpoopy ( 585203 ) on Monday October 03, 2005 @02:02PM (#13706484) Journal
    So far all that humans have shown any proclivity for is eating, crapping and sleeping. Anything more complicated than this just turns into a total clusterfark.
  • Maximum longevity (Score:4, Informative)

    by bradbury ( 33372 ) <<moc.liamg> <ta> <yrubdarB.treboR>> on Monday October 03, 2005 @02:11PM (#13706579) Homepage
    If one were to completely solve aging, "average" longevity would be 2000-3000 years (limited by one's hazard function -- a quick sketch follows below). If one adds nanotechnology-based "enhancements" to that, one is probably pushing 10,000 years. Taking uploading into account, one's lifespan could be trillions of years. Ray pushes the envelope, but he may have some problems with where the actual limits are.
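
    The hazard-function point in one line of arithmetic: with aging solved, if a constant residual hazard h per year remains (accidents and the like), remaining lifespan is roughly exponentially distributed with mean 1/h. The hazard value below is a hypothetical round number chosen for illustration, not a measured accident rate.

        # Constant-hazard sketch: mean remaining lifespan is 1/h when only a
        # constant yearly risk h remains.  The value of h is a hypothetical
        # round number, not a measured rate.
        hazard_per_year = 0.0004                  # ~1-in-2500 chance of death per year
        mean_lifespan = 1 / hazard_per_year
        print(f"mean lifespan under constant hazard: {mean_lifespan:.0f} years")  # 2500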

"Mach was the greatest intellectual fraud in the last ten years." "What about X?" "I said `intellectual'." ;login, 9/1990

Working...