
Bill Joy On His Own Future, And The World's

geeber writes "There is an interesting interview with Bill Joy in the current edition of the Magazine in the New York Times. He is still obsessed with what he calls a 'civilization-changing event' brought on by the fast pace of research into dangerous technologies such as genetic engineering and nanotechnology. Another interesting tidbit: he has flirted with the idea of going to work for Google."

  • by The-Bus ( 138060 ) on Sunday June 06, 2004 @01:35PM (#9351321)
    No boogedy-boogedy NYT registration required here [nytimes.com].
  • by SeaDour ( 704727 ) on Sunday June 06, 2004 @01:35PM (#9351323) Homepage

    "Another interesting tidbit : he has flirted with the idea of going to work for Google."

    Really now, who these days hasn't thought about that? :D

  • Bill Joy??!!? (Score:3, Informative)

    by Richard_L_James ( 714854 ) on Sunday June 06, 2004 @01:37PM (#9351333)
    If, like me, you were wondering: as always, it appears Wikipedia has the answer [wikipedia.org]... or so I thought!!!

    Sorry! The wiki is experiencing some technical difficulties, and cannot contact the database server

    Oh well, never mind; instead click here for a Google cache of Bill's page on Wikipedia [google.co.uk].

    • Re:Bill Joy??!!? (Score:2, Informative)

      by tdvaughan ( 582870 )
      The wiki admins on #wikipedia say it could be a day or more before the database is back up. Something to do with a forced killing of the mysql process.
    • by Anonymous Coward
      Wikipedia is working again. BTW, does anyone know how to donate equipment to them? They are doing a better job than almost all of the open source projects; they deserve it and they definitely need it.
  • by gmuslera ( 3436 )
    The invention of the knife was very dangerous too: a lot of people are killed by knives and similar weapons. And a lot are saved by them too (scalpels et al.). And for sure our lives would be entirely different if we had to eat without cutting accessories. You can't condemn entire tools or technologies because they could have some bad uses.
    • Knives don't have the potential to self-replicate, and are therefore made only as fast as human industry dictates. When the weapon itself becomes its own industry, you concentrate the power in the people who control said machines.
      • Genetic engineering is not the first self-replicating "technology" we have used. Vaccines are (or were) somewhat attenuated diseases used to activate our own body's defense system. We put certain animals, insects, etc. into environments to make changes, i.e. pest control (sometimes with very bad results [biotechnology.gov.au]). And we have bred cattle, pets, etc. to enhance certain characteristics since a long time ago.

        But I fear random mutations/genetic changes more than engineered ones. In 1918 the Spanish flu killed 20 million (probably more t

    • by Richard_L_James ( 714854 ) on Sunday June 06, 2004 @01:47PM (#9351389)
      It could be argued that all inventions can be put to good uses and bad uses, e.g. nuclear power, cars, etc. A frying pan was a useful invention, and yet they can be very dangerous during a domestic... ouch... erm gotta... ouch!...go... OOOOUCH!
    • It's amazing to read an article about someone like Bill Joy, a truly creative thinker and someone who accomplished a lot, and then come to Slashdot and read the most simplistic rebuttal that you'll likely read anywhere, and then see that it has been modded up.

      Now I understand why people just blog these days. You get away from this type of mediocrity.
      • by relativePositioning ( 661852 ) on Sunday June 06, 2004 @02:51PM (#9351725)
        The man is suggesting an end to the free flow of information that science is built upon. He talked about scientific "guilds" that would hold the sacred flame and hide it from everyone else in an effort to preserve the human race.

        I'm sorry, but that sounds like the end of at least interdisciplinary science, if not science itself. I think the rebuttal that you labeled "simplistic" is pretty accurate. Just because the results of science can be used for destructive aims is not a reason to return to the ages of hidden knowledge.
      • come to Slashdot and read the most simplistic rebuttal that you'll likely read anywhere

        "anywhere" includes slashdot, so i wrote it because noone wrote that already :)

        And yes, it is simplistic, but so is condemning technologies because they could have (ok, in this case very) bad uses, and closing the door on any kind of good use, including avoiding or mitigating disasters even bigger than the worst they could possibly cause. If we go to the worst case scenario, when all the bad things will happen

      • by Jonathan ( 5011 ) on Sunday June 06, 2004 @03:41PM (#9351987) Homepage
        It's amazing to read an article about someone like Bill Joy, a truly creative thinker and someone who accomplished a lot

        He accomplished a lot in *programming*, nothing else. See, that's the problem. Once someone gets famous for doing X, they think they can speak authoritatively on all subjects. But they can't -- they can just babble, just as Einstein did about socialism and pacifism, and Bill Joy is doing about science. While we can all hold opinions on everything, and even babble about them on Usenet and Slashdot (or indeed on blogs, the most self-indulgent waste of time possible), it would be considerably more productive if people limited their interactions with journalists to the subjects they have actually been educated in.

        • Don't you think the fault lies primarily with the journalist, rather than the naively outspoken commentator? Or perhaps with the readers, for taking either of them seriously.
        • they can just babble, just as Einstein did about socialism and pacifism, and Bill Joy is doing about science.

          What, then, does someone have to do to gain your permission to talk about socialism or pacifism? Where does this wonderful "education" come from that allows you to be infallible in subjects like socialism and pacifism? Why should journalists only talk to these infallible experts?

          I'd rather hear Einstein's (and to a lesser degree Joy's) "babblings" than J. Random Blowhard on Slashdot.

          • What, then, does someone have to do to gain your permission to talk about socialism or pacifism?

            For starters: in the case of socialism, a degree in economics; in the case of pacifism, a degree in international relations. Of course, given this, I'd still take the opinions of a student fresh out of school with a grain of salt -- experienced economists and diplomats would be more convincing.

            I'd rather hear Einstein's (and to a lesser degree Joy's) "babblings" than J. Random Blowhard on Slashdot.

            And som
        • by fastdecade ( 179638 ) on Sunday June 06, 2004 @07:56PM (#9353244)
          Once someone gets famous for doing X, they think they can speak authoritatively on all subjects.

          This problem exists, but it does not apply in this case.

          See, I'd agree if the interview was with Britney or Tiger - their opinion on the future counts for nothing. But you're talking about Bill Joy. When a deservedly prominent computer scientist - or, for that matter, biologist, economist, etc. - talks about the future, I'll listen.
        • ... they can just babble, just as Einstein did about [...] pacifism

          How did he babble? Remember that Einstein grew up between the world wars. An American WWI veteran said: "The Germans didn't win that war, but neither did we. Only the war won that war." It was a house of cards that fell over: countries declared war because of their treaties and rarely because their own direct interests were at stake. And even the interests that were at stake were more those of the elite than of the people. In the end, the
      • by Anonymous Coward
        "Now I understand why people just blog these days. You get away from this type of mediocrity."

        Snigger. Yeah. Nothing will save the world quite like furiously pounding out endless blog entries.
      • Has /. become stale? I hate to sound like one of those people who says "this place used to be cool, until it hit 1e6 members", because one could draw that line at 1e5, 1e4, or 1e3 just as easily. I'm a great believer in the notion that quality, not quantity, is what matters. The moderation system and the meta-mod system are supposed to deal with that, and perhaps those methods achieve a good result up to a point (1e6?).

        But I have to agree, I often find that the tangents that get discussed are often irrelevan
    • by Ikn ( 712788 ) *
      I completely agree; the reason I've also argued pro-cloning research, pro-nanotech research, etc., is to compare it to the invention of flight. Yeah, bad things have come from it, but compare that to what it's done for mankind as a whole... could we ever see its invention as a bad thing?
    • by October_30th ( 531777 ) on Sunday June 06, 2004 @02:12PM (#9351507) Homepage Journal
      A counter-argument could be made that not all dangerous things are created equal.

      Yes, a knife can be useful but also dangerous.

      Explosives can be useful but very dangerous too. In the wrong hands they're definitely more dangerous than knives.

      Nuclear power can be useful, but in general it's more dangerous (in bomb form) than knives or explosives. It is, in fact, the first technology with which the human race could have committed suicide.

      To me it seems that, to Joy, genetic engineering and nanotechnology are one more order of magnitude more dangerous than atomic power or any other existing human technology. Why? Because of the potential for self-replication. Atomic bombs certainly kill lots of people, but they cannot self-replicate and run out of our control.

      In the end it boils down to risk = probability * consequences (a toy sketch follows below). Even if the probability of us becoming victims of all-conquering grey nanogoo is vanishingly small, are the consequences so disastrous that the risk is eventually too high for us to even experiment with the idea?

      Incidentally, the developers of the hydrogen bomb had to wrestle with the same equation: what if we lit up a hydrogen bomb in our atmosphere and, against all our calculations and predictions, nitrogen-nitrogen fusion began and our entire atmosphere was consumed in one huge fusion burn?
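
      That risk calculus is easy to make concrete. A toy sketch (every number below is made up purely for illustration, and Python is an arbitrary choice -- the thread names no language):

      # risk = probability * consequences, per the equation above.
      # All figures are hypothetical.
      def risk(probability, consequences):
          return probability * consequences

      # A mundane lab accident: fairly likely, but bounded damage.
      print(risk(1e-2, 1e3))     # 10.0

      # A grey-goo scenario: vanishingly unlikely, but the
      # consequence term (losing everything) dominates anyway.
      print(risk(1e-12, 1e15))   # 1000.0

      The toy numbers only make the shape of the problem visible: once the consequence term is effectively unbounded, no "vanishingly small" probability drives the product down to something negligible.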

    • Well, Bill Joy isn't as simplistic as that. If you hear one of his speeches, he makes a very good case for his arguments.

      With a knife you can kill a few people. Fine. With a nuclear bomb you can kill many, but YOU CAN CONTROL it! (not everyone has the access or the capability to build a bomb) With every nut-case in the world being able to engineer viruses on their personal computers, you have the capability to wipe out the world... but this time, there is no control.
    • Joy seems to emphasize a critical difference in scale in the way science is pursued.

      Back when cavemen still said "ooga booga", maybe somebody figured out how to sharpen obsidian into a knife, and the other neanderthals spread the love.

      Thousands of years later we had guys like L. Da Vinci and then B. Franklin, renaissance men who dabbled in science for the joy of their own genius.

      Now science is industrial (and so is science education, IMO). Much of it is driven by the search for profit (biotech) or

  • by Mr. Flibble ( 12943 ) on Sunday June 06, 2004 @01:39PM (#9351341) Homepage
    In James Watson's recent book "DNA: The Secret of Life" he touches on this problem. He argues that a nano-disaster is unlikely. His discussion is too lengthy to summarize here (and I don't have the book in my hands right now), but it is a convincing counterpoint to this possibility.

    Also, one forgets that cells have been evolving against this possibility for billions of years. If a "Gray Ooze" were possible, it would very likely have appeared on its own. As it is, cells and multi-cellular organisms have extremely sophisticated means of defense. While it will be possible to create a disease that kills millions or billions of humans, I worry far more about nuclear war.
    • While it will be possible to create a disease that kills millions or billions of humans...

      This possibility has been dealt with at length in the novel "Kalki" by Gore Vidal.

      • While it will be possible to create a disease that kills millions or billions of humans...

        This possibility has been dealt with at length in the novel "Kalki" by Gore Vidal.

        Are you seriously citing fiction by Gore Vidal as a reference on the subject?

    • Diseases do exist which can kill millions of people, haven't you noticed?

      AIDS, bubonic plague, and I'm sure dozens of others I don't know about either have killed or are currently killing millions of people. Barring medical breakthroughs, AIDS will kill every one of the 40 million people currently infected with it. The bubonic plague wiped out a third of Europe; today, with increased travel, it could be a third of the world.

      Ewan
      • Barring medical breakthroughs, AIDS will kill every one of the 40 million people currently infected with it.

        Minor technical nit: One cannot be infected with AIDS. One is infected with HIV. AIDS is a syndrome generally associated with HIV infection, but HIV infection is not a surefire predictor of AIDS.

    • by ThisIsFred ( 705426 ) on Sunday June 06, 2004 @02:36PM (#9351634) Journal
      Also, one forgets that cells have been evolving against this possibility for billions of years. If a "Gray Ooze" were possible it would very likely have appeared on its own.
      I agree that the possibility of accidentally creating something dangerous is probably low (e.g. a genetically engineered mushroom that suddenly mutates into a human-killing fungus). However, I don't think the evolutionary argument has sway in all possible examples, because the danger involves creating something with specific functions not guided by evolution, an entirely custom-built microorganism for example.


      There are two ways where I can see that genetic engineering is potentially dangerous. The first is the chance that a genetically engineered microorganism that isn't dangerous to life on earth produces a byproduct that we didn't expect, which is dangerous to life. I highly doubt this would turn into a catastrophe, since it's likely to be caught in the lab early on.

      The other possible danger is that some lab is contracted to produce an intentionally harmful microorganism or virus. Just because it hasn't evolved yet doesn't mean it isn't possible to piece together something incredibly dangerous and nearly impossible to counter. Evolution doesn't appear to cover all possible avenues; it only appears to cover those possible in the amount of time allowed before there is a major change to the environment. That said, the geological record appears to show lots of "false starts" that were cut short by earth-wide catastrophes. IANAB (I am not a biologist), so perhaps I'm missing the big picture.

      Anyway, my point is this: we're far from the utopian promise of the future, and it will remain so because there is no single idea of perfection. War exists because one people wants to force its political and social ideals on another people, even if there is no direct benefit in doing so. Against that backdrop, we've got biological weapons that look like the perfect WMD in a lab (chemical weapons and nukes don't reproduce when deployed, so they're less efficient) but turn out to be duds (luckily) when deployed. Imagine a virus which is airborne like the flu, destroys the immune system like HIV, can be spread by contact like a rhinovirus, but can be manufactured and stored almost indefinitely, unlike bacterial biological weapons. Assuming those traits aren't mutually exclusive, some agency, at some point, is likely to fund the research.
    • Devil's advocate (Score:3, Interesting)

      by mcc ( 14761 )
      1. Cells have been evolving against this possibility, however, the possibility has been evolving by the same mechanism. Lifeform immune systems are constrained in their ability to adapt by the evolutionary process. But so are viruses, so this isn't much of a problem. HOWEVER, nanotech works outside the evolutionary process. A nanotech virus developed in a lab could rise to a form such that no lifeform immune system has ever seen anything like it in a countable number of years, and from the perspective of "the
    • Nuclear reactions happen all the time in our universe, fueling our sun, and making the earth's core a molten blob of iron. So nuclear reactions are obviously nothing to worry about.

      Yet the Bomb WAS a human-created civilization-changing event that has nearly done us in on a few occasions, and may still do so.

      As to Watson's assertion: chemical and biological weapons *do* exist. So why hasn't some predator evolved mustard gas jets to kill us off and take our food? Because evolution doesn't work all

  • by Anonymous Coward on Sunday June 06, 2004 @01:42PM (#9351360)
    Unfortunately, Google did not share my fantasy.
  • by hot_Karls_bad_cavern ( 759797 ) on Sunday June 06, 2004 @01:50PM (#9351400) Journal
    We've managed to survive the splitting of the atom in the last century, but we have bred some very, very, very dangerous weapons while at the same time developing some very, very important technologies. It's a wonder we've managed that so well (so far).

    i understand his concern over these new branches of study and it is of *dire* importance that we tread lightly and remember our lessons in the areas of genetic modification and nanotechnology, yet all the while moving forward. i'm no luddite, but i am always wary and respectful of the power of the human mind.
    • There's a difference (Score:4, Interesting)

      by Anonymous Coward on Sunday June 06, 2004 @02:24PM (#9351562)
      For me, the key difference is this: new technologies are giving individuals increasing destructive power over more and more people, and it may be that the equation we are all used to, about how tech can be used for good and bad, is changing.

      The knife enables you to kill a person at a time.
      A gun several.
      Bombs - hundreds
      Nukes are controlled by states, not individuals - but one fear behind the current war on terror is this will change.
      Nano weapons...?

      Weapons with gigantic destructive power might be very easy to synthesize in only 20 or 30 years - so imagine this: how do you run a world where every individual has the power to wipe out everyone else? There is no way around it - this is not like the right to bear arms - you simply have to ban the technology and pretty much wipe out everyone who seeks to acquire it, like an immune system killing viruses, while finding some way to lace the environment with 'antigens' of some kind that can automatically 'contain' any 'outbreaks'.

      There has to be a point at which a hugely destructive technology becomes so cheap and widely available that it cannot be allowed to proliferate, no matter that it might have beneficial uses.
      • You guys, mod that up, i'm very, very interested to see what some of our psychologists and futurists in the crowd have to say about that. Living in a world where everyone has the power to destroy everyone...the implications are blowing my mind away right now. Mental illnesses would HAVE to be solved and understood. Anger management would become one of the most important human attributes overnight. Wow. My mind really is reeling thinking about this. Yow.

        Let's have some thoughts folks!
        • Re:MOD PARENT UP. (Score:3, Interesting)

          by sydb ( 176695 ) *
          Let's have some thoughts folks!

          I thought "You seem very optimistic! This is slashdot, for crying out loud." Then I realised how negative I was being.

          I'm no psychologist, but a futurist is anyone with an opinion about tomorrow, so here goes.

          In a world where everyone has the power to destroy everyone else, we're already dead. There is no time to solve and understand mental illness. It only takes a handful of real loonies with access to total destruction weapons before we're all totally destroyed.

          So in m
          • Re:MOD PARENT UP. (Score:3, Insightful)

            by Hard_Code ( 49548 )
            The answer is to develop social feedback loops and an environment in which people generally /do not want to blow each other up/.

            Of course that sort of long term solution requires much more persistence, humility, dedication and sacrifice than packing lots of explosives into a bomb and just dropping it on people you don't like.

            I think we are starting to see this, even /without/ massive nano or biological weapon proliferation.

            If you have enough "AK-47 proliferation" it doesn't matter how many bombs you drop
            • Re:MOD PARENT UP. (Score:3, Insightful)

              by bit01 ( 644603 )

              generally

              Key word. The problem is that, given that everyone can blow everyone else up, in a world of 6,000,000,000 people all it takes is 0.00000001% deviants and we're doomed (see the quick arithmetic below). No social system can be so perfect that every one of that many people will be well adjusted.

              Look at the present day: the number of terrorists in the world is statistically insignificant, but there are still enough to cause all sorts of grief.

              That's not to say we shouldn't do everything we can to create a better world. It's just
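
              A quick back-of-the-envelope check on that figure (taking the quoted percentage at face value; Python used only for illustration):

              # 0.00000001% of 6,000,000,000 people, taken literally.
              population = 6_000_000_000
              rate_percent = 0.00000001
              print(population * rate_percent / 100)   # 0.6 -- under one person

              Read literally, the quoted rate works out to less than one person; even a rate a hundred times higher, 0.000001%, is only 60 people out of six billion. Either way the point stands: a fraction far too small to show up in any statistic is still enough.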

          • Re:MOD PARENT UP. (Score:3, Insightful)

            by kryptkpr ( 180196 )
            It only takes a handful of real loonies with access to total destruction weapons before we're all totally destroyed.

            As a species, our technical intelligence far exceeds our common sense and mental stability. Evolutionary dead-end.


            What exactly do you mean by "technical intelligence" of our species? Do you mean the combined achievements of the human race? We've created the atom bomb, but 99.999% of people have no idea how it works and likely never will ..

            As far as common sense goes, the scenario is the
            • Re:MOD PARENT UP. (Score:3, Interesting)

              by sydb ( 176695 ) *
              What exactly do you mean by "technical intelligence" of our species? Do you mean the combined achievements of the human race? We've created the atom bomb, but 99.999% of people have no idea how it works and likely never will ..

              Yes that's pretty much what I mean; I used the word species because I was referring to the species, not the individuals.

              As far as common sense goes, the scenario is the exact opposite. The individual person has lots of common sense, but humans as a race have (almost) none.

              Again
              • You start to disagree with me, then you say exactly what I said, but with a slightly different choice of words. Why are you arguing with me when we agree so much?

                When I hear "real loonies", it conjures up an image of drugged-out vegetables confined to white padded rooms in straitjackets. A "real loony" is someone who was born with most (if not all) of his/her screws loose.

                Fundamentalists are a much scarier breed. They live amongst us, and we don't know who they are. They have only 1 or 2 loose screws
            • "Why? So we can then proceed to destroy other worlds in the same way as we have ours?"
              It's that type of not caring about the fate of our species sentiment that causes all that destruction. The most important thing in our precarious stage is to get some self-sustaining colonies as far away from earth as possible.
              • It's that type of not caring about the fate of our species sentiment that causes all that destruction. The most important thing in our precarious stage is to get some self-sustaining colonies as far away from earth as possible.

                To a certain extent, I agree with you. But I think the real problem isn't that we don't care about the fate of our species, it's that our species doesn't care about the fate of any other species.

                Until we learn to play nice with others, and not piss in the pool we swim in, I don't
        • There is always a natural balance to things. We don't allow just anyone to go out and start producing fissile material.

      • how do you run a world where every individual has the power to wipe out everyone else?

        Hyperbole content of the above aside, I think the problem stems from the very idea that someone should be deciding how the world is to be run.

      • by nwbvt ( 768631 )
        "There is no way around it - this is not like the right to bear arms - you simply have to ban the technology and pretty much wipe out everyone who seeks to acquire it"

        There is an argument among 2nd amendment supporters that says "If you criminalize guns, only criminals will have guns". That applies here, only it is more powerful. Possibly the only way to counter a nano-plague is with your own nanotechnology. It is inevitable that someone will develop the technology if it is feasible and there is a desi

      • by Jeremi ( 14640 )
        There is no way around it - this is not like the right to bear arms - you simply have to ban the technology and pretty much wipe out everyone who seeks to acquire it

        That would be a good idea -- if it was even remotely possible. But of course it isn't, and banning the technology will only ensure that when the technology IS developed, it is only those who ignored your ban (i.e. your enemies) who have access to it. Good luck fighting that new plague when none of your scientists are allowed to research it!

        • A more workable (albeit still iffy) solution would be to figure out what makes people want to develop WMDs

          The solution is a Catch-22, IMO.

          People WANT to develop WMD's because of their innate desire for more POWER. The alpha male/alpha tribe that strove for more power got control of more scarce resources (and the women), so their genes & memes spread at the expense of the "peacenik monkeys". This law of the jungle still lurks beneath the facade of our present-day civilization.

          Getting rid of our self-dest

          • I agree that unchecked aggressiveness was often an evolutionary advantage in the past, but as Bill Joy and others have noted, it's becoming more and more maladaptive as technological power increases. Fortunately, aggression and war aren't the only successful evolutionary strategies -- there are others [pbs.org] based on co-operation that can work as well or better, in the right circumstances. The trick is to provide those circumstances, and then convince people to act in their own long-term best interest.
      • The problem I have with this argument is that yet again people are ignoring history. People like Bill Joy should read history a bit more before attempting to sound profound.

        The problem I find all too often is that people do not acquaint themselves with history enough to know what the problems actually are. Germ warfare is not new; in fact, it is over two hundred years old. Let me give an example: http://www.somsd.k12.nj.us/~chssocst/ssgavittus1amherstsmallpox.htm

        To beat the Indians, instead of fighting them, a genera
      • As long as the controlling body isn't the USA -- the only ones who have ever attacked with nuclear weapons, and the ones currently testing the waters of the media for tolerance of "tactical nukes" ("they're smaller, don't worry"). It would have to be a powerful group whose sole concern is the human race as a whole: not one country, ethnicity, or economic circle, but the species itself.
      • by sybert ( 192766 )
        Knives -- 800,000 people killed in Rwanda with machetes, right under the UN's nose. The UN couldn't care less because knives are not WMD's.
        Guns and knives -- over 100 million people slaughtered or executed so that Communists could stay in absolute power.
        Nukes -- The Bomb accounts for less than 1% of the WWII dead (a rough check follows below).
        Saddam's WMD's accounted for less than 10% of the people he butchered.
        Most current nuclear proliferation activity is driven by the conflict in Israel/Palestine, where hundreds die a year.
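
        The nuclear figure is easy to sanity-check (the casualty numbers below are commonly cited ballpark estimates, not from the article):

        # Roughly 200,000 deaths at Hiroshima and Nagasaki combined,
        # against roughly 60 million WWII dead overall.
        bomb_dead = 200_000
        ww2_dead = 60_000_000
        print(100.0 * bomb_dead / ww2_dead)   # about 0.33 (percent)

        So "less than 1% of the WWII dead" holds comfortably.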
      • by G-funk ( 22712 ) <josh@gfunk007.com> on Sunday June 06, 2004 @08:59PM (#9353526) Homepage Journal
        how do you run a world where every individual has the power to wipe out everyone else?

        Very, very politely :)
    • Space Travel (Score:3, Insightful)

      That's one of the arguments put forth against any ETs visiting us: any race of creatures technologically advanced enough to produce faster-than-light travel would have already blown themselves to pieces with weapons (assuming a human-like nature).

    • by Saeger ( 456549 ) <farrellj@nosPAM.gmail.com> on Sunday June 06, 2004 @03:13PM (#9351842) Homepage
      In a nutshell, the problem with exponentially advancing technology [kurzweilai.net] is that it is increasingly outpacing our primitive human brain's ability to intelligently deal with it.

      Each new tech advance is more powerful and more accessible than the last, but the minds that wield it are relatively stagnant and still saddled with millions of years of selfish evolutionary baggage which we won't be able to fix [hedweb.com] for quite a while yet.

      Humankind is within ~30 years of reaching the Vingean Singularity [caltech.edu], and the only question is the odds on making it [gmu.edu] without sabotaging ourselves first. IMO, the odds are very low, but unlike Bill Joy, I don't think there's any point in attempting to STOP or even slow this progress -- all we can do is try to safely guide the tech [foresight.org] and hope for the best.


  • Fawlty Towers (Score:3, Interesting)

    by hedley ( 8715 ) <hedley@pacbell.net> on Sunday June 06, 2004 @01:58PM (#9351437) Homepage Journal
    There is a great story in Vanity Fair recently about a famous architect's two towers in NYC. Joy bought a two-floor duplex. This building is plagued with problems. The list of who lives in them is a who's who of current celebritydom (Martha, Calvin, et al.), and then there's this geek, Bill Joy :) It made me laugh.

    Must be nice.

    Hedley
  • by John Miles ( 108215 ) * on Sunday June 06, 2004 @02:04PM (#9351464) Homepage Journal
    After all, it may be that self-destruction is not only our destiny as human beings, but our purpose.

    All facetiousness aside, his mention of Bertrand Russell's opposition to nuclear weapons raises a good point. Sure, we risked barbecuing ourselves during the Cold War. But, arguably, the same weapons also prevented World War III, and are continuing to do so. You could say that we traded an unimaginable amount of economic power -- strategic nuclear-weapons programs are, after all, the most expensive investment the human race has ever made -- for the very security that Joy says we're recklessly neglecting.

    At the end of the day, he'll just have to finish his manifesto and submit it for review by civilization at large. Even Ted Kaczynski managed to get that far.
    • strategic nuclear-weapons programs are, after all, the most expensive investment the human race has ever made


      Is that really true? How do you figure?

  • googling (Score:3, Funny)

    by Fullmetal Edward ( 720590 ) on Sunday June 06, 2004 @02:27PM (#9351582) Journal
    Oh how fun it'd be if he worked for google. Type in "Recipe for pasta salad" and you'd get 5 thousand pop ups going "THE WORLD IS GOING TO END! WE'RE ALL DOOMED!"
  • Grammar (Score:2, Funny)

    by atlasheavy ( 169115 )
    Magazine of the New York Times? That sounds so much more compelling than New York Times Magazine! Thanks for bringing this to my attention on the Organization of the Slashdot, geeber! ;-)
  • by phila60 ( 785993 ) on Sunday June 06, 2004 @02:36PM (#9351635)
    Many people mention that we have survived possible nuclear destruction and created hundreds of destructive weapons, yet manage to live. You miss the point of those things being weapons: the people wielding them were aware of the extreme consequences their actions would bring. They had responsibility, and while driven by their own agendas, they understood what they had on their hands. A great deal of effort was spent to keep it responsible, and less prone to get out of control due to a single person's/company's/country's mistakes or evil intent.

    What Bill argues is that there is a great possibility that such responsibilities may now fall on a limited group of people driven by a money-grabbing/get-there-fast/cheap mentality, or even on a single person. No control as we have with nuclear technology, with consequences just as dire. He argues for responsible science, just as there is a difference between responsible, secure code and the alternative (Linux/xBSD vs Microsoft). It's not a technology issue, it is a people-mentality issue, and it is so well illustrated by the quote given in the article from a book by Bertrand Russell: "I thought that people would not like the prospect of being fried with their families and their neighbors and every living person that they had heard of. I thought it would only be necessary to make the danger known and that, when this had been done, men of all parties would unite to restore previous safety. I found that this was a mistake. There is a motive which is stronger than self-preservation: it is the desire to get the better of the other fellow."

    This above is so true, and it drives the market and human forces to get there fast, losing a responsible approach in the process.
    • by TRACK-YOUR-POSITION ( 553878 ) on Sunday June 06, 2004 @08:30PM (#9353403)
      There is a motive which is stronger than self-preservation: it is the desire to get the better of the other fellow.

      Other people have stated this principle with different connotations than Russell chose to. There's Patrick Henry's extreme line "Give me Liberty or Give me Death." And if that's not far enough for you, Milton's Satan goes even further: "Better to reign in Hell, than serve in Heaven."

      You might wonder how anyone can entertain such fanatical positions. I think what you have to understand is that Choice, Power, Control, Freedom, Liberty -- whatever synonym you choose to use -- is the essence of Humanity. If you have lost the ability to act in pursuit of your wishes, then you as a human being are essentially dead. (Actually achieving your wishes is optional and possibly detrimental.) The purpose of the 3 pounds of meat on top of our bodies, which drives us to do anything we are driven to do, is to make decisions and act upon them. To be denied that ability is a fate worse than death.

      When we consider Bill Joy, we must consider what Bill Joy is asking us to surrender in order to avoid Grey Goo. To save the world, Bill Joy is not asking us to give up mere Science, Technology, or Geekdom. He is asking us to give up Democracy. Whether through a Science Guild, a government bureaucracy, or some strange all-powerful insurance company, Bill Joy wants to put decisions over technology in the hands of some elite few -- with the public completely uninformed that a decision has even been made -- because public knowledge of the banned technology is dangerous.

      It is strange that he looked to insurance companies and the supposed "free market" to solve this problem. Anyone who equates capitalism with freedom should see this as a counter-example -- money is a very old and straightforward means of Power. It is a Power Bill Joy is comfortable with -- he is more comfortable with the dominance of Money than with the dangers of democracy or freedom, because he has Money.

      In any event, if bio- and nanotechnology are going to be the driving forces of our economy in the future, what Bill Joy is suggesting is prohibiting the vast majority of people from participating in that economic change. There will be an elite few who possess the power of death over us, who are impervious to any threat we the people can offer them, and who will have the ability to deny us life-saving or enriching technology as their whims dictate.

      Bill Joy is asking us to adopt the teachings of Thomas Hobbes. I should hope that our prior experiences with absolute totalitarian power in history would be enough to dissuade us from that -- we are weighing the possibility of destruction against the certainty of submission.

  • by vijaya_chandra ( 618284 ) on Sunday June 06, 2004 @02:36PM (#9351638)
    If it is going to be something like vi, I would have no problems at all.

    Thanks, Mr. Joy, for the joy called vi.
  • Interesting links... (Score:5, Informative)

    by Lank ( 19922 ) on Sunday June 06, 2004 @02:53PM (#9351733)
    One of my professors this semester assigned a project comparing and contrasting the views of Joy, Dertouzos, and Kurzweil. The following articles shed some light on each one's perspective, respectively:
    http://www.wired.com/wired/archive/8.04/joy.html [wired.com]
    http://www.lcs.mit.edu/about/reason.html [mit.edu]
    http://www.lcs.mit.edu/about/kurzweil.html [mit.edu]
  • future fear (Score:4, Insightful)

    by tgibbs ( 83782 ) on Sunday June 06, 2004 @02:59PM (#9351770)
    I think that there are real risks of technology. But I'm not convinced that a "go slow" prescription is a solution. This presupposes that we actually can forecast the risks and benefits of technology if we just slow down the pace a bit. But so often, modern technologies synergize in ways that are nearly impossible to predict. And hypothetical risks often loom much larger than benefits. It was easy to foresee, for example, the risks to privacy of widespread computer connectivity. But who foresaw the many benefits of computer networks for commerce, communication, grass-roots political organization, etc., etc.?

    Over the years, I've seen many nightmare scenarios. In the early '70s, many young people were convinced that nuclear or ecological catastrophe would overtake us in just a few years. Yet somehow, the forecasted disasters always managed to stay just a few years ahead. It is worth thinking about risks -- occasionally, the dangers are sufficiently obvious that they actually can be avoided. But that is the exception rather than the rule. I think the greater danger is that we will be paralyzed by fear and uncertainty.
  • Disaster Insurance (Score:2, Insightful)

    by nwbvt ( 768631 )
    The more I think about it, the more I like the proposed idea of having insurance policies for disasters involving dangerous technologies. The insurance companies will of course be subject to market forces and will thus be far more effective 'regulators' than bureaucrats in Washington who may have read a book on the technology they are regulating.
    • I think that Joy was essentially thinking out loud about some possible approaches. He mentioned guilds, insurance, bankruptcy, etc but the article also said "He wasn't satisfied he had come up with a comprehensive set of solutions" and that he dropped the book project.

      I agree with Joy that these approaches would not really solve the problem. For one thing, how are these enforced? The NYSE and Arthur Andersen apparently weren't even able to enforce any control over Enron. One can rebut: Andersen was punishe
  • by oogoody ( 302342 )
    > In a nutshell, the problem with exponentially advancing technology [kurzweilai.net] is that it is increasingly outpacing our primitive human brain's ability to intelligently deal with it.

    What level of advance are you willing to put me in jail to protect? How do you decide on this level? How do you decide at any one time what fits under your arbitrary bar? Given the human nature you are so afraid of, I think we all know what direction this will go.

    What makes you think progress will continue at all i
  • Bill Joy is apparently a compulsive risk-mitigator. From the NYT article:

    "Joy is a film buff, and he recently outfitted his basement with a spectacular home entertainment system. He also happens to be a bibliophile, so he bought three handbooks -- ''Halliwell's Film Guide,'' Pauline Kael's ''5001 Nights at the Movies'' and the ''Time Out Film Guide'' -- to compare reviews. ''I was going through the books and found out there are only about 2,000 movies in history in which there's critical consensus that th

  • Backup civilization? (Score:4, Interesting)

    by colonist ( 781404 ) on Monday June 07, 2004 @12:32AM (#9354314) Journal

    Some people are seriously thinking of making 'backups' of civilization: "secure sanctuaries (think of the monasteries of the Middle Ages) that preserve and update copies of the vital records and articles needed for the conduct of our society". They would be placed all over Earth and eventually at locations in space. "In the event of a global catastrophe, the ARC facilities will be prepared to reintroduce lost technology, art, history, crops, livestock and, if necessary, even human beings to the Earth."

    See Robert Shapiro [edge.org] and William E. Burrows [arc-space.org]
