Science

Top Physicist Advocates Scientific Self-Censorship 365

spamania writes "The San Francisco Chronicle is running this article about a new book by Britain's astronomer royal, Sir Martin Rees, that advocates restricting scientific research in certain fields in the interest of public safety. In "Our Final Hour", Rees lends a sober, respectable voice to the oft-irrational ranting about nanotech, biotech, and other fields."
This discussion has been archived. No new comments can be posted.

  • by headkase ( 533448 ) on Tuesday April 15, 2003 @04:38PM (#5739389)
    If research is truly dangerous then classify it. But declining to research it only leaves you behind when other nations pursue it.
    • Classifying it is all well and good, but the government itself isn't spotless. Hell, when they detonated the first nuke nearly 60 years ago, they were only pretty sure it wouldn't ignite the atmosphere or start a planet-destroying chain reaction.
    • If research is truly dangerous then classify it.

      Yes, because we all know that nobody would ever leak classified research to a foreign government.
      • Yes, leaks occur. However, if your enemies get their research by stealing it from you, then you are guaranteed to A) have the technology first, and B) understand it better.

        That's a good sight better than a sharp stick in the eye.

      • Yes, because we all know that nobody would ever leak classified research to a foreign government.

        Actually, in the past, this was sometimes a good thing. It helped to keep things in balance and avoided the kind of unilateralism we see right now. Mutually assured destruction of two superpowers is indeed frightening, but it naturally restricts what one side can do to piss off the other.

        In addition, we shouldn't dismiss the possibility that classified research is disclosed to allied foreign scientists on p
    • The words "Security through Obscurity" came to mind when I read your comment... but with scientific discoveries, wouldn't it be "Security through Ignorance", at least until we discover it?

      • by Daniel Dvorkin ( 106857 ) on Tuesday April 15, 2003 @06:01PM (#5739988) Homepage Journal
        Exactly. There is no security in obscurity or ignorance. The only way to know how dangerous something is -- and to learn how to deal with it if it is dangerous -- is to study it.

        As for the "some experiments could destroy the earth" bit (really just a variant on There Are Things Man Was Not Meant To Know), IMO Rees is doing the typical crotchety-old-scientist act. An awful lot of scientists who do brilliant work when they're younger seem to adopt an attitude of "Well, the search for knowledge was all well and good in my day, but you kids these days ..." Regrettable, but I suppose it's part of human nature.

        I can't think of a single area of research in which the benefits of aggressive experimentation and open reporting don't outweigh the risks. Not a single one. Biotech, nanotech, high-energy physics ... yes, the risks are real, but the potential rewards are so great that it would be criminal either for scientists to restrict themselves or for laws and social pressure to lay restrictions on them.
    • I agree: all research should be done, even if it may be dangerous. Only by researching it do you find ways to handle and/or contain it. I mean, if we never researched dangerous diseases, the world would be ravaged by them. We need a different kind of classification for some of these things, in my opinion. Too much information is leaked that is very, very dangerous.
    • If research is truly dangerous then classify it. But declining to research it only leaves you behind when other nations pursue it.

      I think the point was that certain experiments are so dangerous that they could destroy the whole planet if they go wrong.

      Then it does not matter if it is classified; instead we should try to limit such research on a global scale.

      Of course, one can have plenty of objections to the notion that some experiments may destroy the planet, but if you buy that argument, then the co
    • by WIAKywbfatw ( 307557 ) on Tuesday April 15, 2003 @05:11PM (#5739662) Journal
      If research is truly dangerous then classify it. But declining to research it only leaves you behind when other nations pursue it.

      Hey, if you read the article then you would have understood Sir Martin Rees's reasons for recommending self-censorship. Here's a sample paragraph:

      "Some experiments could conceivably threaten the entire Earth," he writes. "How close to zero should the claimed risk be before such experiments are sanctioned?"

      He isn't talking about research that has potentially dangerous applications if it falls into the "wrong" hands, he's talking about potentially dangerous experiments. The kind of experiments where something going wrong could, say, create a miniature black hole and thus destroy the planet.

      When you're talking about an experiment going that wrong then you don't really give a damn who's performing it, "them" or "us".
      • by Tackhead ( 54550 ) on Tuesday April 15, 2003 @06:44PM (#5740264)
        > Hey, if you read the article then you would have understood Sir Martin Rees's reasons for recommending self-censorship. Here's a sample paragraph:
        >
        > "Some experiments could conceivably threaten the entire Earth," he writes. "How close to zero should the claimed risk be before such experiments are sanctioned?"
        >
        > He isn't talking about research that has potentially dangerous applications if it falls into the "wrong" hands, he's talking about potentially dangerous experiments. The kind of experiments where something going wrong could, say, create a miniature black hole and thus destroy the planet.
        >
        > When you're talking about an experiment going that wrong then you don't really give a damn who's performing it, "them" or "us".

        Hey, if you look at cave paintings then you would grok Shaman Roa's big think for banish Caveman Og:

        "Og's big fire think scary. Fire could burninate entire grassland where tribe hunt all meat things", Roa speak. "Fire come from Gods, not tribe! Roa know Gods, Roa eat happy mushrooms, talk to Gods every day! Og not talk to Gods, he too busy with fire think. Roa not want Og make Gods angry with two stick rubbing thing! Tell Og put sticks down!"

        Og's fire think not scary-but-good because fire keep tigers away at night. What if Gods angry, make Og drop fire? Og burninate all grass! No grass, no antelope, no fruit! Whole world burninate! Like three rainy season ago when Gods sent fire from sky, burninated grassland! Half of tribe starve!

        Og's fire think bad. Roa know! If Og not care what Roa think, Shaman Roa say send Og away forever!

    • If research is truly dangerous then classify it. But declining to research it only leaves you behind when other nations pursue it.

      And it leaves you *dangerously* behind when these other nations are "bad guys". Not only will you not have comparable weapons as a deterrent to the nano-tech gene-mutation bomb which spontaneously turns you into a Communist, you will also not have any defenses against it. This is why the West still has plagues and poxes in a bottle, despite the wishes of the rainbow-flag brig
    • by khb ( 266593 ) on Tuesday April 15, 2003 @05:54PM (#5739955)
      I don't think the good professor is purely concerned with bad people doing evil with science. From reading the article, it would seem that he is concerned that good people doing good research might inadvertently kill us all. So classification wouldn't help.

      Restricting dangerous experiments to safe locations would. It seems to me that the professor is making a strong argument for serious space colonization, for two reasons:

      1) Doing some classes of nasty experiments on, say, Neptune would greatly reduce the consequences of out-of-control experiments (e.g. nanobots and grey goo)

      2) If the professor is right that we only have a 50-50 chance of not destroying the earth in the "near" future, having a self-sufficient backup colony or six would be prudent.
  • Technology (Score:5, Insightful)

    by jetkust ( 596906 ) on Tuesday April 15, 2003 @04:38PM (#5739391)
    technology has potential to annihilate

    ...as well as the potential to protect us from annihilation.
    • True only if you survive the first phase.
    • Re:Technology (Score:3, Insightful)

      by s20451 ( 410424 )

      technology has potential to annihilate ... as well as the potential to protect us from annihilation.

      1. It is circular to argue that a technology that can annihilate us can also be used to protect us from the annihilation that the technology causes.

      2. And what if you can't tell which technology is which?

      • Re:Technology (Score:4, Insightful)

        by Lord Ender ( 156273 ) on Tuesday April 15, 2003 @06:50PM (#5740298) Homepage
        "It is circular to argue that a technology that can annihilate us can also be used to protect us from the annihilation that the technology causes."

        He didn't say to protect from annihilation caused by technology. It could protect us from a deadly natural disease, or a deadly meteor strike, or, in the long term, the deadly destruction of the Sun. If we stopped developing technology, as many technophobes would like, the human race would be doomed. Our sun will eventually change so that human life can't survive on Earth. If we don't develop the tech to colonize other solar systems, we are all doomed.
  • Pandora's Box. (Score:5, Insightful)

    by Adolatra ( 557735 ) on Tuesday April 15, 2003 @04:39PM (#5739394) Homepage
    Does the phrase "Pandora's Box" ring a bell to any of these people? Once a science becomes feasible, it's going to be explored. Better it be done by respectable, civilized scientists than underground organizations of questionable ethical bent.

    I can see it now: "If nanotechnology is outlawed, only outlaws will have nanotechnology!"

    Facetious, but nevertheless relevant.

    • Once a science becomes feasible, it's going to be explored.

      Yes, but the real question is: will it be financially feasible for anyone but a first world country or research institution to do it?

      I'm sure Al Qaida would love to develop nanobots that proceed to liquefy any person who has Anglo-Saxon genes, but scanning-tunneling electron microscopes and other equipment cost money, dude.

      When things get scary is when somebody produces some body of knowledge that allows a weapon to be reproduced for malicious
    • "Once a science becomes feasible, it's going to be explored. Better it be done by respectable, civilized scientists than underground organizations of questionable ethical bent."

      For fun, throw corporations funding and directing the respectable, civilized scientific research into your mix, and then watch as we laugh, as we burn, in an all-consuming sub-atomic fire that rends our space-time continuum asunder! :D

  • just because you can, doesn't mean you should?

    I thought we already covered all of this a long, long time ago.
    • Re:You mean... (Score:2, Insightful)

      by Anonymous Coward
      This has long been the useless prattle of luddites..

      And who decides what should or should not be done..

      I believe it has always been in the interest of man to protect against the effects of technology but not against the pursuit of technology.

      I.e., we can outlaw chemical weapons and biological weapons because we know they are freakin' dangerous... but don't outlaw stem cell research or technology whose effects we do not fully comprehend just because we are scared of the "possible" consequences..

      Anyways..
    • I thought we already covered all of this a long, long time ago.

      Only in the sense of "covered" that means "repeatedly hand-waved away with simplistic responses, usually without reading what the person actually has to say".

      The possible downsides of technological advances and possible ways of ameliorating them are always worth discussing, but Slashdot is obviously not a good place for that to happen. Anyone got any pointers to places where real discussions like this can happen?

      • The possible downsides of technological advances and possible ways of ameliorating them are always worth discussing, but Slashdot is obviously not a good place for that to happen. Anyone got any pointers to places where real discussions like this can happen?

        Sure, you can discuss them on Slashdot. Just come up with some more innovative points to discuss, such as nanotechnology health concerns from a disinterested or non-biased source. If you want a discussion beating a dead horse, Slashdot is also a plac
    • "In a galaxy far, far away..."? ;-)

      What most people twig to is this: if someone else can do it, and has a good chance of using the result to beat you to death, you should learn how to do it too.

      Re nanobots, check out The Diamond Age for a few nasty ideas (cookie cutters, torture bots that go straight for the nervous system, plagues of bots to attack specific racial groups, etc). If you don't know nanotech, you can't engineer your own hunter-killer bots to neutralise the opposing ones. This would be a Bad Thing
  • by Mononoke ( 88668 ) on Tuesday April 15, 2003 @04:40PM (#5739413) Homepage Journal
    Wouldn't we rather have potentially evil discoveries made by folks that are on 'our side', rather than have the bad guys discover them first?

    Not all scientists will self-censor, nor are all scientists working toward the greater good. Sometimes it's not their choice (see: Germany, 1940, and Iraq, 1988) to censor themselves.

  • by gpinzone ( 531794 )
    Does that apply to Windows exploits as well?
  • by vwidiot ( 74513 ) on Tuesday April 15, 2003 @04:41PM (#5739425)
    Seems to me if you restrict research, not everybody will comply. This will lead to someone other than ourselves having a headstart on the research. The research will be done by SOMEONE so it might as well be us.
    • by jemenake ( 595948 ) on Tuesday April 15, 2003 @05:33PM (#5739814)
      Seems to me if you restrict research, not everybody will comply. This will lead to someone other than ourselves having a headstart on the research. The research will be done by SOMEONE so it might as well be us
      I think it's also a problem of: as soon as one (or a few) individuals "break rank" and start making great discoveries in those fields, then everyone will cave in. Interestingly, I think this is partly why there's as much looting going on in Iraq right now. If you were a citizen who didn't really want to see a building looted, but you saw a bunch of your neighbors looting the place anyway, you'd probably be pretty likely to go get some for yourself, because the alternative would still leave the place looted, but your neighbors would end up with more stuff and you with less. Same goes for potentially harmful research.

      The more I think about it, the more I think that the only solution is a political one. Let me explain...

      These days, our (or, at least, my) biggest WMD worry isn't about countries with nukes or countries with nerve agents... it's about individuals with them. There are too many people to keep track of, and the technology is becoming more and more accessible to individuals. The only way to keep them from actually using them in some act of terrorism is to keep them from wanting to.

      Terrorism is often an option of last resort. I'm sure that Palestinian suicide bombers would prefer it if they could just make a compelling verbal argument for their cause and actually be listened to. It sure would save all the hassle of getting fitted for a torso-bomb. The problem, of course, is that they don't feel like anyone's really listening to them when they try any of the less-drastic-than-suicide-bombing methods of communication.

      So, I think the only way to prevent acts of terrorism is to have everyone in the world feel that, for the most part, they are being listened to... that their needs aren't being ignored. Now, I'm not saying that this is necessarily easy to do. I do feel, however, that individual acts of terrorism (whether it is some postal worker going berserk with a firearm or some dude mailing anthrax to people in Washington D.C....) are going to steadily increase until people stop feeling like they're being treated like cattle....

      ... and that requires political solutions, not technological ones.
  • Who restricts? (Score:4, Interesting)

    by rbp ( 10897 ) <rbp@isnomoreFORTRAN.net minus language> on Tuesday April 15, 2003 @04:43PM (#5739435) Homepage
    IMO, the main problem with suggesting this sort of restriction is: who restricts? The same research might be considered dangerous by some people and necessary by others. The same applies to "moral", of course. In the end, it's all in the hands of humans. To decide which areas should be restricted, or to use science for evil, or to do evil while doing science, etc.
  • Not censorship (Score:5, Insightful)

    by KDan ( 90353 ) on Tuesday April 15, 2003 @04:43PM (#5739438) Homepage
    The usual over-sensationalistic /. headline is, as usual, over-sensationalistic. This is not censorship, but self-control and self-direction. It's not about not publishing things which exist and have been researched (that would be censorship), but about deliberately avoiding avenues of research which are too dangerous given our current rather low level of social evolution.

    However, it's very hard to decide which avenues of research should be avoided. Biotechnology, Nanotechnology and all that promise great benefits, potentially helping us progress socially much faster (eliminating hunger and disease wouldn't do us much harm socially, would it?). The only ones that should clearly be avoided are clear-cut cases like nerve agents, genetic creation of deadly diseases, and all that. Otherwise, it makes little sense to restrain research in other directions...

    Daniel
    • "The only ones that should clearly be avoided are clear-cut cases like nerve agents, genetic creation of deadly diseases, and all that."

      The sticky issue there is that you cannot scientifically classify anything as 'clear-cut'. It's never that black and white.

      Personally, I think the opposite should happen. The more that's known about artificially created deadly diseases, for example, the more that's known about how to identify and cure them.
    • Re:Not censorship (Score:5, Interesting)

      by Kjella ( 173770 ) on Tuesday April 15, 2003 @05:28PM (#5739776) Homepage
      However, it's very hard to decide which avenues of research should be avoided. Biotechnology, Nanotechnology and all that promise great benefits, potentially helping us progress socially much faster (eliminating hunger and disease wouldn't do us much harm socially, would it?). The only ones that should clearly be avoided are clear-cut cases like nerve agents, genetic creation of deadly diseases, and all that. Otherwise, it makes little sense to restrain research in other directions...

      Biotech = bioweapons
      Nanotech = nanoweapons
      Nerve Agents = tranquilizers, stasis chambers
      Creation of deadly disease = preemptively improving the immune system

      What you consider good can be used for bad, and vice versa. If I truly understand how the immune system works and want to extend and improve it to benefit mankind, I also have the knowledge of how to kill, by avoiding all its detection mechanisms, attack mechanisms, and defense mechanisms, exploiting its flaws and weaknesses. All I'd have to know to go from vaccine to plague is how to make a replication method (e.g. spread by air/touch), which is trivial by comparison.

      Kjella
    • Re:Not censorship (Score:5, Interesting)

      by ggwood ( 70369 ) on Tuesday April 15, 2003 @07:43PM (#5740611) Homepage Journal
      I'm not sure you can "over-sensationalize" the prospect of the whole Earth being turned into a 100 meter sphere of inert goo.

      I agree biotechnology and nanotechnology are certainly going to proceed and we should fund them. It is just that certain high-energy physics experiments should probably be thought about very carefully.

      And that is the area in which Rees is most knowledgeable: astro and particle physics (they interrelate a lot - note he is an astrophysicist and this kind of inquiry would not affect his field directly). I doubt he is as much of an expert on nanotech, but he included it somewhere in the end of his book as another place for inquiry.

      Yes, the odds of disaster are really slim. Rees is asking, how far from zero should the odds be before we stop research? One in a million? One in a billion? What if there are (say) a million different permutations of the experiment, any of which could trigger the event?

      It is pretty obvious to me that we should be thinking about these things and asking questions like: don't these particles collide all the time in nature (say, in the Sun or near black holes)? And if the answer is yes, is there a signal we could look for?

      I'm sure people already are doing some back-of-the-envelope calculations, but trying to get funding for this kind of work, as the above post so clearly indicates, is going to be a tough sell to parts of the public. Even the /. crowd, who in general would be rather supportive of scientific funding.
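
A quick back-of-the-envelope sketch of the "million permutations" question raised above (a toy model with made-up numbers, not Rees's): with N independent runs each carrying a small per-run risk p, the chance of at least one disaster is 1 - (1 - p)^N.

```python
# Toy illustration: how a tiny per-experiment risk p compounds over N
# independent runs. The probabilities here are invented for illustration.

def p_any_disaster(p: float, runs: int) -> float:
    """Probability that at least one of `runs` independent trials goes wrong."""
    return 1.0 - (1.0 - p) ** runs

for p in (1e-6, 1e-9):
    for runs in (1, 1_000_000):
        print(f"p={p:g}, runs={runs:,}: {p_any_disaster(p, runs):.2e}")

# A one-in-a-million risk per run compounds to ~0.63 after a million
# permutations; a one-in-a-billion risk stays around 1e-3.
```

Which is presumably why Rees frames the question as "how close to zero" rather than "is it zero".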
  • by TheFrood ( 163934 ) on Tuesday April 15, 2003 @04:43PM (#5739439) Homepage Journal
    This makes an interesting counterpoint to an article from last week [slashdot.org] about an editorial by Sheldon Pacotti, one of the designers of Deus Ex. Rees seems to think self-censorship is the best defense, while Pacotti thinks it's best to spread the knowledge far and wide, so that everybody has the information necessary to devise defenses against technological threats.

    TheFrood
  • by Alric ( 58756 ) <slashdot&tenhundfeld,org> on Tuesday April 15, 2003 @04:47PM (#5739462) Homepage Journal
    I have never understood "banning" certain types of research. What do we hope to accomplish?

    Information longs to be free, and technology inherently desires improvement. If we don't allow certain scientific research, then this research will simply move to other countries, and the United States and its citizens will lose the opportunity to shape the methodologies and goals of this research.

    Perhaps a self-censorship system moderated by an international panel would work nicely, but it is utter foolishness (IMO) to let public opinion blindly dictate the direction of science. Enhancing the lives of the common citizen should always be the primary goal of science (IMO), but that doesn't mean that the public always/ever knows what is best.

    I guess this is a step in the right direction. Have the most skilled in those fields moderate themselves. Sure, but I cringe whenever I see the words "censor" and "science" together in a sentence.
  • by WegianWarrior ( 649800 ) on Tuesday April 15, 2003 @04:47PM (#5739463) Journal

    ...we would still be living in caves. Seriously, because some things may lead to something which could be warped to 'bad' uses, we should halt the progress of science?

    Knowledge on its own cannot be defined as 'good' or 'bad' - it just is. It is what we use the knowledge for that can be judged on a moral level. And what some people consider to be a 'good' use, other people may see as a 'bad' or even 'evil' use of the knowledge.

    • given the choice between living in a cave and being beaten to death by killer robots, I will take the cave.
      • Being beaten to death by killer robots by far outweighs any plans I've had for my own death. Then again I work in a lab with "Warning: RADIATION" on the door, with my back to the rad hood and frequently handle EtBr and other substances classed as potentially dangerous mutagens so uhm... put on some gloves.

        Science is often dangerous (trust me, I've spilled a few drops of 6M HCl on myself), but usually the benefits outweigh the risks. Sure, I might create a deadly form of highly virulent, incredibly resistant pathogen
  • Silence is telling (Score:5, Insightful)

    by elwoodblues16 ( 666185 ) on Tuesday April 15, 2003 @04:48PM (#5739473)
    Let's say a group of scientists run into an area of research that looks to be potentially catastrophic. They hush it up, destroy their research documents, and go off in some other direction of research.

    What are the odds that someone of an unethical bent will NOT notice this and continue the research? Burning your papers and abandoning the research might buy some time. But someone will notice the silence.

  • by tempestdata ( 457317 ) on Tuesday April 15, 2003 @04:48PM (#5739476)
    People like Einstein dedicated their entire lives to finding truth. Finding "the answer". So what's the matter? Can't handle the truth?

    There shouldn't be any kind of censorship in this quest for knowledge, and this need to understand. I know I'm sounding like I've mixed philosophy with science, but let's not forget that science is an offshoot of philosophy.

    So, just because some knowledge may potentially be dangerous doesn't mean it's knowledge we shouldn't pursue. That's like saying "you shouldn't learn how to use a gun, just because you might use a gun to kill someone!"

    • Your evaluation of the matter is entirely too simplistic. As the article mentions, we are progressing to the point in our scientific evolution where an accident during an experiment MIGHT potentially destroy the Earth, if not the entire Universe.

      I'm all for the Search for Truth, but wouldn't it be wise to temper that search with some patience and forethought to avoid destroying our entire civilization (at which point our search for truth would end far, far from the goal)?

    • How about "you should learn how to use a gun, just because you might accidentally use a gun to kill someone if you don't know how to handle it safely!"

      I believe that's closer to the good professor's argument.

      Soko
    • Sure, killing someone with a gun is bad. But is it as bad as ANNIHILATING THE ENTIRE PLANET just so that you can figure out this whole black hole concept that's eluding physicists? Have some foresight.

      Sure, I can handle the truth, but I don't have much use for it after I have been reduced to subatomic particles in the quest to find it.

  • This sounds like the beginnings of the CoDominium.

    Yes, some research leads to Bad Things(tm), but in the greater picture, research is a Good Thing(tm).

    Information and research should be freely available to anyone. That is how greater discoveries are made (Warp Drive anyone?).
  • by mcworksbio ( 571932 ) on Tuesday April 15, 2003 @04:48PM (#5739488)
    "No decision to go ahead with an experiment... should be made unless the general public is satisfied..." An interesting question is not simply the scientific realities of dooms-day science but the implied obligation of all people to the worldwide community. It seems as the years pass we get closer to having a serious discussion, as citizens of our individual nations, as to whether our responsibilities lie with our own flag or a "global" identity.
  • How about.. (Score:5, Interesting)

    by composer777 ( 175489 ) on Tuesday April 15, 2003 @04:48PM (#5739489)
    addressing the grievances that might cause a certain group to use technology to do harm? Or am I supposed to believe that we are the only rational ones and the rest of the world is full of savages that need to be tamed? Our viewpoint of other countries sounds a lot like present-day colonialism if you ask me.

    Here's some food for thought. If we don't address these grievances, then how can Rees so arrogantly believe that his book is going to make a bit of difference? Does he think that they are incapable of research? Does he think that they are going to say, "Gee, Rees wrote a book, maybe we shouldn't use this technology or do our own research"? It might slow terrorism down, but it's a stupid price to pay. It will only delay the inevitable UNLESS we address the problems rather than dropping bombs. The only thing his proposal might do is further the police-state mentality that seems to be moving along quite well here in the US. He certainly won't stop terrorism.
  • by Alsee ( 515537 ) on Tuesday April 15, 2003 @04:48PM (#5739490) Homepage
    Why not just outlaw reading? Make it punishable with a death penalty.

    -
    • Why not just outlaw reading? Make it punishable with a death penalty.

      We wouldn't have to steam all of the fun out of life if we just outlaw *learning*. That's where the real problem is.

  • by spun ( 1352 ) <loverevolutionary&yahoo,com> on Tuesday April 15, 2003 @04:51PM (#5739513) Journal
    Only outlaws will have science. By restricting access to certain types of research, we limit knowledge in those fields, making it more likely that we will not be able to discover antidotes to technological mishaps. Will it reduce the chance of those mishaps? I doubt it. If the process of scientific discovery were exact and well known, perhaps, but simply limiting information won't stop progress. Who knows where crucial breakthroughs in, say, nanotechnology will come from? If we limit access to scientific knowledge of all fields that might lead to the development of "grey goo", we will stagnate, and that won't guarantee that "grey goo" won't get made. All we will guarantee is that we won't know how to fight it if it does get made.

    Maybe if we did away with the massive inequalities that fuel destructive behavior we wouldn't need to limit access to knowledge, because no one would have any reason to destroy. There may still be accidents, but limiting access to information because of possible accidents is like the proverbial ostrich sticking its head in the sand to escape detection. Just because the ostrich doesn't see the lion sneaking up on him doesn't mean he isn't about to become lunch.
  • by Saige ( 53303 ) <evil,angela&gmail,com> on Tuesday April 15, 2003 @04:51PM (#5739516) Journal
    I am really getting frustrated by the amount of traction the whole "grey goo" meme is getting.

    Sure, it's possible that when nanotechnology gets going, that somehow a nanomachine that can convert just about any material to energy and raw materials to copy itself could be accidentally created. It could then convert the entire Earth and everything on it to copies of itself. It's POSSIBLE.

    But then again, it's also possible that some species of bacteria could mutate and start doing the same things. And it's probably not any less likely than a nanomachine doing it.

    A machine that could convert just about anything on the planet into useful materials, and duplicate itself endlessly, would probably be difficult to make INTENTIONALLY, let alone accidentally. It would also be extremely easy to insert safeguards to prevent anything like that from happening. For example: require the presence of a particular molecule for the machines to duplicate themselves, add replication limits to the nanomachines, or never include self-replication in the same nanomachine as one that can break down most/all things into raw materials.

    Unless nanoengineers are incredibly sloppy, maliciously so, then it's not going to happen by accident.

    INTENTIONAL creation of such machines is an issue of higher importance. And the type of people who would make such nanomachines are not the type who are going to listen to people saying "we can't research/develop this technology, it might be dangerous". Would a law against using aircraft for suicidal terrorism have stopped Al Queda from taking down the WTC? Nope.

    The best chance at preventing/defending against such actions is to develop the technology and focus some research on using it to prevent such uses, not to say "stop all research!"

    Now, I would be enormously in favor of a global treaty banning research into nanotechnological weapons. The thought of militaries working with such technologies does scare me.
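
As a toy sketch of the "replication limits" safeguard described above (purely illustrative; this says nothing about how real molecular machinery would be engineered): if every copy inherits a strictly smaller generation budget, the total population is bounded by construction, at most 2^k machines for a limit of k generations.

```python
# Toy model of a replication-limit safeguard: each offspring inherits a
# smaller generation budget, so growth halts by construction.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Replicator:
    generations_left: int  # 0 means sterile

    def replicate(self) -> Optional["Replicator"]:
        if self.generations_left <= 0:
            return None  # safeguard: replication limit reached
        self.generations_left -= 1
        return Replicator(self.generations_left)

def run(limit: int) -> int:
    population = [Replicator(limit)]
    while True:
        offspring = [c for m in population if (c := m.replicate()) is not None]
        if not offspring:
            return len(population)
        population.extend(offspring)

print(run(5))  # -> 32 (= 2**5); bounded, not an ever-growing grey goo
```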
  • Not the answer. (Score:3, Insightful)

    by SatanicPuppy ( 611928 ) <Satanicpuppy@nosPAm.gmail.com> on Tuesday April 15, 2003 @04:52PM (#5739517) Journal
    So, what he's saying is, "We could find lots of horrible and dangerous things if we keep researching in this direction, so we shouldn't do it."

    What that actually means is, "Since we actually have the kind of restraint not to use this stuff, let's let someone with less restraint come up with it first."

    When Einstein gave the US his aid in building an atomic weapon he did it on the principle that someone would discover it, and that it was MUCH better that it be us, than the Nazis. It's much better that we know, and can prepare, than it is for us to be caught flat footed by something so awful we didn't even let ourselves think about it.
    • Einstein never worked on the bomb! His involvement was strictly limited to writing a letter, asking Roosevelt to develop the bomb.
    • Re:Not the answer. (Score:5, Interesting)

      by Xerithane ( 13482 ) <xerithane.nerdfarm@org> on Tuesday April 15, 2003 @05:21PM (#5739730) Homepage Journal
      When Einstein gave the US his aid in building an atomic weapon he did it on the principle that someone would discover it, and that it was MUCH better that it be us, than the Nazis. It's much better that we know, and can prepare, than it is for us to be caught flat footed by something so awful we didn't even let ourselves think about it.

      It's wrong to say that Einstein worked on the bomb. His only involvement (as pointed out already) was writing a letter, which got dismissed, to Roosevelt. Einstein at the time was not liked, because of his roots. He was virtually exiled to the United States, because England didn't want him.

      Also, the reason why the Germans didn't have a nuclear bomb is because the Allied forces destroyed (after a first failed mission) the heavy water factory in Switzerland (I think it was in Switzerland, not 100% sure) that was fundamental to the bomb design. Heisenberg was also much further along than the Allies, by years. The reason why Heisenberg was so slow in his development is because he was a practical physicist, and not a theoretical one, and thereby couldn't construct the most efficient shape for a sustained reaction.

      Heisenberg's devices failed to reach critical mass, but they were very close, and had the Allied forces not resorted to sabotage, they would have achieved it long before the Allies did.

      The reason why Einstein wrote that letter is because he knew, logically, the Germans were developing the technology.

      I think the moral of the story is: develop the technology first, as soon as you can, then create policy after realizing nobody should have that power. You can never know who is developing what, so it's better to develop everything.

      The Arms Race is constantly ongoing, so is the Space Race, and all that jazz.

      As I mentioned earlier in the thread, this boils down to, "Just because you can, doesn't mean you should." In regards to science, you always should, so you can protect yourself if someone else does.
      • Re:Not the answer. (Score:3, Informative)

        by 3waygeek ( 58990 )
        Also, the reason why the Germans didn't have a nuclear bomb is because the Allied forces destroyed (after a first failed mission) the heavy water factory in Switzerland (I think it was in Switzerland, not 100% sure) that was fundamental to the bomb design.

        The facility you refer to was in Norway [ehistory.com]; Switzerland was neutral in WWII.
  • or vice-versa (Score:5, Interesting)

    by DenOfEarth ( 162699 ) on Tuesday April 15, 2003 @04:52PM (#5739520) Homepage
    I find it hard to believe that it will ever be possible to totally stop the entire human race from pursuing research into certain fields. If there's something to be learned, we'll learn it; if there's something to figure out, we'll figure it out, or die trying (probably not the best cliche to use, but oh well). I just have two points, a practical one, and a nihilist one.

    My problem with the point of view being taken by this prominent scientist is that he views all scientific propositions as risky, and thinks there should be some generally agreed-upon allowable risk threshold that any experiment should be considered against before it is carried out. The unfortunate thing about this point of view is that it doesn't take into account the potential benefits that could come out of the experiment. Nano-bots destroying cancerous cells would truly make the fact that we live longer and longer much more worthwhile, if those extra years are cancer-free, in my opinion. It is probably more worthwhile than creating black holes on earth, even though the risks might be somewhere in the same range of dangerousness.

    My second point, the nihilist one, is in regards to the 'gray goo' that nanotech could turn the planet into. Could I stipulate that some sort of evolution could continue, but with the nanobots as its basis instead of carbon-based cellular processes? Just a thought.

  • All the Modern Things have always existed, They've just been waiting.

    -Björk
  • by wowbagger ( 69688 ) on Tuesday April 15, 2003 @04:54PM (#5739537) Homepage Journal
    I find it interesting that this man is an astronomer. I guess he figures that his particular branch of science will never be considered "dangerous" and need to be "limited", unlike those other blighters in physics.
    • by JetJaguar ( 1539 ) on Tuesday April 15, 2003 @05:16PM (#5739699)
      That's not even remotely true. Astronomy would be greatly curtailed by this as well. A large portion of current astronomy relies very heavily on results from high energy physics, particularly cosmology.

      Stopping research in high energy physics would cripple research projects dealing with supernovae, cosmology, and supermassive black holes; even cosmic ray research (and its effect on star formation) would likely be affected. And that's before we even start getting into the newer fields, like astrobiology.

  • by PD ( 9577 )
    It's not like a supernova makes a very practical weapon...
  • Pizza and Picard... (Score:2, Interesting)

    by PSaltyDS ( 467134 )
    Mr. Rees obviously ate too much pizza before falling asleep during the Star Trek Marathon. With a little pulling out of context, imagine CDR Data saying these things:

    "...micro- robots that could reproduce out of control..."

    "It could form a black hole -- an object with such immense gravitational pull that nothing could escape, not even light -- which would suck in everything around it."

    "The quark particles might form a very compressed object called a strangelet, far smaller than a single atom, that c
    • by kilonad ( 157396 ) on Tuesday April 15, 2003 @07:34PM (#5740551)
      "It could form a black hole -- an object with such immense gravitational pull that nothing could escape, not even light -- which would suck in everything around it."

      I realize this isn't from you, it's from the article, but the rest of slashdot needs to realize this.

      Suppose for a moment that you could replace the sun with a black hole of identical mass. Guess what would happen? Nope, we wouldn't get sucked in. It'd get dark, we'd probably be bathed in some pretty nasty radiation, but we'd still have exactly the same orbit.

      Now suppose for a moment that we can warp the laws of physics enough to create an extremely small black hole, on the order of a few grams maybe (more like nano or picograms or smaller if it's in a particle accelerator). It would be a nasty little thing that wouldn't exist very long because there's no way to pump enough energy or matter into it fast enough to sustain it.

      Basically, it only has "such immense gravitational pull" within its event horizon, and you need at least a couple solar masses to make a black hole. Last time I checked we didn't have that kind of mass just laying around. As for the strangelet, perhaps I don't have the understanding necessary to see how it could "infect" surrounding matter and compress the whole planet into something smaller than a football stadium. I mean it's not like it's SARS or anything. It's like he's saying "let's take the craziest, kookiest possibilities quantum physics has come up with, and assume they all happen in the worst possible way, etc."

      Sixty years ago they were afraid that testing an atomic bomb might rip the entire planet apart, but went ahead with it anyway. They were some pretty smart people. Let's follow their lead.
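
For what it's worth, the parent's intuition can be checked on a napkin with the standard Schwarzschild-radius and Hawking-lifetime formulas (my arithmetic with textbook constants; the numeric coefficients are order-of-magnitude):

```python
# Back-of-the-envelope black hole numbers: r_s = 2GM/c^2, and the
# Hawking evaporation time t ~ 5120*pi*G^2*M^3 / (hbar*c^4).

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.0546e-34  # reduced Planck constant, J*s

def schwarzschild_radius(m_kg: float) -> float:
    return 2.0 * G * m_kg / c**2

def hawking_lifetime(m_kg: float) -> float:
    return 5120.0 * math.pi * G**2 * m_kg**3 / (hbar * c**4)

for label, m in (("one gram", 1e-3), ("one solar mass", 1.989e30)):
    print(f"{label}: r_s = {schwarzschild_radius(m):.1e} m, "
          f"t_evap = {hawking_lifetime(m):.1e} s")

# one gram: r_s ~ 1e-30 m, evaporates in ~1e-25 s.
# one solar mass: r_s ~ 3e3 m, lifetime ~ 7e74 s (~1e67 years).
```

A gram-scale hole is far smaller than a proton and gone in a flash, so an accelerator-made hole could not feed fast enough to grow, consistent with the parent's point.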
  • I'm going to go out on a limb and say this is a good idea. I don't think most of us would question whether emerging fields such as biotechnology could open up a pretty nasty Pandora's box for people with bad intentions. No one wants the small group of unbalanced bad guys to be making super-smallpox in clandestine labs located in some state that would be unable or unwilling to counter them.

    There will be some control over technology like this, and we all should want there to be. Technology will make it
  • Sounds to me more like someone who is living in perpetual fear of tomorrow.

    It is the nature of the human spirit to explore the unknown, and it shocks me that someone supposedly recognized as a scientist would want to suppress that spirit in any way. I can only conclude that he's forgotten why he became a scientist in the first place, but until he remembers, he probably needs a sabbatical.

  • then he'll be singing a different tune.


    "What? How did they discover Conscription!?? I specifically repressed research in the area!!"


    Civilization takes no prisoners.

  • I can't believe anyone in the scientific community would ever consider the issue of self-censorship. Knowledge is supposed to be used for the betterment of humankind; even dangerous military technology has been used for the benefit of all (think nuclear power). Would any of us really be so naive as to believe that if someone in the biotech industry had developed a great genetic code for new amazing eyes that would let us see in the infra-red spectrum and with amazing accuracy and clarity, that the code for
  • by Anonymous Coward on Tuesday April 15, 2003 @05:07PM (#5739628)
    Words to consider before this head-long rush into self-censorship:

    In Germany I first came for the Communists, and I didn't speak up because I wasn't a Communist.
    Then I came for the Jews, and I didn't speak up because I wasn't a Jew.
    Then I came for the trade unionists, and I didn't speak up because I wasn't a trade unionist.
    Then I came for the Catholics, and I didn't speak up because I was a Protestant.
    Then I came for me - and by that time I was the only one left in the room.
  • After all, there's nothing stopping a scientist from being a terrorist. Censorship only serves to hide things from the general public.

    If this kind of thinking had been around before, we'd still be living in the stone age! No radio, microchips, nothing! No slashdot even!

    hmm... :)
  • The idea that humans would ever be capable of doing experiments capable of destroying the universe is laughable. Supermassive black holes have been 'experimenting' with energy ranges and densities that we can never hope to achieve. Furthermore, in an infinite universe like ours, some alien civilization somewhere already would have destroyed it were it possible. Small black holes made by scientists will solve the problem of their own existence by evaporating. The dangers of nanotechnology have also been o
  • dense (Score:3, Interesting)

    by fence ( 70444 ) on Tuesday April 15, 2003 @05:14PM (#5739684) Homepage
    I particularly enjoyed this one:

    -- The quark particles might form a very compressed object called a strangelet, "far smaller than a single atom," that could "infect" surrounding matter and "transform the entire planet Earth into an inert hyperdense sphere about 100 meters across."

    Just when I thought that my cow-workers couldn't get any denser...
  • I love physics, and my fellow physics people, but why do so many physicists start saying really dumb things when they get old? No, Dr. Rees, we're not going to get together and decide to halt exploration because of the potential negative consequences.
  • There are a lot of highly moderated posts making the point that if we don't research this, then somebody else will.

    Well, if you RTFA, you will see that it is not so much concerned about technologies that can be used as powerful weapons, etc. Rather, it cites experiments that are in themselves so dangerous that they could destroy the planet, by for example creating black holes or nano-machines that replicate out of control.

    If these were feasible (personally I doubt it) then the conclusions drawn make a lo
  • Chicken little? (Score:3, Interesting)

    by retro128 ( 318602 ) on Tuesday April 15, 2003 @05:57PM (#5739970)
    -- It could form a black hole -- an object with such immense gravitational pull that nothing could escape, not even light -- which would "suck in everything around it."

    -- The quark particles might form a very compressed object called a strangelet, "far smaller than a single atom," that could "infect" surrounding matter and "transform the entire planet Earth into an inert hyperdense sphere about 100 meters across."

    -- Space itself, an invisible froth of subatomic forces and short-lived particles, might undergo a "phase transition" like water molecules that freeze into ice. Such an event could "rip the fabric of space itself. The boundary of the new-style vacuum would spread like an expanding bubble," devouring Earth and, eventually, the entire universe beyond it.

    I remember that experiment. I am thinking that if the universe were that unstable, it would have been destroyed long ago. And the idea that that experiment could create a black hole is preposterous... Let's not forget what a black hole is - a huge amount of matter (generally from a very large collapsed star) compressed into a very small amount of space. In actuality it has no more or less gravity than the original star (although as time goes on, anything the black hole "sucks in" gets added to its total mass). I'm going to guess that it takes more than a few heavy atoms from a piddly experiment to form one.

    As for the nanotech fears...Cowering in ignorance won't solve any problems. The last thing we need is the Good Guys thinking nanotech is bad and blacklisting it, while the Bad Guys are developing all kinds of nifty nanotech weapons.

    It's kind of the same thing as the government and corporations locking up the white hats who warn them about security flaws while the black hats are cracking the shit out of anything they want with impunity. It seems in their eyes white hats are nothing more than black hats who have confessed.

  • by DavidBrown ( 177261 ) on Tuesday April 15, 2003 @06:32PM (#5740194) Journal
    There is a certain amount of sense to the idea of restricting scientific research. Larry Niven's early Tales of Known Space character "Gil Hamilton" worked for a UN agency called the A.R.M. that, amongst other things, suppressed scientific research - keeping the results for themselves in case "secret weapons" were needed in the future.

    It's an interesting philosophical question that has been around for a very long time. On one hand, the Catholic Church suppressed Galileo. Nobel invented dynamite, and as a result a lot of people died.

    On the other hand, information about nuclear physics and the technology to build nuclear reactors (good) and nuclear weapons (bad) has been suppressed, with limited success, by those countries already in the Nuclear Club. As a result, so far, the terrorists have not yet (we hope) obtained nuclear weapons. September 11th could have been much, much worse if Al Quaeda had the "Islamic Bomb".

    In fact, the ARM reminds me of the efforts of the US Government in suppressing cryptographic technology - classifying it as weaponry. And I can't say that the US is wrong. US efforts in breaking the Japanese codes were as responsible for the US victory at Midway as the Navy pilots themselves.

    Yes, information wants to be free. So do children, but only irresponsible parents allow their children to run about unattended.

    However, I feel that attempts to self-censor or otherwise suppress scientific research are doomed to failure. Information still wants to be free, and anyone who has ever watched "Connections" knows that science doesn't take logical paths - any innovation, however innocent, can result in something very very dangerous.

  • by Kaz Riprock ( 590115 ) on Tuesday April 15, 2003 @07:36PM (#5740571)

    While some of you may consider this view to be off-the-wall and not in accordance with "science", others in the field see it as a reasonable approach to take. None of the articles or lectures I've ever seen that propose we limit certain areas of our research has said we *won't* examine the unknown.

    This reasoning isn't wholly unfounded either. Imagine, if you will, the inventor of Kevlar strapping a bulletproof vest to his chest without adequate knowledge of its strength, telling his assistant to fire at point-blank range... and dying. My guess is instead they used a straw dummy and analyzed the problems that arose when the bullet penetrated it the first few times. We need that proverbial dummy in a lot of the aspects of biotechnology we're currently working on.

    Imagine a virus that is capable of adapting in such a way as to avoid the human immune system, in order to make germline changes so your children are not prone to an inheritable disease that you and your spouse would have passed on. Now imagine that it accidentally recombines with a flu viral genome that was also working its way through your body at the time of injection, and propagates as an unknown disease agent. Not so implausible, given the latest news of the day.

    Researchers in the 1970s instituted a moratorium on work with recombinant DNA until other methods and work had been done to better understand the implications of what we were working with at the time. This is no different. Just because you *can* do something doesn't mean you necessarily should. There was an interesting talk by Dr. George Annas (a BioLaw professor at Boston University) at a recent conference entitled "The Future of Human Nature". Wired will be putting out an article on it. I'll try and get it submitted here on /., but in the meantime, if you're interested, keep your eyes open for it.

    In Dr. Annas' talk, he describes the need for a similar moratorium on germline meddling and what he describes as "species-altering methods". Now, he was looking at 50-200 years in the future, but the idea that we might want to figure out how best to balance our ability to develop new and interesting things against the realization that we're not always sure of the outcome is still valid.

    The closer we come to altering our own species, the worse the "oops" factor becomes. It's not crazy, it's an attempt at foresight...since hindsight could be far more costly with the types of things we are dealing with.
  • by ggwood ( 70369 ) on Tuesday April 15, 2003 @07:55PM (#5740690) Homepage Journal
    Perhaps that is the origin of gamma ray bursts: civilizations turning their planets into 100 meter diameter spheres with really powerful particle accelerators.

    Sure it's massively unlikely, but it would explain why SETI hasn't heard anything yet.

    Imagine if the first signal we decode is: "don't build a particle accelerator larger than 5 kilometers in diameter or you will destroy your whole world."
  • by jesterzog ( 189797 ) on Tuesday April 15, 2003 @08:18PM (#5740815) Journal

    So far there seem to have been a lot of replies complaining that it's silly to abandon research of dangerous topics, because if it's ignored then someone much worse will discover it first. I agree with this almost completely, but I think there are also times when restraint makes sense: once a certain threshold has been reached, making things worse gains no strategic advantage.

    The one I was thinking of was thermonuclear war. Before he died in 1996, Carl Sagan argued in The Demon-Haunted World [amazon.com] (and probably other places) that the development of the Hydrogen Bomb by the US was strategically pointless, because it didn't accomplish or deter anything that couldn't already be accomplished or deterred by existing nuclear weapons. On the other hand, instead of simply destroying an enemy, a thermonuclear war would induce a nuclear winter and wipe out most of the world. Furthermore, there wasn't any intelligence that the USSR was developing it, nor that they would have if the USA hadn't started.

    Apart from that I'm not familiar with the whole situation, so I won't go into it further. But I don't think the argument that it's necessary to research ultra-dangerous topics before an enemy does holds up all the time -- especially when the only advance from existing technology is that it leads to a lose-lose scenario instead of a win-lose scenario.

  • by lonevoice ( 666290 ) on Tuesday April 15, 2003 @09:56PM (#5741375)
    Not to draw flame on myself, but...
    For once, there's virtually no rational comment on the article (at least among the top-modded ones).

    The point of Rees's statement is not that we must beware of developing a horribly powerful weapon. The point is that in the course of regular experimentation a horrible tragedy can occur. It is not that the US must develop the BHM (black hole missile (tm)) before Syria, cause then they'll destroy the world, cause after all, they're bad guys that have black hate in their veins. The danger is that the black hole can happen *accidentally*. Thus, the argument "better us than them" is pointless, and it is in no way rescued by the fact that our refraining doesn't prevent others from destroying the world.

    How real are the dangers of accidentally destroying the universe? If a top British physicist says they're real, I believe him. Virile nanobots? Probably not, but it's just an example, really.

    Can self-censure achieve the desired goal? To some extent, you bet. No "underground organization" is going to build a particle accelerator for high-energy physics. This stuff doesn't appear out of thin air; it takes BIG BUCKS. True, some doomsday methods are within easier reach (bio-weapons in particular), but at least some of the more dangerous experiments can be avoided.

    I repeat, "let's make a black hole before they do!" does not make sense/is not applicable.
    Rationality shouldn't be abandoned, even in science. The hope may be faint (I think his 50-50 prognosis is optimistic) but that's no reason not to try, or to disparage the messenger.
  • not news (Score:3, Interesting)

    by constantnormal ( 512494 ) on Wednesday April 16, 2003 @12:51AM (#5742074)
    These kinds of fears have been around for a while. When the first hydrogen bomb was exploded at the Bikini Atoll, there was some concern that the level of deuterium in sea water was sufficient to sustain a fusion reaction in the oceans.

    Calculations showed otherwise, and things proceeded as expected. (Note: this may be apocryphal, as I can find no Google reference to it and can't remember where I came across it -- but it makes the point as well as anything.)

    Just imagine if the theories or calculations had been inadequate to predict the results. Then look across the expanse of scientific history, and see how much of scientific knowledge has sprung from unexpected or unforeseen results.

    All the author is saying is that the price of poker has gone up, and as we continue to push back the frontiers of ignorance, it's pretty much inevitable that we're going to step in something really ugly sooner or later. And with the capabilities humanity is poking at with sticks, the consequences of a major oops/surprise in a number of fields (high-energy physics, genetic tinkering/biowar, nanotech) are generally at least planet-wide in scope.

    For the concerns involving alterations in the fabric of space-time or nature of reality, even off-world laboratories may offer insufficient protection.

    Risk assessment is a very poorly understood discipline, easily corrupted by those who want to attain the goal and can't conceive of making a mistake. Look at how easily the NASA bureaucrats rationalize away the risks of the shuttle -- check out Feynman's appendix to the Challenger failure analysis report [fotuva.org] for some insight, and marvel at how his back-of-the-envelope calculation of 1:100 catastrophic failure rate still holds true today, and NASA management is still oblivious to the point he was trying to make.

  • by Rogerborg ( 306625 ) on Wednesday April 16, 2003 @07:52AM (#5743120) Homepage
    Q: So, what have you achieved this month, loyal peon?
    A: Marvellous, wonderful things. But for the good of humanity, I destroyed all my research.

    Wonder how long I could get away with that?

"Pull the trigger and you're garbage." -- Lady Blue

Working...