Top Physicist Advocates Scientific Self-Censorship 365
spamania writes "The San Francisco Chronicle is running this article about a new book by Britain's astronomer royal, Sir Martin Rees, that advocates restricting scientific research in certain fields in the interest of public safety. In "Our Final Hour", Rees lends a sober, respectable voice to the oft-irrational ranting about nanotech, biotech, and other fields."
Don't restrict, classify (Score:5, Interesting)
Re:Don't restrict, classify (Score:2, Insightful)
Re:Latest US Government cover-ups and lies (Score:2)
Hearsay does not make fact. It is much better to cite sources than just say "my girlfriend read..."
Anyway, for those interested, here are some Agent Orange links (no claims as to the credibility of these):
http://home.att.net/~vetcenter/ao-nonew.htm [att.net]
http://www.cbc.ca/national/magazine/orange/ [www.cbc.ca]
http://www.cnn.com/WORLD/9604/13/agent_orange/ [cnn.com]
Re:Latest US Government cover-ups and lies (Score:2)
I'd give you the sources, but for the following facts:
1. It's 23.15 here, and she's safely tucked up in bed.
2. She has a cold and a sore throat, so that's the best place for her to be.
3. Given 1 and 2, I'm not waking her up just to ask her where she read something.
4. Sometimes, people can go do their own damn research (but thanks for providing them with a starting point).
Re:Latest US Government cover-ups and lies (Score:5, Insightful)
If memory serves, Agent Orange was a defoliant and not a chemical weapon. It's kind of like complaining about the Orkin man using chemical weapons: technically true but not really what is meant by a chemical weapon. Sure, there were probably people in the jungles that were defoliated, but it's not anything like dropping a nice efficient nerve agent.
I'm really curious about how long it's going to take people to accuse the US military of chemical warfare because so many people are dying of lead poisoning.
Re:Latest US Government cover-ups and lies (Score:5, Interesting)
We had a lot of "innovative" weaponry in that era, like the Agents and a personal favorite of mine, antipersonnel mines loaded with slow-burning phosphorus/magnesium pellets instead of steel shrapnel. There were reports of the wounds of victims, who could take days to die, glowing sickly in the night.
Lovely stuff.
Re:Latest US Government cover-ups and lies (Score:3, Insightful)
Re:Latest US Government cover-ups and lies (Score:3, Interesting)
Agent White is a narrowleaf herbicide (I'm not sure of the commercial name), but it kills grasses such as bamboo and is an organic alcohol.
The toxicity of both agents is extremely low, but the manufacturer does recommend standard industrial hygiene measures when handling either herbicide. So of course, because both were being used in the same area, somebody had the bright idea to react them into a single compound by using a
Re:Latest US Government cover-ups and lies (Score:3, Informative)
The claim by the grandparent post was that it was the US that sold Saddam most of his chemical stockpile. Your references don't support that claim.
Re:Latest US Government cover-ups and lies (Score:4, Informative)
Re:Don't restrict, classify (Score:2, Insightful)
Yes, because we all know that nobody would ever leak classified research to a foreign government.
Re:Don't restrict, classify (Score:2)
Yes, leaks occur. However, if your enemies get their research by stealing it from you, then you are guaranteed to A) have the technology first, and B) understand it better.
That's a good sight better than a sharp stick in the eye.
Re:Don't restrict, classify (Score:2)
Actually, in the past, this was sometimes a good thing. It helped to keep things in balance and avoided the kind of unilateralism we see right now. Mutually assured destruction of two superpowers is indeed frightening, but it naturally restricts what one side can do to piss off the other.
In addition, we shouldn't dismiss the possibility that classified research is disclosed to allied foreign scientists on p
Re:Don't restrict, classify (Score:2, Interesting)
The words "Security through Obscurity" came to mind when I read your comment... but with scientific discoveries, wouldn't it be "Security through Ignorance, until we discover it"?
Re:Don't restrict, classify (Score:5, Insightful)
As for the "some experiments could destroy the earth" bit (really just a variant on There Are Things Man Was Not Meant To Know), IMO Rees is doing the typical crotchety-old-scientist act. An awful lot of scientists who do brilliant work when they're younger seem to adopt an attitude of "Well, the search for knowledge was all well and good in my day, but you kids these days
I can't think of a single area of research in which the benefits of aggressive experimentation and open reporting don't outweigh the risks. Not a single one. Biotech, nanotech, high-energy physics
Re:Don't restrict, classify (Score:2)
Re:Don't restrict, classify (Score:2)
I think the point was that certain experiments are so dangerous that they could destroy the whole planet if they go wrong.
Then it does not matter if it is classified; instead we should try to limit such research on a global scale.
Of course, one can have plenty of objections to the notion that some experiments may destroy the planet, but if you buy that argument, then the co
There's nothing quite like RTFA... (Score:5, Informative)
Hey, if you read the article then you would have understood Sir Martin Rees's reasons for recommending self-censorship. Here's a sample paragraph:
"Some experiments could conceivably threaten the entire Earth," he writes. "How close to zero should the claimed risk be before such experiments are sanctioned?"
He isn't talking about research that has potentially dangerous applications if it falls into the "wrong" hands, he's talking about potentially dangerous experiments. The kind of experiments where something going wrong could, say, create a miniature black hole and thus destroy the planet.
When you're talking about an experiment going that wrong then you don't really give a damn who's performing it, "them" or "us".
Re:There's nothing quite like RTFA... (Score:5, Insightful)
>
> "Some experiments could conceivably threaten the entire Earth," he writes. "How close to zero should the claimed risk be before such experiments are sanctioned?"
>
> He isn't talking about research that has potentially dangerous applications if it falls into the "wrong" hands, he's talking about potentially dangerous experiments. The kind of experiments where something going wrong could, say, create a miniature black hole and thus destroy the planet.
>
> When you're talking about an experiment going that wrong then you don't really give a damn who's performing it, "them" or "us".
Hey, if you look at cave paintings then you would grok Shaman Roa's big think for banish Caveman Og:
"Og's big fire think scary. Fire could burninate entire grassland where tribe hunt all meat things", Roa speak. "Fire come from Gods, not tribe! Roa know Gods, Roa eat happy mushrooms, talk to Gods every day! Og not talk to Gods, he too busy with fire think. Roa not want Og make Gods angry with two stick rubbing thing! Tell Og put sticks down!"
Og's fire think not scary-but-good because fire keep tigers away at night. What if Gods angry, make Og drop fire? Og burninate all grass! No grass, no antelope, no fruit! Whole world burninate! Like three rainy season ago when Gods sent fire from sky, burninated grassland! Half of tribe starve!
Og's fire think bad. Roa know! If Og not care what Roa think, Shaman Roa say send Og away forever!
Re:Don't restrict, classify (Score:2, Flamebait)
And it leaves you *dangerously* behind when these other nations are "bad guys". Not only will you not have comparable weapons as a deterrent to the nano-tech gene-mutation bomb that spontaneously turns you into a Communist, you will also not have any defenses against it. This is why the West still has plagues and poxes in a bottle, despite the wishes of the rainbow-flag brig
Re:Don't restrict, classify (Score:5, Interesting)
Restricting dangerous experiments to safe locations would. It seems to me that the professor is making a strong argument for serious space colonization, for two reasons:
1) Doing some classes of nasty experiments on, say, Neptune would greatly reduce the consequences of out-of-control experiments (e.g. nanobots and grey goo)
2) If the professor is right that we only have a 50-50 chance of not destroying the earth in the "near" future, having a self-sufficient backup colony or six would be prudent.
Technology (Score:5, Insightful)
Re:Technology (Score:2)
Re:Technology (Score:3, Insightful)
technology has potential to annihilate ... as well as the potential to protect us from annihilation.
It is circular to argue that a technology that can annihilate us can also be used to protect us from the annihilation that the technology causes.
And what if you can't tell which technology is which?
Re:Technology (Score:4, Insightful)
He didn't say to protect from annihilation caused by technology. It could protect us from a deadly natural disease, or a deadly meteor strike, or, in the long term, the deadly destruction of the Sun. If we stopped developing technology, as many technophobes would like, the human race is doomed. Our sun will eventually change so that human life can't survive on earth. If we don't develop the tech to colonize other solar systems, we are all doomed.
Re:Technology (Score:3, Insightful)
Should we also stop research on cures for diseases just because we're afraid that viruses might escape from medical labs?
Pandora's Box. (Score:5, Insightful)
I can see it now: "If nanotechnology is outlawed, only outlaws will have nanotechnology!"
Facetious, but nevertheless relevant.
RE: Pandora's Box (Score:2)
Yes, but the real question is: will it be financially feasible for anyone but a first world country or research institution to do it?
I'm sure Al Qaida would love to develop nanobots that proceed to liquefy any person who has Anglo-Saxon genes, but scanning tunneling microscopes and other equipment cost money, dude.
When things get scary is when somebody produces some body of knowledge that allows a weapon to be reproduced for malicious
Re:Pandora's Box. (Score:2)
For fun, throw corporations funding and directing the respectable, civilized scientific research, into your mix, and then watch as we laugh, as we burn, in an all-consuming sub-atomic fire that rends our space-time continuum asunder! :D
You mean... (Score:2)
I thought we already covered all of this a long, long time ago.
Re:You mean... (Score:2, Insightful)
And who decides what should or should not be done?
I believe it has always been in the interest of man to protect against the effects of technology but not against the pursuit of technology.
I.e., we can outlaw chemical weapons and biological weapons because we know they are freakin' dangerous... but don't outlaw stem cell research, or technology whose effects we do not fully comprehend, just because we are scared of the "possible" consequences..
Anyways..
Re:You mean... (Score:2)
Only in the sense of "covered" that means "repeatedly hand-waved away with simplistic responses, usually without reading what the person actually has to say".
The possible downsides of technological advances and possible ways of ameliorating them are always worth discussing, but Slashdot is obviously not a good place for that to happen. Anyone got any pointers to places where real discussions like this can happen?
Re:You mean... (Score:2)
Sure, you can discuss them on Slashdot. Just come up with some more innovative points to discuss, such as nanotechnology health concerns from a disinterested or non-biased source. If you want a discussion beating a dead horse, Slashdot is also a plac
Re:You mean... (Score:2)
What most people twig to is this: if someone else can do it, and has a good chance of using the result to beat you to death, you should learn how to do it too.
Re nanobots, check out The Diamond Age for a few nasty ideas (cookie cutters, torture bots that go straight for the nervous system, plagues of bots that attack specific racial groups, etc). If you don't know nanotech, you can't engineer your own hunter-killer bots to neutralise the opposing ones. This would be a Bad Thing
Good guys or bad guys? (Score:5, Insightful)
Not all scientists will self-censor, nor are all scientists working toward the greater good. Sometimes it's not their choice (see: Germany, 1940, and Iraq, 1988) to censor themselves.
Self-censorship? (Score:2, Offtopic)
Rabbit and the hare (Score:5, Insightful)
Re:Rabbit and the hare (Score:5, Insightful)
The more I think about it, the more I think that the only solution is a political one. Let me explain...
These days, our (or, at least, my) biggest WMD worry isn't about countries with nukes or countries with nerve agents... it's about individuals with them. There are too many people to keep track of, and the technology is becoming more and more accessible to individuals. The only way to keep them from actually using them in some act of terrorism is to keep them from wanting to.
Terrorism is often an option of last resort. I'm sure that Palestinian suicide bombers would prefer it if they could just make a compelling verbal argument for their cause and actually be listened to. It sure would save all the hassle of getting fitted for a torso-bomb. The problem, of course, is that they don't feel like anyone's really listening to them when they try any of the less-drastic-than-suicide-bombing methods of communication.
So, I think the only way to prevent acts of terrorism is to have everyone in the world feel that, for the most part, they are being listened to... that their needs aren't being ignored. Now, I'm not saying that this is necessarily easy to do. I do feel, however, that individual acts of terrorism (whether it is some postal worker going berserk with a firearm or some dude mailing anthrax to people in Washington D.C....) are going to steadily increase until people stop feeling like they're being treated like cattle....
Who restricts? (Score:4, Interesting)
Not censorship (Score:5, Insightful)
However, it's very hard to decide which avenues of research should be avoided. Biotechnology, nanotechnology, and the like promise great benefits, potentially helping us progress socially much faster (eliminating hunger and disease wouldn't do us much harm socially, would it?). The only ones that should be avoided are clear-cut cases like nerve agents, genetic creation of deadly diseases, and so on. Otherwise, it makes little sense to restrain research in other directions...
Daniel
Re:Not censorship (Score:3, Interesting)
The sticky issue there is that you cannot scientifically classify anything as 'clear-cut'. It's never that black and white.
Personally, I think the opposite should happen. The more that's known about artificially created deadly diseases, for example, the more that's known about how to identify and cure them.
Re:Not censorship (Score:5, Interesting)
Biotech = bioweapons
Nanotech = nanoweapons
Nerve Agents = tranquilizers, stasis chambers
Creation of deadly disease = preemptively improving the immune system
What you consider good can be used for bad, and vice versa. If I truly understand how the immune system works and want to extend and improve it to benefit mankind, I also have the knowledge of how to kill, by avoiding all its detection mechanisms, attack mechanisms, and defense mechanisms, and by exploiting its flaws and weaknesses. All I'd have to know to go from vaccine to plague is how to make a replication method (e.g. by air/touch), which is trivial by comparison.
Kjella
Re:Not censorship (Score:5, Interesting)
I agree biotechnology and nanotechnology are certainly going to proceed, and we should fund them. It's just that certain high-energy physics experiments should probably be thought about very carefully.
And that is the area in which Rees is most knowledgeable: astro and particle physics (they interrelate a lot - note he is an astrophysicist, and this kind of inquiry would not affect his field directly). I doubt he is as much of an expert on nanotech, but he included it somewhere at the end of his book as another place for inquiry.
Yes, the odds of disaster are really slim. Rees is asking, how far from zero should the odds be before we stop research? One in a million? One in a billion? What if there are (say) a million different permutations of the experiment, any of which could trigger the event?
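Rees's "how far from zero" question can be made concrete with a two-line calculation: if each independent run carries a tiny disaster probability p, the chance that at least one of n runs goes wrong is 1 - (1 - p)^n. The per-run probability and permutation count below are invented purely for illustration:

```python
# Toy illustration of how small per-run risks compound across many
# independent runs (or permutations) of an experiment.

def cumulative_risk(p: float, n: int) -> float:
    """Probability of at least one disaster in n independent trials,
    each with per-trial disaster probability p."""
    return 1.0 - (1.0 - p) ** n

# A one-in-a-billion risk per run looks negligible...
print(cumulative_risk(1e-9, 1))          # ~1e-09
# ...but a million permutations of the experiment add up:
print(cumulative_risk(1e-9, 1_000_000))  # ~0.001, i.e. one in a thousand
```

This is why the number of permutations matters as much as the per-experiment odds in his framing.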
It is pretty obvious to me that we should be thinking about these things and asking questions like: don't these particles collide all the time in nature (say, in the Sun or near black holes)? And if the answer is yes, is there a signal we could look for?
I'm sure people are already doing some back-of-the-envelope calculations, but trying to get funding for this kind of work, as the above post so clearly indicates, is going to be a tough sell to parts of the public - even the /. crowd, who in general would be rather supportive of scientific funding.
Contrast with an earlier /. story... (Score:5, Insightful)
TheFrood
Censorship == Myopia (Score:3, Insightful)
Information longs to be free, and technology inherently desires improvement. If we don't allow certain scientific research, then this research will simply move to other countries, and the United States and its citizens will lose the opportunity to shape the methodologies and goals of this research.
Perhaps a self-censorship system moderated by an international panel would work nicely, but it is utter foolishness (IMO) to let public opinion blindly dictate the direction of science. Enhancing the lives of the common citizen should always be the primary goal of science (IMO), but that doesn't mean that the public always/ever knows what is best.
I guess this is a step in the right direction. Have the most skilled in those fields moderate themselves. Sure, but I cringe whenever I see the words "censor" and "science" together in a sentence.
if this sort of 'logic' had prevailed... (Score:5, Insightful)
...we would still be living in caves. Seriously, because some things may lead to something which could be warped to 'bad' uses, we should halt the progress of science?
Knowledge on its own cannot be defined as 'good' or 'bad' - it just is. It is what we use the knowledge for that can be judged on a moral level. And what some people consider to be a 'good' use, other people may see as a 'bad' or even 'evil' use of the knowledge.
Re:if this sort of 'logic' had prevailed... (Score:2)
Re:if this sort of 'logic' had prevailed... (Score:3, Funny)
Science is often dangerous (trust me, I've spilled a few drops of 6M HCl on myself), but usually the benefits outweigh the risks. Sure I might create a deadly form of highly virulent, incredibly resistant, pathogen
Re:if this sort of 'logic' had prevailed... (Score:2)
I'm not anti research here, I'm just saying this is something to be very very careful with.
I'm not so sure about you, but I tend to believe that if you're smart enough to do that kind of research in the first place, you're smart enough to take all possible precautions to guard against accidents.
When the first atomic bomb was tested, the scientists who built it were uncertain whether the chain reaction would be contained, or if they might 'set fire to the sky'. They still decided to give it a try... b
Re:if this sort of 'logic' had prevailed... (Score:3, Funny)
Re:if this sort of 'logic' had prevailed... (Score:4, Insightful)
The grey goo nonsense is overblown and a nonissue. If this sort of calamity were possible, then it would have already happened in nature because some form of bacteria would have done it by now. THEY are capable of breaking down rock and other materials into building blocks to replicate themselves. THEY are autonomous and have their own energy supply.
Do a bit more research and you will find solid arguments that deconstruct the grey goo gobbledygook and make it go away.
Nuke research can lead to bombs that can kill most humans and other life on the planet, in theory (though not in practice). But it also leads to medical research that we all depend on. It leads to a nice way to generate power. It leads to deep space probes. It leads to the solving of crystal structures in structural biology. It leads to improvements in materials research.
All that good stuff isn't enough, however, to make up for the fact that you can make a juicy bomb too, so we should never have gone down that path - in which case we wouldn't have any of what I mention above, nor anything close to the understanding of the atomic world that we currently have.
Silence is telling (Score:5, Insightful)
What are the odds that someone of an unethical bent will NOT notice this and continue the research? Burning your papers and abandoning the research might buy some time. But someone will notice the silence.
Science is supposed to be the search for truth (Score:4, Interesting)
There shouldn't be any kind of censorship in this quest for knowledge, and this need to understand. I know I'm sounding like I've mixed philosophy with science, but let's not forget that science is an offshoot of philosophy.
So, just because some knowledge may potentially be dangerous doesn't mean it's knowledge we shouldn't pursue. That's like saying "you shouldn't learn how to use a gun, just because you might use a gun to kill someone!"
Re:Science is supposed to be the search for truth (Score:2)
I'm all for the Search for Truth, but wouldn't it be wise to temper that search with some patience and forethought to avoid destroying our entire civilization (at which point our search for truth would end far, far from the goal)?
Re:Science is supposed to be the search for truth (Score:2)
I believe that's closer to the good professor's argument.
Soko
Re:Science is supposed to be the search for truth (Score:3, Insightful)
Sure, I can handle the truth, but I don't have much use for it after I have been reduced to subatomic particles in the quest to find it.
CoDominium (Score:2)
Yes, some research leads to Bad Things(tm), but in the greater picture, research is a Good Thing(tm).
Information and research should be freely available to anyone. That is how greater discoveries are made (Warp Drive anyone?).
Maybe I see globalism in everything, but... (Score:3, Insightful)
Re:Maybe I see globalism in everything, but... (Score:3, Insightful)
When was the last time the general public was satisfied!?
How about.. (Score:5, Interesting)
Here's some food for thought. If we don't address these grievances, then how can Rees so arrogantly believe that his book is going to make a bit of difference? Does he think that they are incapable of research? Does he think that they are going to say, "Gee, Rees wrote a book, maybe we shouldn't use this technology or do our own research"? It might slow terrorism down, but it's a stupid price to pay. It will only delay the inevitable UNLESS we address the problems rather than dropping bombs. The only thing that his proposal might do is further the police-state mentality that seems to be moving along quite well here in the US. He certainly won't stop terrorism.
Even better solution! (Score:3, Funny)
-
Re:Even better solution! (Score:2)
We wouldn't have to steam all of the fun out of life if we just outlaw *learning*. That's where the real problem is.
If Science is Outlawed (Score:5, Insightful)
Maybe if we did away with the massive inequalities that fuel destructive behavior, we wouldn't need to limit access to knowledge, because no one would have any reason to destroy. There may still be accidents, but limiting access to information because of possible accidents is like the proverbial ostrich sticking its head in the sand to escape detection. Just because the ostrich doesn't see the lion sneaking up on him doesn't mean he isn't about to become lunch.
Oh no, more Grey Goo worries! (Score:5, Insightful)
Sure, it's possible that when nanotechnology gets going, a nanomachine that can convert just about any material to energy and raw materials to copy itself could somehow be accidentally created. It could then convert the entire Earth and everything on it to copies of itself. It's POSSIBLE.
But then again, it's also possible that some species of bacteria could mutate and start doing the same things. And it's probably not any less likely than a nanomachine doing it.
A machine that could convert just about anything on the planet into useful materials, and duplicate itself endlessly, would probably be difficult to make INTENTIONALLY, let alone accidentally. It would also be extremely easy to insert safeguards to prevent anything like that from happening: require the presence of a particular molecule for the machines to duplicate themselves, add replication limits to the nanomachines, or never include self-replication in the same nanomachine as one that can break down most/all things into raw materials.
Unless nanoengineers are incredibly (even maliciously) sloppy, it's not going to happen by accident.
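As a toy illustration of the replication-limit safeguard, here's a minimal sketch: every copy inherits a generation counter, and a bot at the cap refuses to replicate, so one runaway seed can only ever yield a bounded population. All the numbers (generation limit, copies per bot) are invented purely for illustration; this is not a claim about real nanotech.

```python
# Sketch of a "replication limit" safeguard: a hard-wired generation cap
# inherited by every copy bounds the total population from a single seed.

MAX_GENERATIONS = 3   # hypothetical hard-wired cap
COPIES_PER_BOT = 2    # hypothetical fecundity of each bot

def descendants(generation: int = 0) -> int:
    """Total population (including the seed) one bot can ever produce."""
    if generation >= MAX_GENERATIONS:
        return 1  # at the cap: the bot exists but refuses to replicate
    return 1 + COPIES_PER_BOT * descendants(generation + 1)

# Worst case from one seed: 2^(MAX_GENERATIONS+1) - 1 bots,
# not an exponentially exploding grey-goo swarm.
print(descendants())  # 15
```

Without the cap the same recursion never terminates, which is exactly the grey-goo scenario the safeguard is meant to rule out.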
INTENTIONAL creation of such machines is an issue of higher importance. And the type of people who would make such nanomachines are not the type who are going to listen to people saying "we can't research/develop this technology, it might be dangerous". Would a law against using aircraft for suicidal terrorism have stopped Al Queda from taking down the WTC? Nope.
The best chance at preventing/defending against such actions is to develop the technology and focus some research on using it to prevent such uses. Not saying "stop all research!"
Now, I would be enormously in favor of a global treaty banning research into nanotechnological weapons. The thought of militaries working with such technologies does scare me.
I have to point out: (Score:2)
Now are you concerned? You hadn't realized what you were facing before, when you thought it was grey goo. I hope I've made my point.
Not the answer. (Score:3, Insightful)
What that actually means is, "Since we actually have the kind of restraint not to use this stuff, let's let someone with less restraint come up with it first."
When Einstein gave the US his aid in building an atomic weapon he did it on the principle that someone would discover it, and that it was MUCH better that it be us, than the Nazis. It's much better that we know, and can prepare, than it is for us to be caught flat footed by something so awful we didn't even let ourselves think about it.
Wrong (Score:2)
Re:Not the answer. (Score:5, Interesting)
It's wrong to say that Einstein worked on the bomb. His only involvement (as pointed out already) was writing a letter to Roosevelt, which got dismissed. Einstein at the time was not liked, because of his roots. He was virtually exiled to the United States, because England didn't want him.
Also, the reason why the Germans didn't have a nuclear bomb is that the allied forces destroyed (after a first failed mission) the heavy water factory in Switzerland (I think it was in Switzerland, not 100% sure) that was fundamental to the bomb design. Hindenberg was also much further along than the Allies, by years. The reason why Hindenberg was so slow in his development is that he was a practical physicist, not a theoretical one, and thereby couldn't construct the most efficient shape for a sustained reaction.
Hindenberg's devices failed to reach critical mass, but they were very close, and had the Allied forces not resorted to sabotage, would have achieved it long before the Allies did.
The reason why Einstein wrote that letter is because he knew, logically, the Germans were developing the technology.
I think that the moral of the story is develop the technology first, as soon as you can, then create policy after realizing nobody should have that power. You can never know who is developing what, so it's better to develop everything.
The Arms Race is constantly ongoing, so is the Space Race, and all that jazz.
As I mentioned earlier in the thread, this boils down to, "Just because you can, doesn't mean you should." In regards to science, you always should, so you can protect yourself if someone else does.
Re:Not the answer. (Score:3, Informative)
The facility you refer to was in Norway [ehistory.com]; Switzerland was neutral in WWII.
Re:Hindenberg (Score:4, Funny)
Other historians have also suggested that his name may have been "Heisenberg".
Heisenberg? Are you sure? (Score:3, Funny)
There seems to be some, umm, what's the word,... uncertainty over the gentleman's name.
or vice-versa (Score:5, Interesting)
my problem with the point of view being taken by this prominent scientist is that he views all scientific propositions as risky, and that there should be some generally agreed-upon allowable risk threshold that any experiment should be considered against before it is carried out. The unfortunate thing about this point of view is that it doesn't take into account the potential benefits that could come out of it. Nano-bots destroying cancerous cells would truly make the fact that we live longer and longer much more worthwhile, if those extra years are cancer-free, in my opinion. It is probably more worthwhile than creating black holes on earth, even though the risks might be somewhere in the same range of dangerousness.
my second point, the nihilist one, is in regards to the 'gray goo' that nanotech could turn the planet into. could I stipulate that some sort of evolution could continue, but with the nanobots as its basis instead of carbon-based cellular processes. just a thought.
A Quote (Score:2)
-Björk
Astronomer. Figures (Score:5, Insightful)
Re:Astronomer. Figures (Score:5, Insightful)
Stopping research in high energy physics would cripple research projects dealing with supernovae, cosmology, supermassive black holes, even cosmic ray research (and its effect on star formation) would likely be affected. And that's before we even start getting into the newer fields, like astrobiology.
Easy for him to say (Score:2, Funny)
Pizza and Picard... (Score:2, Interesting)
"...micro- robots that could reproduce out of control..."
"It could form a black hole -- an object with such immense gravitational pull that nothing could escape, not even light -- which would suck in everything around it."
"The quark particles might form a very compressed object called a strangelet, far smaller than a single atom, that c
time to debunk the black hole myth again (Score:4, Interesting)
I realize this isn't from you, it's from the article, but the rest of slashdot needs to realize this.
Suppose for a moment that you could replace the sun with a black hole of identical mass. Guess what would happen? Nope, we wouldn't get sucked in. It'd get dark, we'd probably be bathed in some pretty nasty radiation, but we'd still have exactly the same orbit.
Now suppose for a moment that we can warp the laws of physics enough to create an extremely small black hole, on the order of a few grams maybe (more like nano or picograms or smaller if it's in a particle accelerator). It would be a nasty little thing that wouldn't exist very long because there's no way to pump enough energy or matter into it fast enough to sustain it.
Basically, it only has "such immense gravitational pull" within its event horizon, and you need at least a couple of solar masses to make a black hole. Last time I checked we didn't have that kind of mass just lying around. As for the strangelet, perhaps I don't have the understanding necessary to see how it could "infect" surrounding matter and compress the whole planet into something smaller than a football stadium. I mean it's not like it's SARS or anything. It's like he's saying "let's take the craziest, kookiest possibilities quantum physics has come up with, and assume they all happen in the worst possible way, etc."
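To put rough numbers behind this, here's a back-of-the-envelope sketch using the textbook Schwarzschild radius r = 2GM/c^2 and Hawking's (theoretical, semiclassical) evaporation-time formula t = 5120*pi*G^2*M^3/(hbar*c^4); the one-gram example mass is arbitrary:

```python
import math

# Standard SI constants (rounded CODATA values).
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

def schwarzschild_radius(mass_kg: float) -> float:
    """Event-horizon radius of a black hole of the given mass."""
    return 2 * G * mass_kg / c**2

def evaporation_time(mass_kg: float) -> float:
    """Hawking evaporation time (semiclassical estimate), in seconds."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)

m = 1e-3  # a hypothetical one-gram black hole
print(schwarzschild_radius(m))  # ~1.5e-30 m: far smaller than a proton
print(evaporation_time(m))      # ~8e-26 s: gone before it can eat anything
```

So even granting that an accelerator could make one, a lab-scale black hole would have an event horizon vastly smaller than an atomic nucleus and would radiate itself away almost instantly, which is the poster's point.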
Sixty years ago they were afraid that testing an atomic bomb might rip the entire planet apart, but went ahead with it anyway. They were some pretty smart people. Let's follow their lead.
Self-Censorship or Government Censorship (Score:2, Insightful)
There will be some control over technology like this, and we all should want there to be. Technology will make it
This guy is a scientist??? (Score:2, Insightful)
It is the nature of the human spirit to explore the unknown, and it shocks me that someone supposedly recognized as a scientist would want to suppress that spirit in any way. I can only conclude that he's forgotten why he became a scientist in the first place, but until he remembers, he probably needs a sabbatical.
This guy needs to be beaten by the Romans (Score:2)
"What? How did they discover Conscription!?? I specifically repressed research in the area!!"
Civilization takes no prisoners.
Knowledge wants to be free (Score:2, Insightful)
Remember this famous quote (Score:3, Funny)
In Germany I first came for the Communists, and I didn't speak up because I wasn't a Communist.
Then I came for the Jews, and I didn't speak up because I wasn't a Jew.
Then I came for the trade unionists, and I didn't speak up because I wasn't a trade unionist.
Then I came for the Catholics, and I didn't speak up because I was a Protestant.
Then I came for me - and by that time I was the only one left in the room.
This is silly! (Score:2)
If this kind of thinking had been around before, we'd still be living in the stone age! No radio, microchips, nothing! No slashdot even!
hmm...
Chicken Little (Score:2)
dense (Score:3, Interesting)
-- The quark particles might form a very compressed object called a strangelet, "far smaller than a single atom," that could "infect" surrounding matter and "transform the entire planet Earth into an inert hyperdense sphere about 100 meters across."
Just when I thought that my cow-workers couldn't get any denser...
martin rees can suck it (Score:2)
Please RTFA (Score:2)
Well, if you RTFA, you will see that it is not so much concerned with technologies that can be used as powerful weapons, etc. Rather, it cites experiments that are in themselves so dangerous that they could destroy the planet, for example by creating black holes or nano-machines that replicate out of control.
If these were feasible (personally I doubt it) then the conclusions drawn make a lo
Chicken little? (Score:3, Interesting)
-- The quark particles might form a very compressed object called a strangelet, "far smaller than a single atom," that could "infect" surrounding matter and "transform the entire planet Earth into an inert hyperdense sphere about 100 meters across."
-- Space itself, an invisible froth of subatomic forces and short-lived particles, might undergo a "phase transition" like water molecules that freeze into ice. Such an event could "rip the fabric of space itself. The boundary of the new-style vacuum would spread like an expanding bubble," devouring Earth and, eventually, the entire universe beyond it.
I remember that experiment. I am thinking that if the universe is that unstable, it would have been destroyed long ago. And the idea that that experiment could create a black hole is preposterous... Let's not forget what a black hole is - a huge amount of matter (generally from a very large collapsed star) compressed into a very small amount of space. In actuality it has no more or less mass than the original star (although as time goes on anything the black hole "sucks" in gets added to its total mass). I'm going to guess that it takes more than a few heavy atoms from a piddly experiment to form one.
As for the nanotech fears...Cowering in ignorance won't solve any problems. The last thing we need is the Good Guys thinking nanotech is bad and blacklisting it, while the Bad Guys are developing all kinds of nifty nanotech weapons.
It kind of is the same thing along the lines of the government and corporations locking up the white hats who are warning them about security flaws while the black hats are cracking the shit out of anything they want with impunity. It seems in their eyes white hats are nothing more than black hats who have confessed.
Gil Hamilton of the ARM (Score:3, Interesting)
It's an interesting philosophical question that has been around for a very long time. On one hand, the Catholic Church suppressed Galileo. Nobel invented dynamite, and as a result a lot of people died.
On the other hand, information about nuclear physics and the technology to build nuclear reactors (good) and nuclear weapons (bad) has been suppressed, with limited success, by those countries already in the Nuclear Club. As a result, so far, the terrorists have not yet (we hope) obtained nuclear weapons. September 11th could have been much, much worse if Al Qaeda had the "Islamic Bomb".
In fact, the ARM reminds me of the efforts of the US Government in suppressing cryptographic technology - classifying it as weaponry. And I can't say that the US is wrong. US efforts in breaking the Japanese codes were as responsible for the US victory at Midway as the Navy pilots themselves.
Yes, information wants to be free. So do children, but only irresponsible parents allow their children to run about unattended.
However, I feel that attempts to self-censor or otherwise suppress scientific research are doomed to failure. Information still wants to be free, and anyone who has ever watched "Connections" knows that science doesn't take logical paths - any innovation, however innocent, can result in something very very dangerous.
Some are called crackpots... (Score:3, Interesting)
While some of you may consider this view to be off-the-wall and not in accordance with "science", others in the field see it as a reasonable approach to take. No one has ever said we *won't* examine the unknown in any of the articles or lectures I've been to that propose we limit certain areas of our research.
This reasoning isn't wholy unfounded either. Imagine if you will, the inventor of Kevlar strapping a bulletproof vest to his chest without adequate knowledge of its strength, telling his assistant to fire at point-blank range....and dying. My guess is instead they used a straw dummy and analyzed the problems that arose when the bullet penetrated it the first few times. We need that proverbial dummy in a lot of the aspects of biotechnology we're currently working on.
Imagine a virus that is capable of adapting in such a way as to avoid the human immune system in order to make germline changes, so your children are not prone to an inheritable disease that you and your spouse would have passed on. Now imagine that it accidentally recombines with a flu viral genome that was also working its way through your body at the time of injection, and propagates as an unknown disease agent. Not so implausible, given the latest news of the day.
Researchers in the 1970s instituted a moratorium on work with recombinant DNA until other methods and work had been done to better understand the implications of what we were working with at the time. This is no different. Just because you *can* do something doesn't mean you necessarily should. There was an interesting talk by Dr. George Annas (a BioLaw professor at Boston University) at a recent conference entitled "The Future of Human Nature". Wired will be putting out an article on it. I'll try and get it submitted here on
In Dr. Annas' talk, he describes the need for a similar moratorium on germline meddling and what he calls "species-altering methods". Now, he was looking 50-200 years into the future, but the idea that we should balance our ability to develop new and interesting things against the realization that we're not always sure of the outcome is still valid.
The closer we come to altering our own species, the worse the "oops" factor becomes. It's not crazy, it's an attempt at foresight...since hindsight could be far more costly with the types of things we are dealing with.
Gamma Ray Bursts? (Score:5, Funny)
Sure it's massively unlikely, but it would explain why SETI hasn't heard anything yet.
Imagine if the first signal we decode is: "don't build a particle accelerator larger than 5 kilometers in diameter or you will destroy your whole world."
Times when this might be relevant (Score:4, Insightful)
So far there seem to have been a lot of replies complaining that it's silly to abandon research of dangerous topics, because if it's ignored then someone much worse will discover it first. I agree with this almost completely, but I think there are also times when it makes sense once a threshold has been reached where making things worse gains no strategic advantage.
The one I was thinking of was thermonuclear war. Before he died in 1996, Carl Sagan argued in The Demon-Haunted World [amazon.com] (and probably other places) that the development of the hydrogen bomb by the US was strategically pointless, because it didn't accomplish or deter anything that couldn't already be accomplished or deterred by existing nuclear weapons. On the other hand, instead of simply destroying an enemy, a thermonuclear war would induce a nuclear winter and wipe out most of the world. Furthermore, there wasn't any intelligence that the USSR was developing it, nor that they would have if the USA hadn't started.
Apart from that I'm not familiar with the whole situation, so I won't go into it further. But I don't think the argument that it's necessary to research ultra-dangerous topics before an enemy does holds up all the time -- especially when the only advance from existing technology is that it leads to a lose-lose scenario instead of a win-lose scenario.
did you read the article?? (Score:3, Insightful)
for once, there's virtually no rational comment to the article (at least out of the top-modded ones).
The point of Rees's statement is not that we must beware of developing a horribly powerful weapon. The point is that in the course of ordinary experimentation a horrible tragedy can occur. It is not that the US must develop the BHM (black hole missile (tm)) before Syria does, because then they'd destroy the world, because after all, they're bad guys with black hate in their veins. The danger is that the black hole can happen *accidentally*. Thus, the argument "better us than them" is pointless: the fact that our refraining doesn't prevent others from destroying the world is no justification for us to risk doing it ourselves.
How real are the dangers of accidentally destroying the universe? If a top British physicist says they're real, I believe him. Virile nanobots? Probably not, but it's just an example, really.
Can self-censorship achieve the desired goal? To some extent, you bet. No "underground organization" is going to build a particle accelerator for high-energy physics. This stuff doesn't appear out of thin air; it takes BIG BUCKS. True, some doomsday methods are within easier reach (bioweapons in particular), but at least some of the more dangerous experiments can be avoided.
I repeat, "let's make a black hole before they do!" does not make sense/is not applicable.
Rationality shouldn't be abandoned, even in science. The hope may be faint (I think his 50-50 prognosis is optimistic) but it's no reason not to try, or to disparage the messenger.
not news (Score:3, Interesting)
Calculations showed otherwise, and things proceeded as expected. (Note: this may be apocryphal, as I can find no Google reference to it and can't remember where I came across it -- but it makes the point as well as anything.)
Just imagine if the theories or calculations had been inadequate to predict the results. Then look across the expanse of scientific history, and see how much of scientific knowledge has sprung from unexpected or unforeseen results.
All the author is saying is that the price of poker has gone up, and as we continue to push back the frontiers of ignorance, it's pretty much inevitable that we're going to step in something really ugly sooner or later. And with the capabilities humanity is poking at with sticks, the consequences of a major oops/surprise in a number of fields (high-energy physics, genetic tinkering/biowar, nanotech) are generally at least planet-wide in scope.
For the concerns involving alterations in the fabric of space-time or nature of reality, even off-world laboratories may offer insufficient protection.
Risk assessment is a very poorly understood discipline, easily corrupted by those who want to attain the goal and can't conceive of making a mistake. Look at how easily the NASA bureaucrats rationalize away the risks of the shuttle -- check out Feynman's appendix to the Challenger failure analysis report [fotuva.org] for some insight, and marvel at how his back-of-the-envelope calculation of 1:100 catastrophic failure rate still holds true today, and NASA management is still oblivious to the point he was trying to make.
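To see how far apart those two risk estimates really are, here's a quick calculation (flight count of ~113 missions through early 2003 is approximate, and flights are assumed independent):

```python
# Rough illustration of Feynman's point: at his estimated 1-in-100
# per-flight loss rate, at least one catastrophic failure over the
# shuttle program's history is the *expected* outcome, while NASA
# management's claimed 1-in-100,000 rate makes even a single loss
# wildly improbable.
def prob_at_least_one_loss(per_flight_risk, n_flights):
    """P(>=1 catastrophic failure), assuming independent flights."""
    return 1 - (1 - per_flight_risk) ** n_flights

n = 113  # shuttle missions flown through STS-107 (Columbia)
print(f"Feynman's 1:100    -> {prob_at_least_one_loss(1/100, n):.1%}")
print(f"NASA's 1:100,000   -> {prob_at_least_one_loss(1/100_000, n):.2%}")
```

Two actual losses in ~113 flights (Challenger and Columbia) sit comfortably within the ~68% chance Feynman's estimate predicts, and are essentially impossible under management's number.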
Ooh, that's a good one (Score:3, Funny)
A: Marvellous, wonderful things. But for the good of humanity, I destroyed all my research.
Wonder how long I could get away with that?
Re:And this has... (Score:2, Informative)
Mostly the article talks about how certain scientific research could lead to catastrophic accidents that would be detrimental to our existence. So it's censorship in the sense that it wants to "remove or suppress what is considered morally, politically, or otherwise objectionable", where what's objectionable is harm to the environment, the world, the universe, etc.
It's not saying that we should
Re:And this has... (Score:2)
No, that's a lifestyle choice unless you are realllllly concerned with the karmic wellbeing of car drivers. The calculated risk is to you alone.
A moral decision, an ethical choice or self-censorship is involved where you drive a bus load of children and warm puppies down that street which you suspect may be mined so you can get a blowjob from your favourite sex object if you make it to the other end.
Re:doom preacher (Score:3, Informative)
Not entirely. The issue is that if certain mathematical models of physics are correct, then these things are possible. That doesn't mean that he explained them well.
Of course, the energy we put into these accelerators isn't all that high by cosmic standards (think cosmic rays) so if stuff this bad could happen, we probably would have seen it already.
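The cosmic-ray comparison can be made concrete. A sketch in Python, using the standard fixed-target approximation sqrt(s) ≈ sqrt(2·E·m_p) for a cosmic ray hitting a stationary proton (the 3×10^20 eV figure is the highest-energy cosmic ray reported; RHIC's ~200 GeV per nucleon pair is the relevant collider of the era):

```python
# The highest-energy cosmic rays striking protons in the atmosphere
# reach center-of-mass energies thousands of times beyond RHIC's,
# so nature has been running far more violent versions of these
# "experiments" for billions of years without disaster.
import math

m_p      = 0.938e9   # proton rest energy, eV
E_cosmic = 3e20      # highest observed cosmic-ray energy, eV
E_rhic   = 200e9     # RHIC center-of-mass energy per nucleon pair, eV

# Fixed-target collision: sqrt(s) ~ sqrt(2 * E * m_p) for E >> m_p
sqrt_s = math.sqrt(2 * E_cosmic * m_p)
print(f"Cosmic-ray sqrt(s): {sqrt_s / 1e12:.0f} TeV")
print(f"Ratio to RHIC:      {sqrt_s / E_rhic:.0f}x")
```

That works out to roughly 750 TeV in the center of mass, several thousand times anything an accelerator of the day could manage.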
I prefer the nanobots (Score:5, Funny)
[homer voice]
Ummmm, gray goo.
[/homer voice]