
Ask Slashdot: What Are the Most Dangerous Lines of Scientific Inquiry? 456

Posted by samzenpus
from the don't-even-ask dept.
gbrumfiel writes "The battle over whether to publish research into mutant bird flu got editors over at Nature News thinking about other potentially dangerous lines of scientific inquiry. They came up with a non-definitive list of four technologies with the potential to do great good or great harm: Laser isotope enrichment: great for making medical isotopes or nuclear weapons. Brain scanning: can help locked-in patients to communicate or a police state to read minds. Geoengineering: could lessen the effects of climate change or undermine the political will to fight it. Genetic screening of embryos: could spot genetic disorders in the womb or lead to a brave new world of baby selection. What would Slashdotters add to the list?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Nanotechnology (Score:3, Informative)

    by Anonymous Coward on Thursday April 26, 2012 @08:03PM (#39814633)

    Can you say Gray Goo?

  • In other words... (Score:4, Informative)

    by pushing-robot (1037830) on Thursday April 26, 2012 @08:04PM (#39814639)

    Ask Slashdot: What's your favorite Sci-Fi apocalypse?

    • Re:In other words... (Score:5, Interesting)

      by Guppy (12314) on Thursday April 26, 2012 @09:01PM (#39815329)

      Speaking of Sci-Fi, the lead female character (Mira) in the book "Evolution's Darling [kirkusreviews.com]" is an assassin who targets scientists that have been judged by Mira's AI-overlords as being too close to making undesirable discoveries.

      For instance, one of her past targets included a researcher working on teleportation (which they calculate will lead to the collapse of civilization), and much of the story involves her mission to assassinate a rogue AI who has developed a method of making perfect copies of AI minds. All for the protection of society of course.

  • by abigor (540274) on Thursday April 26, 2012 @08:07PM (#39814683)

    Where I live, certain ethnic minorities (taken together, they are actually a majority) are notorious for screening embryos for gender. Then they abort the females until a male is born first. It's become such an issue that it's now illegal to disclose a fetus's gender until the window for legal abortion has passed (I don't remember how many weeks/months that is).

    If you're white, the doctor will still tell you if you ask though.

    • by sayfawa (1099071) on Thursday April 26, 2012 @08:39PM (#39815077)
      Are you talking about Ontario? I think it's been up to the ultrasound practitioner's discretion, but in light of recent studies, some are advising their workers not to give out the information. I hadn't heard that it was illegal, though.

      It's a conundrum, though. If abortion is legal, it has to be legal for everyone, for all reasons. Perhaps more effort should be made to make sure certain immigrants know that around these here parts, we appreciate our daughters.

      But if it continues, well, it can't continue for more than a generation or two. What's a sure-fire way to make sure your son abandons your sexist culture and marries someone from a different background who won't abort her female fetuses? Create a lack of women in your culture for them to date.
  • This is bullshit. (Score:5, Insightful)

    by bmo (77928) on Thursday April 26, 2012 @08:08PM (#39814685)

    All forms of scientific inquiry have "dual use".

    You may as well try to go back in time and stop Og or Urgh from figuring out how to make fire.

    Fuck this shit.

    --
    BMO

  • by JabberWokky (19442) <slashdot.com@timewarp.org> on Thursday April 26, 2012 @08:12PM (#39814723) Homepage Journal

    "Geoengineering: could lessen the effects of climate change or undermine the political will to fight it."

    Isn't this a bit like the whole "teaching condoms in school is dangerous because then teens will have massive amounts of sex"? You're omitting a valid (even if imperfect) solution that may help stave off tragedy if people choose a particular path, just to insist that your "morally superior path" is the only option presented.

    • by ultranova (717540) on Thursday April 26, 2012 @10:42PM (#39816339)

      Isn't this a bit like the whole "teaching condoms in school is dangerous because then teens will have massive amounts of sex"? You're omitting a valid (even if imperfect) solution that may help stave off tragedy if people choose a particular path, just to insist that your "morally superior path" is the only option presented.

      Well, one obvious difference is that condoms work and are available right now, while geoengineering is entirely hypothetical at this point. So condoms actually do solve the problems they're meant to - disease transmission and unwanted pregnancies - while geoengineering is simply an excuse to not do anything. So no, they're not the tiniest bit similar situations.

      Not that global warming can be stopped at this point, since renewables are a joke and anti-nuclear hysteria has kept us from building clean power plants, so it's not like it matters much. It's gonna be interesting, seeing who'll still be standing when the dust settles.

  • Nothing... (Score:5, Insightful)

    by Solozerk (1003785) on Thursday April 26, 2012 @08:13PM (#39814733)
    Once you start blacklisting or limiting the release of scientific information, science is essentially dead. Science should be all about sharing of knowledge, collaborative work, and cross-confirmation of results. It's not scientists who should handle the 'risks' to society (taking ethics into account) - that's a job for politics (i.e., you can publish how to make an atomic bomb, but dissemination of nuclear material should be controlled by law). And in any case, any information you try to blacklist will eventually get out. Of course, I suppose there's a limit to that too - if we arrive at a point where a scientific discovery can lead to virtually anyone creating a WMD at low cost and with readily available materials, then there is a problem. But we're not there yet, and anyway, at that point, there's no easy solution (though I personally believe a 'solution' should then be more along the lines of changing the root of the issue: why those people would want to create WMDs to begin with).
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Exactly. If you outlaw research into weaponized virii then only criminals will have them and we won't even know how they work.

      • Well, more along the lines of other countries, and if necessary, hidden labs / boats in international waters. No Department of Defense is going to shut down research into biological warfare, and with good reason: they will probably need those results at some point, and it's the DoD's job to be (within reason, and then some) paranoid about national security. These are the people who have protocols, on the books, for every scenario they can think of, including, might I add, a chance meeting with extraterrestrials.

    • Re:Nothing... (Score:4, Insightful)

      by million_monkeys (2480792) on Thursday April 26, 2012 @08:34PM (#39815015)

      And in any case, any information you try to blacklist will eventually get out. Of course, I suppose there's a limit to that too - if we arrive at a point where a scientific discovery can lead to virtually anyone creating a WMD at low cost and with readily available materials, then there is a problem. But we're not there yet, and anyway, at that point, there's no easy solution (though I personally believe a 'solution' should then be more along the lines of changing the root of the issue: why those people would want to create WMDs to begin with).

      I think the key is making humanity's morality improve faster than the rate of scientific progression. If you don't do that, it's not going to end well.

    • Re:Nothing... (Score:5, Interesting)

      by Prune (557140) on Thursday April 26, 2012 @11:10PM (#39816571)
      I have a question for you, which may or may not be one from a devil's advocate standpoint (frankly, I haven't made up my mind yet). It's based on two trivial observations: 1) science and engineering are enablers of increasing reach of influence with decreasing effort, and 2) destruction is generally easier than creation and restraint.

      Having spelled them out explicitly, I think you know the obvious implication: technological progress over time allows an ever smaller group the ability to cause bigger death and destruction upon increasing areas and populations, with countermeasures and constraints lagging behind this ability (human history has been following this trend, from massacring competing tribes to the ability to cause nuclear winter and kill most of the population). Taken to its logical limit, we are heading toward the point where an individual will be able to destroy all of humanity (the specific method, be it "grey goo" or bioweapons or nuclear weapons or a computer virus when we're all wired or have uploaded our minds into machines, is a detail that doesn't affect this argument). The fundamental asymmetry of destructive power versus reactive protection schemes means that even if many attempts are thwarted, eventually one is bound to succeed as time goes on.

      It seems to me that the ONLY way to deal with it is the most distasteful one--proactive countermeasures--constant monitoring, privacy- and anonymity-nullifying pervasive surveillance (be it by people or machines, all the same) that knows what everyone is doing at any time. I'm still waiting for a good counterargument, since I would LOVE it if there were a nicer alternative that would satisfy my warm feelings about freedom etc.
      • Re:Nothing... (Score:4, Insightful)

        by MDillenbeck (1739920) on Friday April 27, 2012 @01:14AM (#39817215)

        If technological advancement leads to greater and greater destructive powers, and destructive powers are much easier to develop and implement than constructive powers, then how do you explain the human population explosion? It seems to me that the constructive sciences have far outstripped the destructive ones - at least, so far.

        I think destructive power is asymptotic, meaning that you can approach 100% destructiveness but never quite reach it. Remember, human populations have been pushed to extremely low numbers in the past and we have continued to thrive as a species. In part, this is due to our adaptability as a species. In fact, I would argue that science has made us more resilient to seasonal variations and natural afflictions, but is also making us less resilient to rapid climate change and to virulent strains that target monocrops or humans directly. However, even if a disaster strikes, I think there will be some humans who survive - the question is whether they would thrive, or whether we would die off as a new dominant species outcompetes us.

  • support human rights in and of themselves, and take charge of the state actors that tend to use these things horribly.

    einstein and his friends were simply discussing the universe, and what would happen if you shined a light while riding on a superfast train. they had no 'intention' of investigating nuclear weapons, but that is where E=mc^2 eventually led.

    lets look at the computerized lists used to help perform the holocaust. they began as census taking machines.

    the attempt to cure disease was later used in

  • Could be used to observer history... or to change it.
  • This, or more generally, large-scale carbon fiber construction.

  • by im_thatoneguy (819432) on Thursday April 26, 2012 @08:21PM (#39814813)

    Sapient artificial species which don't die of natural causes and can live virtually will more radically threaten our culture, society and civilization than any other change in technology.

    For all of human history we've been adapting to the same species using different technology. Never before have we dealt with the fundamental nature of man itself changing.

    Steal a baby from 2,000 BCE and it'll probably grow up like any other human. Steal a baby from 2,500 AD and it will most likely be a new species.

    • by LeDopore (898286)

      You say that like it's a bad thing.

    • Rather than happening all at once, it's more likely that the average human life span will gradually increase as medical technology improves in an iterative fashion. That's been the case for decades now, and we're already having to deal with the consequences of an aging population. On the way to figuring out how to forestall death indefinitely, we'll have to figure out how to forestall it for a really long time, which will entail making many of the same adjustments to society. These adjustments can also be made gradually.

  • In my opinion, the most dangerous science is always going to be physics. It is going to produce the most direct methods to destroy whatever you want destroyed. It is going to pose the most direct challenges to whatever dogma the aristocracy is using to part the peasants from their meager treasure. The application of physics can and may destroy the entire world.

    Everything else just makes the danger slightly more efficient. Genetically engineered bird flu might be scary, but a few blankets with smallpox

    • by Smallpond (221300)

      Yes. It's sort of like the Tower of Babel: no two people speaking exactly the same language, and everyone left with massive confusion.

  • by Spy Handler (822350) on Thursday April 26, 2012 @08:26PM (#39814895) Homepage Journal

    are banned in advanced technical civilizations, for good reasons.

    Suppose scientific experimentation confirms the existence of the soul, and that we all end up in Hell (or some very unpleasant equivalent), but the older you are when you die, the more painful it becomes? Or, that afterlife is extremely pleasant, better than anything you've ever experienced on earth, and the scientists build a machine that can give you a brief preview of this?

    That's right, mass suicides. The population of an entire planet disappeared this way.

    • by vistapwns (1103935) on Thursday April 26, 2012 @09:41PM (#39815729)
      Who needs a soul or magic? With nanobots and AI, someone could torture someone (or everyone), well, forever. What would people do if they knew that, and knew such technologies were coming soon? Perhaps this is the reason most people call the singularity a 'nerd rapture' and other things; there are very unpleasant possibilities inherent in a very technologically advanced universe, and it's better if nobody acknowledges they're coming, to keep people from panicking.
    • That's right, mass suicides. The population of an entire planet disappeared this way.

      If there truly is an afterlife, and it's pleasant if you die, suicide and death don't matter as much, do they?

  • by meerling (1487879) on Thursday April 26, 2012 @08:28PM (#39814917)
    There are a couple of things to be remembered.

    First: Everything man has ever created has been used for such negative things as murder and war. For that matter, everything we will ever create will also be used for such things, until such a point as mankind has surpassed the need and desire for such negative activities.

    Second: Once a thing has been done, it will be done again. Once it is known by anyone that something is actually possible (as opposed to theoretically possible, or even believed impossible), it becomes capable of being repeated. Just look at nuclear proliferation for an example. It was believed that splitting the atom was impossible. Once it was demonstrated to be possible, many others repeated the discovery despite the best attempts of others to prevent that from happening.

    The only thing they are really doing by blocking research from those in that field is wasting resources duplicating effort, and reducing or eliminating the potential benefit from that knowledge while failing to prevent its eventual and inevitable misuse. I would even hazard to say that such censorship increases the devastation that will be caused by such inhumane uses, by limiting if not eliminating the positive research and understanding that comes from shared research and peer review.

    Only a moron, a paranoid, or a politician could come up with such a stupid and counterproductive scheme as censoring research.
  • I'm pretty sure there's a "political science" joke here somewhere, but I can't seem to make it work. Anybody else want to take a shot?

  • ... eugenics.

    Did I just manage to invoke Godwin's Law without using a certain historical name? (Never mind that said person didn't invent or implement it first.)

    • by jd (1658)

      Eugenics is widely practiced, even if we happen to call it "genetic screening", "genetic therapy" or "designer babies". You still end up deciding certain genetic lines should not exist. Forced sterilization is also practiced in many countries (including highly civilized ones).

      So the taboo is really only in discussing the ethics of such practices and where the lines should be drawn. It is extremely arguable that allowing a child to be born with a genetic disease that will likely be terminal in a relatively s

  • They could be quite a boon. They could give more women with reproductive problems the chance to have a genetically related child. You might even be able to correct dominant genetic problems (Huntington's Disease comes to mind) before implantation.

    They could also be terribly abused.

    Before, you had to convince/coerce a woman to get pregnant and carry a child to term. This put at least some practical limits (physical ones in the case of coercing, moral and persuasive ones in the case of convincing) on creating

    • by geekoid (135745)

      There is better technology coming online than keeping a child alive until it's old enough to have useful organs.

      OTOH, you could create something without a brain, and then get a head transplant every 20 or so years.

      I'm up for that!

      • by Hartree (191324)

        Yes, the idea of creating anencephalics for that purpose has been thought of.

        However it has complications. For many parts of the body to develop normally they have to be used. A digestive tract that has never processed food or muscles that have never moved are not going to be normal. You would have to have enough brain function to run those processes during growth. Or, alternatively, you would have to be able to interface a control system to the brain stem to take over that function.

        The Evil Overlord way to

    • by Gerafix (1028986)
      Your scenario is ridiculous. If you have technology sufficiently advanced to do that, it would simply be easier to grow the organs themselves.
  • The Mad Science is the most dangerous form. Yes, the volcano fortress is cool, but eventually James Bond comes around and blows the place up.

  • by XiaoMing (1574363) on Thursday April 26, 2012 @08:51PM (#39815207)

    Pure and innocent scientific inquiry in the pursuit of knowledge generally hits a pretty thick wall pretty quickly as soon as it steps into the realm of things the military is researching, or has researched within the past decade.

    Even now, just to use the results of certain types of this research -- such as very accurate nuclear interaction cross-sections (discovered for the purposes of nuclear weapons, but) used for the purposes of cancer treatment -- puts you under the watchful eye of the FBI.

    Yes, not everything falls under this category, and no, nobody needs to be reminded of the benefits of such research (like how our microwave ovens defeated the Germans), but just think about some of the examples we DO know about:
    WWII to Cold war era: Nuclear Science
    Cryptography (Government mandated PGP backdoor, anyone?)

    Sources:
    MCNP:
    http://mcnpx.lanl.gov/ [lanl.gov]
    PGP:
    http://books.google.com/books?id=cSe_0OnZqjAC&pg=PA352&lpg=PA352&dq=pgp+government+mandated+backdoor&source=bl&ots=cVtmm3vwYK&sig=fwjn6mfbXVWngTS0pgHIFWFV9bE&hl=en&sa=X&ei=5OyZT8_pLsXUgAf3gNX1DQ&ved=0CDoQ6AEwAg#v=onepage&q=pgp%20government%20mandated%20backdoor&f=false [google.com]

  • Because you know the only people who can get it will be Dick Cheneys.

  • by holophrastic (221104) on Thursday April 26, 2012 @08:57PM (#39815275)

    Each of those four items is the potential subject of nightmares and downfalls, but each and every one of them is a guarantee -- all eight.

    Imagine the year 2150. Distant for any human life, not at all distant for government, middling for construction (some city construction projects take 70 years), and eons away for technology.

    In your 2150, can we spot genetic defects before birth? Of course. Can we select babies for the life that we want? As in, can I choose the embryo with athletic skills over the embryo with mathematical skills? I'd sure hope so. It sounds dangerous today, but it's only dangerous in advance, like everything. By the time it's ubiquitous, it's just another form of choosing your child's academic goals. It just starts even earlier.

    Same goes for the other six in your 2150. I'd sure as hell hope that we can read minds to some extent by then. But just like the polygraph didn't destroy interrogations, and the mouse didn't destroy the keyboard, and television didn't kill radio, and the plane didn't kill the car, it won't be the only form of communication.

    As for police states reading minds, that's the ethical equivalent of humane execution. It's already a police state, it's already killing people, I'm not worried about the mind reading.

    Geoengineering is absolutely required in order to live anywhere but terrestrial land. Period. So it's guaranteed to happen. And it'll happen quite suddenly, the day before it's required. And by the time it can be used to "undermine the political will to fight it", it'll be so easy to do that it'll be a part of normal construction.

    Nuclear weapons don't kill people. People's mistakes kill people. But people don't kill asteroids. Nuclear weapons kill asteroids. That's another period, by the way.

    I like how bird flu wasn't one of the top four, having inspired the thing in the first place. But that's the same concept. Of course we're going to have a major outbreak of something. We've had it before. Everyone's so worried that this time, with common means of global transportation, it'll be much worse. I think that they forget one thing. In probably under an hour, every airport and every border can instantly have screeners for whatever the current outbreak is. We have TSA and border and customs security everywhere nowadays. It'd be easy to suddenly, and globally, halt anyone displaying symptoms, or quickly test everyone as a part of transportation procedures.

    My point is that, as a civilization, we can't not have those things. Being scared of the research in advance is stupid. Focus on being scared of the initially flawed execution of that research. Work on that while the research is underway. We have M.A.D. for nuclear weapons. That's already worked a few times. It's dumb, but it worked. I'm stunned, but it worked. That's the sort of thing that we need for the rest of them: a Nash equilibrium for each one.

    • by Gerafix (1028986)
      To be honest I think you're extremely overestimating (to the point of ridiculousness) the effectiveness of "screeners" and the TSA and the rest of that bureaucracy.
  • by Beryllium Sphere(tm) (193358) on Thursday April 26, 2012 @09:02PM (#39815335) Homepage Journal

    There would be ethical and humanitarian applications for it, but mere death and pain would be hard pressed to compete with the potential damage of perfect propaganda. If some combination of psychology, hypnosis, drugs in the water, drugs in the drugs, or whatnot made it possible to get people to believe anything you said, that could be the end of all freedom forever.

  • Any kind of nanotechnology is in general bad news, because it'll be hard to control in the wild. Once you can make a lot of them, you can let them loose on a subject population and, well, at least they'll wear out after a while.

    Because they're so small, you pretty much need a trigger nanobot/signal to activate them, i.e., in the presence of bot A, bot B starts its thing, like disassembling RNA.

    There's not a lot you could do against these things, except stay out of the way. The good thing is that they probably w

  • This was a great episode. I highly recommend it, if you can find it see it.

    In it, a student simply uncovers that fusion is trivial to access. It is simply asymmetric. Power corrupts; what will happen if more power is easily available? As geeks, we see that in the power given to us by computers, such that kids can conceivably launch denial-of-service attacks.
  • Picture something like the Matrix, except... there is no possibility EVER of the "One", no escapees from the system, no resistance, no nothing, just slaves in a system. Might look like this world, might look like something else. Assuming some madman that controls nanobots/AI doesn't decide to, say, kill all the men and take the women as slaves... *forever*. Pleasant dreams..
  • Aka figuring out the best text editor, whether it be vim, or notepad.exe.

  • by dbIII (701233) on Thursday April 26, 2012 @09:31PM (#39815603)
    The technological device that probably killed more people than any other in WWI was General Haig's telephone.
  • Planetary Motion (Score:4, Insightful)

    by Sloppy (14984) on Thursday April 26, 2012 @09:40PM (#39815711) Homepage Journal

    If people start studying how the planets move, it could lead to heresy yet also make sense, thereby undermining people's respect for authority.

  • by Smallpond (221300) on Thursday April 26, 2012 @09:52PM (#39815831) Homepage Journal

    Koran flammability
    Corporate misdeeds
    Police brutality

    Oh, you didn't mean dangerous to the researcher?

  • race and iq (Score:4, Insightful)

    by pigwiggle (882643) on Thursday April 26, 2012 @09:53PM (#39815839) Homepage

    Maybe not. First thing to pop into my head.

  • This is ridiculous (Score:4, Insightful)

    by Omnifarious (11933) * <`eric-slash' `at' `omnifarious.org'> on Thursday April 26, 2012 @10:05PM (#39815951) Homepage Journal

    Next we'll be wondering "Which are the most dangerous books to write?", or "What are the most dangerous sentences to say?". I reject the premise.

    If I were to pick at all, almost none of that would be on the list. I'd only pick things with the potential to create society-ending threats that are not stoppable by individual action. Diseases, for example, fall into that category. But I find even that highly suspect.

  • by Joe_Dragon (2206452) on Thursday April 26, 2012 @10:05PM (#39815959)

    Don't let systems get too smart and don't hook them up to nukes.

  • by AHuxley (892839) on Thursday April 26, 2012 @10:58PM (#39816493) Homepage Journal
    http://www.dailymail.co.uk/news/article-2133201/Dr-Richard-Holmes-Suicide-riddle-weapons-expert-worked-David-Kelly.html [dailymail.co.uk]
    If you're in the UK and working on chem/bio "protection", try not to get too stressed.
    It seems "suicide" is catching.....
  • by hyades1 (1149581) <hyades1@hotmail.com> on Thursday April 26, 2012 @11:10PM (#39816569)

    Free societies have always worked in part because when stupid laws are inevitably enacted, a lot of people ignore them with impunity. There has been freedom in anonymity. But face recognition technology is improving, surveillance cameras are proliferating, and other things like cell phones and debit cards make it trivially easy to see where people are and what they're doing. The only real safeguard of a free society, the inability of corporations and governments to deal with the vast sea of data, is coming to an end. And never mind actual laws. Kids who demonstrated against oil drilling in national parks when they were 13 will find themselves explaining to a job interviewer why they hate capitalism when they graduate from college.

    So my vote for major danger... at least to a free society... would be quantum computing as it affects database management.

  • by hazem (472289) on Thursday April 26, 2012 @11:53PM (#39816833) Journal

    Carl Sagan mentioned this in one of his books. The same technology that could be used to detect asteroids then reach them and divert their orbits away from the Earth could also be used to divert them towards the Earth.

  • by monk (1958) on Friday April 27, 2012 @01:45AM (#39817355) Homepage

    is advertising. Perfect persuasion trumps everything else.
