Science

Ethics in Scientific Research (278 comments)

call -151 writes: "There is an interesting NYT article, 'When Science Inadvertently Aids an Enemy,' discussing how some of the 'encryption should be free for everyone' attitudes are changing after the WTC attacks. The article makes some interesting points, and it is good to see discussions like these in the mainstream, even if the tone has definitely changed recently." Well, the questions are being asked again, but most of the researchers dealing with these issues have already answered them for themselves.
This discussion has been archived. No new comments can be posted.

  • by ekrout ( 139379 ) on Tuesday September 25, 2001 @12:44PM (#2347505) Journal
    This is exactly why you may notice rules prohibiting encryption above certain key lengths from being exported from the United States to other countries.

    On the surface it sounds reasonable, but in a day when a file can be transmitted between two continents in real time, I'm not sure those old-school rules are even helpful anymore.

    • Also, if you look at where crypto work is being done, nobody is really doing serious work in the US now, and they haven't been for a long time because of these rules. Yup, pointless, but that hasn't stopped them yet.
    • I think you mean to say "North America", not "United States".... Canadian network security is just as dependent on those encryption codes.

      Old-school computer security methods just won't work in a world where everyone is interconnected... you can't stop encryption methods from being exported beyond your borders. What if an American company has an office in Africa, or Japan? Should they be forced to link over an unsecured network?

  • by Accipiter ( 8228 ) on Tuesday September 25, 2001 @12:44PM (#2347507)
    Right Here, Right Now [nytimes.com]. Enjoy.

    What the hell is this stupid postercomment compression filter?

    "Your comment must be THIS LONG to be posted to Slashdot."

    "You must be THIS TALL to ride this rollercoaster."

    Sheesh.
  • Why Login? [nytimes.com]

    While we're minimizing the threat of wrongdoing, let's not forget to shut down the razor factories, the internet cafes, the rental car companies, etc.

    Why not just stay in bed all day? It's probably safer than wandering out in the street and getting hit by an errant solar flare, a bad driver, etc.
      • While we're minimizing the threat of wrongdoing, let's not forget to shut down the razor factories, the internet cafes, the rental car companies, etc.

      Unfair. This article is really about future nanotech and how best to develop it. It's asking if we want to take another genie out of the bottle.

      Of course, it does make the point that sooner or later, somebody is going to break any moratorium, so (national pragmatism time) it might as well be us.

  • It's kinda like the phrase "Guns don't kill people, people kill people." I know it's a bad analogy, but encryption is a tool. A few bad apples hopefully won't ruin the whole bunch. Please let your government reps know how you feel!
    • Re:It's kinda like.. (Score:4, Interesting)

      by jilles ( 20976 ) on Tuesday September 25, 2001 @01:19PM (#2347763) Homepage
      Encryption is rather different from a gun in a few respects:
      1) The tools are software: duplication is easy. Guns are hardware, and sophisticated knowledge is needed to make them.
      2) The algorithms are well known: you can make your own tools (without the backdoors). Building your own guns is a bit harder (though not impossible).
      3) There are open source tools (you don't even have to go through step 2 to obtain tools free from backdoors). Although the US occasionally hands out guns (e.g. Stinger missiles to the Afghan mujahideen in the eighties), in general selling arms is a profitable business.

      Now about guns: you need a gun + an idiot to pull the trigger to kill people. Both prerequisites are available in large quantities in the US. In Western Europe, guns are a bit harder to get, so we have fewer casualties as a result of guns (check the statistics if you like). Obviously, removing guns from society helps reduce the number of people dying from guns. Doing so is a problem in the US, however, since billions of guns have been sold there in recent centuries. So if you are in the US you are fucked: people around you are nuts and have easy access to guns. One day your nice neighbour or colleague may have a bad day and pull his guns on you (which he can buy legally and keep in his house).

      Now let's turn to the real issue: why is the US pushing backdoors in encryption software? Industrial espionage. Being able to tap into the information that banks and businesses exchange throughout the world is very profitable. A terrorist will just use illegal/free tools (probably on an illegal copy of Win XP or whatever). If there's one thing you can be sure of, it's that terrorists don't like the US and they are not bloody likely to stimulate the US economy by actually paying for software produced in the US. What do you think? Bin Laden will actually log on to MSN and chat with his colleagues??? Come on!

      The US government is using this situation to rearrange the world to make it a little bit more comfortable for US leaders. Arguably the WTC tragedy was the best thing to happen to them in years. Some unpopular anti-terrorist/anti-human-rights laws can be pushed through. Suddenly they can be friends with Pakistan (a few weeks ago still referred to as a rogue state that we should be protected from by a missile shield). Everybody turns a blind eye while they wipe the Taliban off the earth, and even Khatami is suddenly being friendly on behalf of Iran. In addition, some former Soviet republics that happen to play an important role in producing and transporting oil are also the US's best friends.

      It is touching to see all this friendship bloom. Unfortunately it is at the cost of millions of innocent Afghan civilians, already in big trouble because of the previous civil wars. What happened to New York was bad but the opportunistic way the US government is dealing with the situation is sickening.
  • by Johnny5000 ( 451029 ) on Tuesday September 25, 2001 @12:45PM (#2347517) Homepage Journal
    Unbreakable codes are a tool.

    A tool is not evil. A tool by itself can't fly an airplane into a crowded building.

    It depends on the use of the tool.

    Evil people will do evil things with it, good people will do good things with it.

    -J5K

  • by Rackemup ( 160230 ) on Tuesday September 25, 2001 @12:45PM (#2347518) Homepage
    Cars, knives, even nail clippers can be misused; it depends on the person operating the equipment. Most airlines have gotten rid of metal knives on planes now, preventing them from being used in an attack but also punishing those people who just want to butter their roll.

    It's good that (some) people are starting to use their heads when it comes to security, but restricting the use of an item because of what it "might" be used for is going a little overboard. Eventually everyone will be in a facial recognition system, fingerprinted, DNA-sequenced, and blood-typed in a huge federal database JUST IN CASE you ever do something wrong.

    Where's the line?

    • I'm sorry Sir, but in order to board this plane you will have to remove your hands. You see, they can be turned into "fists" and that simply won't do.
  • by Si ( 9816 ) on Tuesday September 25, 2001 @12:46PM (#2347522) Homepage
    Scientists should not hold back news of a discovery for fear that one day it may be used by the bad guys -- let the sociologists deal with that. All scientific discoveries have the potential to uplift the human condition. Perhaps one day we will no longer have a need for strong crypto, but until then Hellman and others should not feel ashamed or guilty about their discoveries and contributions. The ones who should feel ashamed are those who let their personal agendas get in the way of progress, who would rather see us back in an age where the privileged few have all the power and the masses are huddled together in the dark looking to superstition for salvation.
    • The ones who should feel ashamed are those who let their personal agendas get in the way of progress, who would rather see us back in an age where the privileged few have all the power and the masses are huddled together in the dark looking to superstition for salvation.

      Urr.. Isn't this exactly where the new "Corporate Nations of Legislation" are trying to take us?
      Superstition, in this case, being the belief in the marketing spiel that "It'll all be alright once big company 'a' controls everything in here, and makes it work right in the next version, which will honestly be cheaper and more efficient, and do everything you ever wanted in life".

      That aside, I wholeheartedly agree with the sentiment you express.
  • by drfrog ( 145882 ) on Tuesday September 25, 2001 @12:46PM (#2347523) Homepage
    new eula for a hammer:

    the end user will not use this hammer
    to build anything that would be deemed
    uncapitalistic or undemocratic!
    this would include:
    Mosques,
    Churches and
    Socialist Gathering centers

    yes
    what I'm getting at is encryption is a TOOL

    if the terrorists were stupid enough to use
    publicly accessible encryption methods
    instead of creating an 'in-house' solution
    they are just asking for it!

    {
    IF it was bin Laden
    don't you think he could
    afford better encryption?
    come on!
    }

    asking everyone else to
    throw away freedom for more
    security is not an option

    in fact it plays into their hands!

    • I completely agree. Throwing away tools because they could potentially be used for evil is stupid. And don't argue about how they make evil on a grander scale so much easier - Hitler managed to kill millions of innocents with good old FIRE! OOOoohh! What a wonderfully horrendous technology! Let's ban it too!!!

      C'mon.

  • by kingpin2k ( 523489 ) on Tuesday September 25, 2001 @12:46PM (#2347530)
    Encryption, as an algorithm for crunching numbers, costs nothing. You can't keep it out of the hands of the bad guys simply by keeping it out of the hands of the good guys.
    • Okay, so maybe Adleman and Hellman could have agreed to classify their codes back in the seventies. Um, 25 years later, just precisely how does anyone think that this knowledge will suddenly be erased from the world, when no reasonable user or developer would ever give it up? All this talk about unintended consequences is interesting and all, but any discussion of stopping it now is sheer folly.
  • Tools and people (Score:4, Insightful)

    by yali ( 209015 ) on Tuesday September 25, 2001 @12:46PM (#2347531)

    To those who say "tools are just tools, it's people that are good or bad," I'd like to pose this question. (This isn't just rhetorical, I'm really curious what people think.) Isn't it the responsibility of those who create or disseminate tools to understand the context into which they release them?

    By analogy, if I give a gun to a criminal, some people would hold me partially accountable for what the criminal does with it, especially if I knew (or should have known) that this was a criminal. If I give a gun to a kid, I'm responsible for evaluating whether the kid's ready to learn about guns, and if so, to teach the kid about safety, etc.

    Does the analogy extend to scientists? Do they have some responsibility to take part in social, political, etc. processes to ensure that the world they release their tools into is ready and capable of making ethical and moral use of them? If so, what are the minimum requirements and limits of this responsibility?

    • Equating guns, a physical object, with encryption, a mathematical exercise, misses the point a bit. Encryption can't directly kill anyone no matter how it's used.
    • Re:Tools and people (Score:5, Interesting)

      by Bonker ( 243350 ) on Tuesday September 25, 2001 @01:00PM (#2347626)
      Does the analogy extend to scientists? Do they have some responsibility to take part in social, political, etc. processes to ensure that the world they release their tools into is ready and capable of making ethical and moral use of them? If so, what are the minimum requirements and limits of this responsibility?

      In the '40s, scientists in the United States, Germany, and Russia were all very rapidly untangling the secrets of nuclear fission, nominally for use in weapons.

      Many of the scientists have since decried their own work, but the fact remains that this 'weapons' technology and the research that led to it have given rise to a goodly proportion of the technology we use today in the modern world.

      While safety questions, many of which are unfounded, still abound, it's apparent that fission energy will be the cheapest, safest, and cleanest energy that mankind can harness until solar collectors are dramatically improved, or fusion energy passes 'breakeven' levels on a sustained basis.

      Most of the computer technology we use ultimately arises from the work of men whose research also led to military uses and was used in the construction of atomic weapons.

      The upcoming generation of quantum computing relies on theories that are even more closely tied to nuclear fission.

      Most scientists don't think in terms of 'how can I create a better, more deadly weapon'. They think in terms of unlocking the secrets of the universe. These actions, just like any other actions, have positive and negative consequences.

      You wouldn't know the good, if not for the bad.
    • by MarkusQ ( 450076 ) on Tuesday September 25, 2001 @01:08PM (#2347684) Journal
      By analogy, if I give a gun to a criminal, some people would hold me partially accountable for what the criminal does with it, especially if I knew (or should have known) that this was a criminal. If I give a gun to a kid, I'm responsible for evaluating whether the kid's ready to learn about guns, and if so, to teach the kid about safety, etc.

      Does the analogy extend to scientists?

      The analogy does not hold. This is clear once you realize that science is a process of discovery, not of creation. A scientist is more like an explorer, discovering facts that were true long before they were discovered, facts that would eventually have been discovered by someone, facts that affect everyone, even the people who don't know them.

      Newton should not be blamed for all the people who die from falling, just because he discovered the law of universal gravitation. Nor should he be blamed for ballistic missiles, which rely on his law for their operation.

      A better analogy would be:

      Should explorers who discover old mine fields or dangerous animals publish the fact? Or should they let those who follow blindly wander into the danger, unruffled but no safer in their ignorance?

      -- MarkusQ

    • OK, I said it before and I'm going to say it again. The primary weapon was not crypto, it was an airplane. It's just as stupid to blame the airplane and its inventors [slashdot.org] as it is to blame crypto and its inventors.

      Crypto has many legitimate uses: commercial, governmental, and individual. Banks use it for transactions. Governments and individuals can use it to keep things to themselves. Why do people think this is so evil? Do people hate envelopes? Curtains, walls, and clothes?

  • US colleges and Universities teach thousands of foreign students. What is to stop someone from getting a degree in mathematics at MIT, going back to his home country and creating a strong encryption program?
  • by JohnnyX ( 11429 ) on Tuesday September 25, 2001 @12:51PM (#2347559) Homepage Journal
    Then you may be interested in Americans for the Preservation of Information Security [preservesecurity.org], a group working to keep ill-advised legislation from being passed that would deny us tools to keep our information safe in the hopes of denying them to terrorists as well.

    Yours truly,
    Mr. X

    ...do something...
  • by Jace of Fuse! ( 72042 ) on Tuesday September 25, 2001 @12:51PM (#2347560) Homepage
    There's nothing stopping a small group of interrelated individuals from writing their own scrambling technique that could qualify as "encryption", and if laws were passed requiring "back doors" or what-have-you, then any old "Little Orphan Annie Decoder Wheel" that the Government couldn't figure out would instantly make sensitive information illegal and the people who deal in it criminals.

    I'll cite a theoretical example.

    Video Game Company X has a neat little game gaining great popularity, but due to various reasons they encrypt certain game data with proprietary methods, not at all to keep the government out, but to keep cheaters from snooping the data and exploiting the game. For the sake of argument, they use a clever, light-weight encryption scheme that nobody seems to be able to figure out and for which no back-door-method can feasibly be devised. After all, this is a game, not a spy communications device.

    Since we know that they're doing it for gaming, and not espionage, we can consider it mostly harmless. But the laws some people want to pass would probably prohibit this very thing. And for what? A supposed terrorist threat? Get real.

    I don't even know why I'm rambling about this, considering almost everyone here is likely to agree with me that the trivial uses of encryption should be an inalienable part of one's right to privacy. But I'm just frightened that someone might do something (such as the above example) and suddenly find themselves locked away for life just because they wanted a secure entertainment platform.

    Lock up the clowns?
    • Ok, so terrorists will log onto Q3 servers to carry on secure communications... :)

      As if the Columbine lawsuit wasn't silly enough.

  • The main problem... (Score:5, Interesting)

    by xonker ( 29382 ) on Tuesday September 25, 2001 @12:52PM (#2347561) Homepage Journal
    Is that our society is so ethically-challenged and bereft of common sense that we have to make any undesirable behavior illegal, and any desirable behavior mandatory. (Seatbelts and motorcycle helmets, for instance.)

    To many people it makes sense to make anything potentially harmful illegal, because how else would we discourage it?

    We've gotten so used to our morality being legislated that we feel we have to pass laws for everything. That's why the abortion issue is such a big deal, because people equate morality with legality. The same deal with sexual harassment laws. We shouldn't need laws to tell us that sexual harassment is wrong, but without the threat of legal penalties many people would still be pinching their secretary on the ass every time they walked in the room or worse.

    So, basically, because someone somewhere might use encryption for evil, and because the average voter doesn't have a clue what it's for, they have no problem with it being made illegal to prevent (in their mind) possible abuses.
    • ...because people equate morality with legality.


      And what is wrong with this? Laws are ideally a representation of the values that a society holds at large, things that a majority of us agree on. These values are based directly on our morals. The debates arise when we have a disagreement. You brought up that abortion is much debated; what is not debated is that it's against the law to murder someone.


      This is because we all agree it's wrong to kill someone -- Republican, Democrat, liberal, independent, Green, Labor -- it's morally wrong, so we have declared it illegal, to give us a mechanism to punish those in our society whose morals the majority of us would find appalling.


      That being said, I do believe we are a bit over-legislated.

      • by kcbrown ( 7426 )
        No. Not here in the United States, at least.

        The purpose of the U.S. is to provide a safe place for the people where their liberties are maximized (see the preamble to the U.S. Constitution if you don't believe me). Thus, the purpose of the law in the U.S. is to more clearly define where one person's rights and liberties end and another's rights and liberties begin. It is for this reason that it has been traditionally viewed that laws in the U.S. should be crafted to have as little impact as possible, to restrict the people as little as possible.

        This is clearly not how things are in the U.S. today, and that needs to be fixed (can't see how to do it, though, since the government is 0wn3d by the corporations). But in any case, if the purpose of the law is as I state, then morality legislation has no place in U.S. law, because a law is, in the general case, a restriction on a person's freedom. One may use "morality" to help define the boundary between one person's rights and liberties and another's, but pure "morality legislation", i.e. making something illegal simply because the society believes it should be, has no place in a society that values liberty above all else.

      • And what is wrong with this? Laws are ideally a representation of the values that a society has at large, things that a majority of us agree on. These values are based directly on our morals.

        You can't abdicate moral responsibility like that. It was established in war crimes trials that "only following orders" is not a legal defence. The problem is that many people have taken the attitude that if it's not explicitly illegal, it must be OK, and that's why we have thousands of petty laws passed every year.

        Besides, a conscience is a lot easier to carry around than a library of case law, even on CD ROM :0) Do unto others as you'd have them do unto you, it's as simple as that.

    • ... people equate morality with legality.

      Yeah, they do. Because there's not ONE definition of what's moral and what isn't -- it's a cultural thing. I think we'd all agree to say that, for example, fscking sheep isn't moral. And yet, in some civilisations, it was tolerated. You can call it barbaric or whatever -- it was still not immoral by their standards.
      I think if people tend to want to turn morals into laws, it's probably so that they can force their morals (say, that it's not moral to kill people in a video game) on everybody else.

      Whether that's a good thing is left as an exercise to the reader.
  • by MarkusQ ( 450076 ) on Tuesday September 25, 2001 @12:52PM (#2347562) Journal
    ...Poll Says Most Americans Favor Crypto Backdoors...

    [NYT]...discussing how some of the "encryption should be free for everyone" attitudes are changing with the WTC attacks...

    It doesn't matter what polls say, or how people's attitudes change; the fundamental issue is that crypto-backdoors, laws against strong crypto, etc. etc. are doomed to failure because they won't work.

    This is not to say that such laws might not get passed, causing untold inconvenience to law-abiding citizens, chilling research, and compromising our national security by giving crackers a weak point to attack; all I'm saying is that such laws mathematically can't serve their purported purpose.

    That is the message that needs to get out.

    -- MarkusQ

      • the fundamental issue is that crypto-backdoors, laws against strong crypto, etc. etc. are doomed to failure because they won't work.

      Er, yes, as the article says: "It probably is too late to take back cryptography even if people wanted to, experts say"

      The "probably" is temporising though. It's way too late.

      Anyway, bear in mind that the interviews behind this article are really just leveraging the view (that matches ours) that nanotechnology is coming, and if we don't do it, someone else will, so we'd best get our navel gazing out of the way quickly. The encryption angle is just a story hook.

  • I don't think making encryption illegal (or more difficult) would have had any effect on the WTC disaster.

    The simple fact is, many of these suspects are/were anonymous in the sense that they blend in with the crowd, and there was no reason to suspect many of them of planning such an act beforehand.

    Unless the government screens every single IM/email sent, by everyone (not feasible considering legal/search warrant issues, not to mention the manpower involved) there is no way to protect people from things like this being planned.

    Sure, you can track the online activities of known terrorists, but for every one you know about, there likely exist hundreds you don't. I think the real threat lies in the terrorists that we have no knowledge of as of yet, and have no reason to issue search warrants against.

    • Not to mention the fact that terrorists may not be using electronic encryption at all, but other types of code.

      i.e.
      "We'll meet for coffee thursday at two at Starbucks."

      equals

      "We attack insert landmark here Wednesday at one."

  • by smoondog ( 85133 ) on Tuesday September 25, 2001 @12:53PM (#2347572)
    As a scientist let me say I understand the concerns of society. I wish that some software developers would realize that as our society becomes more digitized, the power of programming becomes greater.

    Consider this. In the '40s a few great men and women created an awesome force with grave consequences: the nuclear bomb. A computer security scientist would never consider himself on this level of creation of power, nor should he. But what if a programmer develops a worm that destroys information perfectly, thereby bringing down an economy and possibly killing people? To go even farther, what if someone creates the technology that enables a terrorist attack, or enables that worm to exist?

    As we go farther into the digital age, programming is going to have more and more power and influence. Imagine if physicists were to take the arrogant attitude of today's security developers and say, "If I can build it, I should and also tell everyone else how to do it!"

    I just think that in some cases, we should really consider the consequences of our actions....

    -Sean
    • Re:As a scientist.. (Score:4, Informative)

      by Rogerborg ( 306625 ) on Tuesday September 25, 2001 @01:13PM (#2347721) Homepage
      • Imagine if physicists were to take the arrogant attitude of today's security developers and say, "If I can build it, I should and also tell everyone else how to do it!"

      You mean like Niels Bohr [idsa-india.org] and others did? This reference took me about 10 seconds to find; please don't insult us by rewriting history to suit an argument.

      That aside, I do actually agree with your point that inventors (and manufacturers) share in the moral burden of technologies.

      On the other hand, pragmatically, if we don't do it, someone else really will.

      On balance, I find myself agreeing with the NYT article's conclusion that it's a bitch of a decision and we need to find a thick skinned bastard to make it for all of us navel gazing pussies. (OK, I'm paraphrasing slightly...)

    • If such a tool is ever brought about, it's likely to be the result of years of research and development.
      And most likely, paid for by one government or another.
      The chances are, however, that such a use will be a twist on some other technology that's actually designed to advance science and benefit humankind to a huge degree. It always seems that when a huge new technology comes around, either it was developed by the military, or they very rapidly find a way to use it.
      I don't think it's the crackers you need to worry about thinking this up; it's the researchers in government labs. Any government.

      Malk
    • Re:As a scientist.. (Score:2, Interesting)

      by Telastyn ( 206146 )
      The one fault of this of course is that the atomic bomb always works. Drop it somewhere, and wham, you blow stuff up.

      Viruses and worms, and even encryption, require a medium in which to work. They only work because the medium they are in allows such things.

      Viruses and worms should always be discussed openly, as this is the best way to defeat them. Encryption should always be discussed because open discussion is the best way to ensure the cryptosystem is sound.

      The nuclear bomb cannot be defeated by open information, only by human conscience.
    • by NOC_Monkey ( 73018 )
      But what if a programmer develops a worm that destroys information perfectly, thereby bringing down an economy and possibly killing people? To go even farther, what if someone creates the technology that enables a terrorist attack, or enables that worm to exist?

      Well, considering that computers are the technology that allows that worm to exist, do you suggest we ban them? But computers use a lot of refined metals in their manufacture - and so do guns, knives, bullets, and swords. Should we ban metalworking, because it is "the technology that enables a terrorist attack"?

      Granted, this is an argument from extremes. However, when I think of a physicist choosing to hide their findings, no matter how revolutionary, no matter the potential for increasing our understanding of the universe, I am reminded of the Catholic Church in the Middle Ages, where commoners were not allowed to read the Bible because they might not understand it, thus creating heretics. Everything can be used for both good and evil purposes. There is no such thing as a "purely benign" invention or discovery. Physics - nuclear power/nuclear weapons. Electronics - pacemaker controllers/missile guidance systems. Biology - vaccines/viral warfare. Steelworking - building support beams/tanks. Blacksmithing - ploughs/swords.

      There are two sides to every coin

    • by _Mustang ( 96904 )
      Imagine if physicists were to take the arrogant attitude of today's security developers and say, "If I can build it, I should and also tell everyone else how to do it!"

      And what exactly would you consider the supercollider (and more exotic) infrastructure in use by educational institutions, then, if not exactly that?
      I mean really, smashing atoms and (trying to create) black holes certainly seems to meet the criteria for arrogance. And it has to be those physicists you mentioned, since no one else has the expertise to even dream that stuff up, much less implement it technologically. But to "tell everyone else how to do it" is exactly what peer review is all about; the idea is a very fundamental one for safe science, no?

      But - would I put a stop to it if I could? No, because the potential benefits to humanity and to me personally far outweigh the inherent dangers. I would like to see more control on these types of *things* but definitely do not wish an end to them. As someone who considers himself a hacker of the classic definition, I wholeheartedly believe in that old adage. Paraphrasing, I think it went something like "you go to school for an education, but to learn you need first-hand experience".
    • by Znork ( 31774 )
      You're making the mistake of thinking the old way, the same as the patent system. Scientists are no longer unique, nor do they do anything unique. Knowledge and research are by now so widespread and so distributed that invention has become an iterative process where there is no longer any question of whether something will be solved or invented, but rather of who, out of hundreds of scientists and teams, will do it first.

      The research and dispersal of information is by now inevitable. Go ahead, keep it silent, and read all about it next month when someone else goes public with it instead.
        • You seem to have a strange way of looking at the scientific process. Most things that scientists do are unique on at least some level. The fact that there may be other people capable of doing the same work doesn't change that. Only small parts of science are really iterative, and in many of those cases the work is still important for the purposes of gathering data, and each of those data points is unique. Not only that, you seem to be under the impression that science has become, in some sense, predictable. There are a great many discoveries waiting to be made and problems to be solved that no one has ever even conceived of, yet you seem to think that it's only a matter of time before they are made?? If what scientists were doing weren't unique, who else would be doing it? Who else but scientists are doing experiments in molecular biology, and also have the qualifications to do so? Who else but scientists continue to probe the fabric of space-time, and have the qualifications to do so? Etc., etc.; the list goes on. The first time an amazing new theory of physics is discovered by some bloke using the Google search engine, I'll accept your statements... until then, you either don't know what you're talking about or haven't adequately explained your position.
          • Well, research and advances are actually being made and published, perhaps not using Google, but using various abstract search engines. Comparing and expanding on other researchers' data is a time-honored way of doing research.

          But that isn't really the point. The point is that much broad science today is a competition. Take, for example, the human genome. There was no question about whether it would be mapped, only who would do it first. The knowledge about genome mapping is spread so widely, through the mapping of various other creatures' DNA, that the number of people who could do it ranges in the thousands or maybe tens of thousands, and the number of people who could learn to do it ranges in the millions or billions. Same with physics: who will be able to prove the existence of various particles first? Same with computer science, medicine, mathematics, etc. It isn't a coincidence that there are many different types of virtually unbreakable encryption today. It isn't a coincidence that several medical companies launch medicines for the same things within a few months or years of each other. Knowledge is spread to orders of magnitude more people today, and while it may take five smart guys a year longer than it would take a real genius to leap to a conclusion, it will be done.

          The point being: as a scientist, it doesn't matter if you try to keep silent about something because you fear the consequences. The discovery will be made, whether you keep silent or not.
    • That doesn't mean that the pursuit or release of knowledge should be restricted in any way.

      As a scientist I am more concerned with what other scientists are *doing* than with what they are *developing*. Our colleagues who developed the techniques to clone DNA into plant cells (a number of whom I know personally) did nothing wrong, and should not have delayed publication because of the "ethical" consideration of what someone else could do with it. The people who are genetically altering corn to make it increasingly resistant to chlorinated organics [greenpeace.org] (Roundup) are *doing* something unethical; and they are the ones, largely highly intelligent people, whom we need to reach and educate. Some of the things I'm attempting to do could have direct, terrible applications in germ warfare - but they could also be a great boon to medical research. The resolution of that dilemma is clear: we cannot call a halt to scientific progress because of fear.

      Other scientists, and some people may draw an increasingly meaningless distinction and call them engineers, are actually applying these developments to do things that shouldn't be done. Biopreparat [fas.org] doesn't exist anymore, but I'm sure biological weapons research continues. The people who nerve-gassed the Tokyo subway were highly educated. These people are doing more damage with their own scientific expertise than laymen ever can, or will, with something you release.

      Ethics requirements at graduate schools should be specific, factual and tailored to the particular focus of the student. Individuals who want to go into plant genetics should take courses in the political economics of third world agriculture - the same ones that pol sci students take. Courses in "ethics" are substanceless exercises in sophistry (say that 10 times fast) that don't teach the consequences of the particular actions a student might actually take.

      While relatively uneducated terrorists can make certain uses of publicly released technologies like culturing eukaryotic cells or near-unbreakable encryption, the *real* danger, and it is a real danger, is when we scientists ourselves are actually setting out to do harm, or applying these technologies in ignorance for our own economic gain.

    • I do. That is why I wish to make as much as possible open to everyone. I don't want any special power brokers. There is also simply no conceivable way to prejudge all possible applications of some basic bit of software technology. We should use our power wisely to increase the freedom and possible benefits to all and refuse to use our power to imprison others for the benefit of the few.

      It is not arrogant to vote for freedom and distributing the power as broadly as possible.
  • by Tackhead ( 54550 ) on Tuesday September 25, 2001 @12:55PM (#2347589)
    1) We saw what the media did to Zimmermann, trying to portray him as torn up over PGP. He's not.

    2) What if this is more of the same?

    But on to the original point - while Hellman admits his view of the NSA as "Darth Vader" was "human but ... ridiculous" - perhaps he's overlooking the number of people whose lives were saved by strong crypto?

    Or perhaps there's nobody in Tibet resisting the Chinese? Or perhaps there was nobody in the former Soviet Union using crypto during the coup? Or perhaps the Berlin Wall came down, in part, because people were able to communicate without Stasi eavesdropping on them.

    Or perhaps the women who infiltrated Afghanistan in defence of native women being slaughtered by the Taliban were only able to get their stories out -- stories that have been publicized time and again over the past few years, and that have nothing to do with the present crisis -- because they're able to communicate securely.

    If (and in light of the Zimmermann distortions, I see it as a very big "If") Hellman is having second thoughts about public-key crypto, I urge him to look at the good it's brought.

    NYC was One Big Atrocity. We'll never know how many Little Personal Atrocities Hellman's tech has prevented, but I'd bet it's in the thousands.

  • RSA encryption is basically this: raising one large number to the power of another modulo a third number. That's all there is to it. You can implement RSA in one line of Mathematica code.


    To me it's very scary that this might be outlawed. Will we have to outlaw multiplication modulo an integer? Maybe we'll have to set a limit on how large the numbers can be, so that you're only allowed to do this with small numbers. Maybe you'd be allowed to use large numbers if you're a licensed mathematical researcher. Maybe you'd be allowed to multiply large numbers but not be allowed to send the results by email. But well-encrypted data is indistinguishable from noise. Does that mean we need to make it illegal to send noise by email? But of course we can smuggle noise-like data through techniques like steganography - e.g. by hiding data in the low-order bits of images. Does that mean we would no longer be able to send noisy images by email - we'd have to filter them nice and smooth first?


    Where does it end?
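
    To make the point concrete, here is a minimal sketch of that arithmetic in Python rather than Mathematica. The tiny primes and the message value are toy numbers chosen purely for readability; a real implementation needs enormous primes, padding, and careful key handling.

        # Toy RSA sketch: encryption and decryption are just modular exponentiation.
        # The primes below are absurdly small and used only for illustration.
        p, q = 61, 53
        n = p * q                  # public modulus
        phi = (p - 1) * (q - 1)
        e = 17                     # public exponent, coprime to phi
        d = pow(e, -1, phi)        # private exponent (modular inverse; Python 3.8+)

        m = 42                     # the "message", encoded as a number m < n
        c = pow(m, e, n)           # encrypt: m^e mod n
        assert pow(c, d, n) == m   # decrypt: c^d mod n recovers the message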

      • To me it's very scary that [a mathematical operation] might be outlawed

      Been keeping up to date with the DMCA? There are a couple of binary implementations of DeCSS out there which have been expressed as prime numbers. Does possessing that number become illegal? If not, why not? Is it only illegal when it's on a computer that can run it as a DeCSS exe, or is it illegal on any computer, or is it illegal to even write down on a napkin? If not, why should it be illegal to write the DeCSS source on that napkin?

      We don't need any new laws to scare the crap out of us. The ones we have are quite nasty enough, thanks very much.

  • One Time Pad (Score:2, Insightful)

    by cs668 ( 89484 )
    This is sort of silly. It is not like terrorist organizations need high-tech crypto to be secure.

    They already have ways of disseminating information and money. If they want great security, all they have to do is create a CD full of random data and distribute it among their cells.

    I'm sure even the lamest coder could write code to do an XOR against part of that CD, and the result would be incredibly hard to decode.
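
    As a rough illustration of how little code that takes, here is a minimal one-time-pad sketch in Python. The pad file name and offset are made up for the example; the scheme is only as good as the randomness of the pad, the pad must be at least as long as the traffic it protects, and it must never be reused.

        # One-time-pad sketch: XOR the message against a slice of a shared
        # file of random bytes (e.g. a CD distributed to each cell).
        def xor_with_pad(data: bytes, pad: bytes) -> bytes:
            if len(pad) < len(data):
                raise ValueError("pad must be at least as long as the message")
            return bytes(b ^ p for b, p in zip(data, pad))

        with open("random_pad.bin", "rb") as f:   # hypothetical shared pad file
            f.seek(1_000_000)                     # pre-agreed offset into the pad
            pad = f.read(64)

        ciphertext = xor_with_pad(b"meet at the usual place", pad)
        plaintext = xor_with_pad(ciphertext, pad)  # XOR with the same pad decrypts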
  • Let's see that report.

    Like, on how extremist politicians foment the fires of war with their 'moral' agendas to 'counter the scourge'.

    It's not scientists that create wars, or manufacture enemies. It's politicians like George Bush and Osama Bin Laden (yes, he's a politician in every sense of the word).

    Scientists are just straw men, held up to burn, when a little push is needed to foment people's ire...
  • by Greyfox ( 87712 ) on Tuesday September 25, 2001 @01:12PM (#2347713) Homepage Journal
    Wait a sec. Name ONE human invention that can't. Everything from the pointed stick on up pretty much qualifies. So should we stop making everything because someone might use it for evil? Even your penicillin might accidentally save a future Hitler. Shall we throw away that technology because of that?

    I got news for you. You can't live a completely safe life. There is always the chance that something or someone will kill you, no matter how bizarre the circumstances. So you propose to live your life paralyzed by fear, never making progress because progress could be dangerous. Talk about cowardice. And the US will not remain a technological leader for long with that attitude.

    I put myself in situations where I could die (Arguably 10 times a week as I commute to and from work) because I refuse to live in fear. I enjoy hang gliding and hiking the short (2-3 mile) trails at the Rocky Mountain National Park even though doing so is putting my life at risk. Sure I could crash. Sure I could run into a bear or a mountain lion that would think I'd make a tasty and delicious snack. I see dozens of people each trip up to the park who never even think you could die up there. Every year a few idiots get gored by pissed off elk. Fucking Disneyland Mentality. People die in the amusement park. No place is completely safe. You will never make anyplace completely safe. You will eventually die, one way or another. Deal with it.

    • Some human inventions involve power (H-bombs) or mechanisms (designer viruses) that can have catastrophic consequences. One evil act (or unthinking act or unknowing act) and you have the end of the world as we know it (or too close to it for comfort, e.g., the Cuban missile crisis).

      At the moment, we have been able to prevent such acts (e.g., bin Laden getting 100 H-bombs), but prevention is much harder than a post mortem.

      No, I don't think evil people using encryption is a world-ending catastrophe. Anyone who thinks so is just plain silly. Using encryption by itself doesn't harm anybody. Other technologies are different, though.

      • SOMEONE was going to invent the H-bomb, and better us first than Nazi Germany. We didn't stop to think of the ethics involved because we knew that our enemies wouldn't. As for designer viruses, again, technology won't stand still just because we do. Our best bet is to know more about the technology in general than anyone else. Then when someone else whips one up and releases it, we'll be able to counter it, hopefully before millions die as a result. Many of the same questions will arise with nanotech, too, and many of the same answers will, too.
  • by aphor ( 99965 ) on Tuesday September 25, 2001 @01:12PM (#2347717) Journal

    This issue was already explored by the Internet community, and the cypherpunk manifesto From Crossbows to Cryptography [berkeley.edu] explains the issue, though some of us find our collective selves on the other side of the coin from the cypherpunks this time.

    The issue is power, which privacy confers because anonymity is impunity; authorship is one of the critical facts concealed by any encrypted parcel. Technology originates with the powerful, in order to confer more power on them. However, the technology itself is information, which escapes by multiplying itself in unacquainted minds, eventually reaching minds outside the power elite that devised it. The balance of power falls back to somewhere between the power elite and the subject people.

    Now all of this exists independent of ethics. No doubt the power elite would like the subjects to restrain their use of the technology on a principle that does not bind the power elite. Ethics are weak (subjective and voluntary), but they are at least sometimes effective.

    Where this leads us is to the question: should we develop new encryption technology? Should we implement key escrow? I urge you to think long and hard about the cold facts of how any of those possibilities can be abused. Experts agree that without strong cryptography (even for terrorists) democracy will fail. This is a new world and requires acute wisdom to set the direction we move next. Freedom of speech is not an option or a privilege; it is a right without which people cannot guarantee governance by consent.

  • Legitimate Fears (Score:2, Insightful)

    by eAndroid ( 71215 )
    The people making these laws aren't just trying to maintain heavy-handed control over the American people (at least not all of them). They have legitimate fears seeded in truth and actual occurrences. However, they are taking the wrong perspective.

    During WW2 the Germans had what was then very strong encryption with the Enigma machines. The Allies were only able to get hold of these machines later in the war, and by several accounts this was crucial to the Germans' defeat.

    Our lawmakers fear another enemy rising and using impossible-to-crack encryption, and this time using it to win. Most of us here on Slashdot, however, realise the obvious flaws in this logic. Just as WWI transformed war into a battle fought in trenches, the security of each side's communications will likewise change the face of war. We would be better off adapting to the changes in war than expecting everyone to play by our rules.
  • Just Imagine. (Score:4, Insightful)

    by atathert ( 127489 ) on Tuesday September 25, 2001 @01:17PM (#2347757)
    The year is 1903, the location: Kitty Hawk, North Carolina and YOU ARE THERE!


    We see the Wright Brothers standing near the first ever airplane, moments before it takes off for the first ever powered flight. As they begin to board the craft, a reporter informs them that their invention will be used to kill thousands of people, destroy a building, and drastically alter the fabric of the nation that they love so much. They also are told about the untold number of deaths caused by warplanes, including the dropping of the bombs on Hiroshima and Nagasaki, as well as all the other armed conflicts that used this wonderful invention. Finally they are told about the numbers of people that will die as passenger planes crash into hills, oceans, and fields all across the world.


    Instead of flying the plane, they decide that the risks are too great, and scrap the whole invention. Upon hearing the details about the possible future of the machine, Congress legislates that it is illegal to develop, own, or operate such manned flying machines...


    Just imagine.

    • Re:Just Imagine. (Score:3, Insightful)

      by ellem ( 147712 )
      and so France makes one. Math is not owned by the USA, and math makes encryption. I understand that there are smart people outside of my country, although I haven't personally encountered any.

      I like the idea of your story but it only works if the whole world is run by the USA.
  • by Robber Baron ( 112304 ) on Tuesday September 25, 2001 @01:19PM (#2347767) Homepage
    After about the 4th day I stopped watching the "news" coverage of the WTC disaster. Basically about the same time the talking heads ran out of things to say. Wake me up when the barrage of pseudoinfo-diarrhoea ceases and they've got something new to say.

    We don't even know if the terrorists used encryption. We do know that they used American technology against Americans. Technology manufactured by Boeing... gee, we don't hear Boeing engineers wailing about the "ethics" of design features of the 767, do we? Besides, smart people in other countries write encryption all the time... how are you going to stop that? What if they simply used a seemingly innocuous set of phrases with pre-determined meanings?

    This article is nothing more than more of the same pseudoinformation (propaganda?) that the American media has been bombarding us with. The corporate propaganda machine is in full cry, preparing Joe-sixpack for the loss of freedom that is soon to come. Herr Goebbels would have been proud.

    What about all the technological advances by the Americans that allow them to exert brutal dominion over other parts of the world? A discussion of ethical concerns and science could prove most embarrassing to America.

    In any case, scientists should only concern themselves with "is it possible?" not "should we make it available?"
    • At this point, we can pretty much say that encryption was not used. If the various organizations that begin in darkness and end in the letter "A" had been able to come up with something -- anything -- other than "We Blew It" as a reason for not catching this, they would have mentioned it by now.

      With a plan like this, set up years in advance and not needing to be executed on any specific date, they only needed to transmit one of two messages: 1)"Proceed According To Plan" and 2)"Stop; Wait For The Next Courier From Jihadistan". It's trivial to come up with two utterly unnoticeable code words.

  • by serutan ( 259622 ) <snoopdoug@RABBIT ... minus herbivore> on Tuesday September 25, 2001 @01:23PM (#2347790) Homepage
    In my opinion Martin Hellman is no more responsible for the WTC attacks than Rod Serling, who originated the idea of airline hijacking in his 1966 movie, "The Doomsday Flight."

    For the rest of his life Serling regretted putting this concept into the public mind. But it was only a matter of time before somebody figured it out. At that time there were no metal detectors. Airports were like high-class bus stations. It wasn't Serling's fault that the security systems we have become accustomed to, as well as those we are going to start seeing now, are installed only after damage has been done rather than after the warnings have been sounded.

    Like it or not, we have had the technology tiger by the tail for a long time. Cropdusting planes were grounded nationwide this weekend because of the possibility of biochemical attack. Why now? Cropdusting planes and biochemical weapons have both been around for ages. The possibility of putting them together didn't just pop into existence last week. It's one of many things that the authorities have long known could happen, probably will happen, but hasn't happened yet so no need to alarm people.

    I'm sure quite a number of freedoms we have long enjoyed, simply because nobody has figured out how to wreak mayhem with them, will be going away soon. But don't blame it on Martin Hellman or Rod Serling, or the first proto-human who noticed that you could use a stick to hit stuff with. Blame it on the fact that some people are just assholes.
    • I still like the idea of upgrading the security within the cockpit. Whether it's a thumbprint scanner on the controls or a retina scanner for access to the cabin area, some technology upgrades are probably due.
  • by sharkey ( 16670 ) on Tuesday September 25, 2001 @01:26PM (#2347810)
    One nanotechnology expert, Glenn H. Reynolds, a law professor at the University of Tennessee, said that someday it might even be used to make tiny robots that would lodge in people's brains and make them truly love Big Brother.

    Well, they'd have to. That show fucking sucked.
  • Can't stop thought (Score:3, Insightful)

    by glasslemur ( 238045 ) on Tuesday September 25, 2001 @01:30PM (#2347839)
    Cryptography is based on math formulas. Last time I checked, knowledge of math was not confined to the US. Basic cryptography can be done with very large prime numbers: not a difficult math concept, but their product is hard as hell to factor.

    Besides, any idea, over our entire history, was probably not thought up by only one person, even though usually only one person gets the credit for it.

    Preventing someone from advancing in ANY technology only puts them behind. If a US mathematician doesn't think of it and publish it, someone else will. To protect against something, you have to understand how it works first. You have to have guns with bullets to make bulletproof vests. You have to have a virus to find the cure. (I hate bad analogies, but since they're all the rage...)

    I think the farther cryptographers and mathematicians advance, the more useless the old technology becomes. Remember RSA Labs' 56-bit key challenge?

    Thoughts and ideas should never be outlawed.
  • War (Score:3, Insightful)

    by KjetilK ( 186133 ) <kjetil AT kjernsmo DOT net> on Tuesday September 25, 2001 @01:38PM (#2347925) Homepage Journal
    From the article:

    People who want to hurt you can find a way to do it.

    Oh yes. And that is why the only option is to make sure nobody wants to hurt you. From the Russell-Einstein Manifesto [pugwash.org]:

    Here, then, is the problem which we present to you, stark and dreadful and inescapable: Shall we put an end to the human race; or shall mankind renounce war?

    It's up to you.

  • Why should encryption researchers be the only ones to feel bad? What about people in the knife industry, in the airplane industry, ...?

    I'm pretty sure that even if the hijackers had used guns, Americans would still have thought that banning crypto is better than banning guns.
  • Science is a collection of truths and hypotheses. In and of itself, it can be likened to collecting sea shells. (In fact, I think Carl Sagan probably did.)


    The sea shell still exists, whether person A picks it up or not. Likewise, an algorithm still makes a sound in a forest, whether or not there is anybody to hear it. (Oops! Mixed metaphor! :)


    In consequence, if a researcher comes up with an algorithm for encryption, then that researcher did not "cause" that algorithm to exist; it existed all on its own and the researcher happened to find it.


    For this reason, criminalizing certain algorithms is stupid. You might as well order the tide not to come in (ask King Canute how well -that- one worked!), or not to have white sea shells.


    HOWEVER, if you argue from the standpoint that the algorithm has independent existence and is a fundamental fact of nature, whether it's being used or not, you run into a problem -- you can't patent something that was never created. You can copyright an implementation, sure, the same way you can copyright a photograph of a mountain, but all software patents would go right out the window. Personally, I like that idea a lot, but I respect the fact that a lot of other people would be horrified at the notion. I also respect the fact that there are a lot more of them than there are of me, which makes my preference on the matter somewhat irrelevant.


    When all is said and done, though, the ethics of any kind of research ultimately hinges on whether you view something as a discovery or a product. If it's a discovery, ethics aren't involved. You can't blame a mountain for the actions of others. If it's a product, then its creator carries some responsibility.


    That doesn't mean that "discoverers" are able to escape the consequences of their actions, but it is ONLY their actions they can be held accountable for, NOT the discovery.

  • Funny thing. (Score:4, Interesting)

    by supabeast! ( 84658 ) on Tuesday September 25, 2001 @01:48PM (#2348044)
    Anyone notice that the mainstream media is doing plenty of coverage of the awful hackers who post free encryption, but very little coverage of things like ethics and airline security? I can't remember the last time I saw anyone in the media write about the fact that there are hardly any checks on people who buy huge quantities of fertilizer that can be used in truck bombs.

    While much of the media coverage of encryption lately has been somewhat insightful, most of it is reactionary crap. The media is afraid to demonize airlines for mismanaging their entire industry so badly that they cut corners, often illegally, on security. Maybe it has something to do with the massive amounts of advertising airlines pay for every year, especially right now when they are advertising dirt-cheap fares to try to woo back scared travelers.

    It just goes to show the biggest downside of massive media corporations; instead of being accountable to the masses, they are accountable to the advertisers.

    I will close with a quote, source unknown:
    "The media is only as liberal as the companies that own it."
  • by LordNimon ( 85072 ) on Tuesday September 25, 2001 @01:53PM (#2348109)
    Here's a letter I sent last week. I posted this on another thread, but here it is again for those who missed it. I'm allowing anyone to use this letter as a template for their communiqués, on the condition that you modify it so that it doesn't look like a complete rip-off. (A small illustration of the letter's "encryption is just math" point appears after it.)

    -------------

    Dear Senator/Congressman:

    This week, you and all other Congressmen are very busy preparing new laws and modifying existing ones to help the United States combat terrorism. Unfortunately, I fear that some of these laws will do more to restrict loyal Americans than actually stop terrorists. I hope you can take a few minutes out of your schedule to read this letter.

    To put it bluntly, restrictions on encryption technology are pointless. There have been reports that the terrorist networks responsible for the World Trade Center attack used encryption technology in their communication. Many people, none of whom truly understands technology, believe that if there had been limits on encryption, it would have hampered the terrorists. This assertion is absurd.

    Encryption is nothing more than a field of mathematics, where the data to be encrypted is treated as a bunch of numbers. Placing legal limits on encryption is the same as outlawing certain kinds of math. One of the worst ideas being proposed is to force individuals and companies to use encryption technologies for which the government has "back door" access. That is, the government is in possession of secret keys that can decrypt any data which is encrypted using these particular algorithms. Other encryption algorithms which don't allow for back doors would be outlawed.

    The flaw in this reasoning is that it is impossible to force terrorists to use "approved" technology. We don't even know who or where they are, so how can we force them to do anything?!? The terrorists will simply use "non-approved" encryption technologies while honest American citizens and businesses are forced to sacrifice their privacy. The worst part is that if other countries were to ever obtain these secret keys, they would have access to every piece of encrypted data from the United States.

    The truth is, strong encryption protects Americans. With strong encryption, terrorists won't be able to decrypt sensitive corporate data. They won't be able to spy on American citizens. They won't be able to intercept top secret transmissions.

    These terrorists were able to strike not because they used encryption, but because our intelligence organizations are incompetent. The FBI is better known for its blunders (e.g. the Atlanta Olympics bombing, the siege at Waco, the assault at Ruby Ridge, and the 3000 documents in the McVeigh case) than for its successes. In fact, it's been over a week since the attack, and the best our government can say is, "We're pretty sure that Osama bin Ladin is the prime suspect."

    Therefore, I am asking you to reject any bills that place limitations on the use of encryption. Instead, I think you should focus on how to improve our intelligence-gathering organizations. Perhaps in exchange for bailing out the airline industry, federal officials from the intelligence organizations should get free flights for the next ten years. The money saved can be used to fund more operations.
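
    To make the letter's "encryption is just math on numbers" point concrete, here is a toy sketch using textbook RSA with absurdly small numbers -- purely illustrative, not a real cipher and not anyone's actual implementation:

        # Toy "textbook RSA": encryption is just arithmetic on integers.
        # Real systems use keys hundreds of digits long plus padding schemes.
        p, q = 61, 53                      # two small primes (kept secret)
        n = p * q                          # public modulus: 3233
        phi = (p - 1) * (q - 1)            # 3120
        e = 17                             # public exponent, coprime to phi
        d = pow(e, -1, phi)                # private exponent (2753), Python 3.8+
        message = 65                       # the "data" is just a number
        ciphertext = pow(message, e, n)    # encrypt: m^e mod n  -> 2790
        recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n  -> 65

    Outlawing "non-approved" strong encryption amounts to outlawing particular uses of the two pow() calls above, which is exactly why such a ban cannot be enforced against people who already ignore the law.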

    • PS, I once wrote a similar letter about gun registration in Canada ... we don't allow personal small arms (in most cases), but you are allowed to own a hunting rifle. Since most crimes in Canada involving guns are committed with guns that were illegally acquired in the first place, it was hard for me to figure out why the mandatory registration of hunting rifles would stifle gun-related crime ... but simpler people don't seem to grasp these concepts.

      "They're already criminals ... duh" comes to mind everytime one of these things comes up.
    • Just one thing I would disagree with here: "because our intelligence organizations are incompetent." They often seem incompetent, and are certainly overly cautious about sending agents into dangerous situations, but to be fair, no one has had much success at infiltrating Middle Eastern terrorist organizations. Their cells are usually family-based, in societies where family comes before everything else. An agent can't get in unless he IS related -- and then he wouldn't betray his family... And Bin Laden has thousands of such families to choose from. This plot may have involved fewer than 100 people, with only a dozen who knew the plan. If Bin Laden couldn't find a team that was completely loyal, not known to our agencies, and had the sense not to inadvertently reveal the plan or themselves, then _he_ would be quite incompetent.

      So what could have stopped it? It's quite simple. In defending a military position, you need at least three rings of defense. Outermost is a light screen that you hope will provide advance warning of an attack; in this case, that's intelligence, and it's not ever going to be completely reliable. Second is an area defense: that's the metal detectors and x-ray machines at the airports, and in spite of depending on $6/hr rent-a-cops, it worked as planned. No weapon worthy of the name was used. Third, you need to defend the target itself -- and that is where we failed. The hijackings could have been stopped by one air marshal, one copilot with a gun, two on-leave marines with sticks, or a dozen passengers throwing carry-on luggage. But the passengers and crew were not only disarmed, but also brainwashed into not resisting...
      • I was wrong, it was incompetence. Just found a Washington Post article [washingtonpost.com] about how the FBI did know that suspected terrorists were taking flight lessons, and convicted one man in 1995 of plotting to kamikaze CIA headquarters.

        But intelligence failures like this are common where any really daring plan is concerned. Intelligence analysis is like trying to build a jigsaw puzzle with only 10% of the pieces -- it's easy to get it totally wrong, and the analysts are quite aware of this. So when the analysts just don't believe the other guys can be thinking that big or would take such risks, they might make the connections, but then don't believe it. Pearl Harbor is one obvious example, but not the worst one -- that's a tie between Stalin's utter surprise when Hitler invaded in spite of all sorts of warnings (not to mention the obvious fact that Hitler HAD to go for the Soviets' oil fields while his tanks still had enough fuel to get there), and Hitler's total belief in the disinformation spread by the OSS that the big landing was going to be in Brittany, Normandy was just a feint...
  • I'm sure this attack is just a great opportunity for them to attack crypto, "hackers", and all non-PAC-donating computer users everywhere, but golly, wouldn't it be neat if we addressed the reasons why the US is hated by so much of the world, rather than outlawing math?


    We funded Bin Laden, we trained him, and when the chickens come home to roost, we figure outlawing numbers or, even worse, intentionally building a back door into your crypto will make it all better.


    Write your Congressperson / Senator with strong words about strong crypto, and all the other attacks on our civil liberties that are in the works right now.

  • GMOs in the wild? (Score:3, Insightful)

    by code_rage ( 130128 ) on Tuesday September 25, 2001 @02:15PM (#2348302)
    The article says that geneticists delayed the development of transgenic technology in the 1970's until scientists' fears of germ warfare could be assuaged.


    Hmm. Does this mean that "safeguards" were developed (I cannot imagine what safeguards *could* be developed)? Or does it simply mean that scientists became "comfortable" with the idea, after the passage of some time?


    Currently, the big biochem companies like ConAgra and Monsanto are experimenting with our ecosystem, releasing Genetically Modified Organisms into the wild. Forget sabotage or terrorism, we may screw things up by "accident". Anyone else worried about that?


    • The article says that geneticists delayed the development of transgenic technology in the 1970's until scientists' fears of germ warfare could be assuaged.

      Hmm. Does this mean that "safeguards" were developed (I cannot imagine what safeguards *could* be developed)? Or does it simply mean that scientists became "comfortable" with the idea, after the passage of some time?


      Well, for starters, safe vectors (DNA fragments allowing gene cloning and replication in bacteria and yeast) and "disabled" bacterial strains were developed, ensuring that cloned genes could not replicate outside the laboratory. This wasn't to allay fears of germ warfare but more to ensure that laboratory experiments didn't "escape into the wild" if a careless student poured bacteria down the drain. Of course the time wasn't just so that scientists could become comfortable with the risk; it was the couple of years needed to develop the technology that made genetic engineering research safe to do! Contrary to popular belief, we scientists are thoughtful, intelligent people and not just a bunch of wild-haired crazies (I for one comb my hair regularly) running around telling Igor to pump up the juice so the monster will wake up...

  • It's common knowledge that the NSA eavesdrops on all communications worldwide and passes what it feels is relevant to the CIA. The CIA is notorious for passing what it feels is relevant on to American corporations. By corollary, it seems that back doors will give US corporations an even greater advantage over the rest of the businesses in the world. We all know that US corporations are too moral to take advantage of this. But still.....
  • This article is debating the issue of developing a technology if it has the potential to do harm. Due to recent terrorist attacks, there is a lot of discussion going on in regards to cryptography and nanotechnology. The main gist of the article is that we should develop technologies, even if they can be used for harm. If we don't develop the technologies, we may lose the benefits that we can gain from them - also, if we don't develop it, someone else will and we will be unprepared to deal with it.

    I am not 100% sure where I stand on this issue. As far as encryption technology goes, I am all for it, regardless of the potential for abuse that it has. Encryption is essential to business operations today - without it we just wouldn't have the economy that we do.

    Nanotechnology is interesting, and has the potential to be a great boon to our society. If it can create truly microscopic robots that can be used to save lives, all the better.

    What I don't like is the attitude of the statements in the article - if we don't create it, someone else will, and they will use it for harm. It depresses me that human society is like that. It is quite similar to the development of nuclear weapons - we had to do it before someone else did. It's like a race on a pair of treadmills - each one is racing faster and faster yet they are getting nowhere and they will never win.
  • Famous quote? (Score:3, Interesting)

    by Java Pimp ( 98454 ) on Tuesday September 25, 2001 @04:40PM (#2349311) Homepage
    "If cryptography is outlawed only the outlaws will have cryptography."


    Outlawing encryption is not going to stop people from using it for some malicious purpose. Outlawing guns is not going to stop armed robbery. Outlawing nuclear technology is not going to stop the bomb.


    It really doesn't matter what you create/invent/discover scientifically or technologically; people will find a way to use it to kill people. And the governments of the world are the biggest example of this. One of the first questions asked of any new technology is how it can be applied to the military. I mean, what was one of the first uses of nuclear technology?


    What is the question here? Should we not perform any scientific research? Should we not improve our technology? Or, if we do, should we just not share it with anyone? (Including ourselves, there are of course spies and criminals among us.) If that's the case, how could anyone benefit from it?


    To not strive forward with technology because evil-doers might use it is absurd! Even though technology is used by a select few to harm others, the benefits far outweigh the unfortunate "evil that men do."

  • At least from a European perspective, there doesn't seem to be nearly as much discussion about the appropriateness of exporting REAL weapons. Cryptographic software could be used by enemies to communicate, but some of the weapons the US exports could be used by them to actually kill, and are used daily by governments and regimes to kill innocent people, breeding new legions of terrorists. And while I see many good uses for PGP, my impaired imagination is not able to find any for a missile launcher. Maybe it's just that the RSA corporation can't lobby the US government as effectively as all the weapons makers.
  • Do you remember a few years ago when the US could not export encryption technology? At the same time, Europeans were allowed to use equivalent technology and also export it. But when it was exported to the US, the Americans were not allowed to re-export it.

    Think about it. Do we really think that US restrictions will really stop the rest of the world?? Are we that arrogant?

    There might be the odd mathematician out there!
  • from the article (Score:3, Insightful)

    by Technodummy ( 204943 ) on Tuesday September 25, 2001 @11:46PM (#2351180)


    This is one of the most insightful comments I've read about threats from technology:

    "We spend a lot of time worrying about extremely sophisticated threats," he said. "But less sophisticated threats can slip under the radar. People who want to hurt you can find a way to do it."

    This can only be underlined by the events of September 11, where box cutters were used to destroy the WTC.

    Thomas Jefferson said, "The price of liberty is eternal vigilance."

    Vigilance is the answer, not locking the barn door after the horse has bolted.

    Apologies for mixing quotes and clichés

  • This is absolutely fucking ridiculous. Why are hard-working, intellectual, middle-class, "white collar" tech-geeks some sort of "fringe group" or "underclass"? Are we really *that* different from the mainstream? The only thing I can think of is politics. "They" don't like our radical ideas, the subversion of the status quo.

    Anywho, it is completely bogus that the inventor of PGP would come under attack in a time like this. What about all the major corporations around the world (General Electric prime among them) who, through poor controls on nuclear technology, contributed *heavily* to nuclear proliferation? Pakistan and India have the bomb in the here and now because of technology sold (or given) to them by the west (by the US, by Canada, by Germany, by the UK).

    And let us not forget all the corporations selling conventional weapons. Especially so since, unlike nuclear weapons, these ARE used. France sells Mirage fighter jets and anti-shipping Exocet missiles to just about anyone and their brother. Corporations in all western countries sell arms of all kinds to countries of all kinds. A timely case in point would be the US-made F-16s Israel has been using to attack Palestine.

    Philip Zimmermann is far from a wealthy man for his efforts to create a free encryption program. And yet worldwide arms sales topped 35 *billion* dollars last year, half of those sales from the US. And more than 2/3 of those weapons sales were to poor countries. Yes my friends, the industrialized world is the equivalent of the sleazy gun shop on the street corner in the ghetto.

    And yet, do the news media outlets turn a scornful eye toward these activities? Toward the generation of cold hard cash by selling weapons of war to the poorest countries of the world? It really makes you wonder....
