Science Technology

Why Engineers Must Consider the Ethical Implications of Their Work 406

An anonymous reader writes "An article by Abbas El-Zein at The Guardian explores the ethical responsibilities for engineers who create and maintain 'technologies of violence.' He says, 'Engineers who see themselves as builders of the shelter and infrastructure for human needs also use their expertise in order to destroy and kill more efficiently. When doctors or nurses use their knowledge of anatomy in order to torture or conduct medical experiments on helpless subjects, we are rightly outraged. Why doesn't society seem to apply the same standards to engineers? There is more than one answer to the question of course, but two points are especially pertinent: the common good we engineers see ourselves serving and our relationship to authority. ... Our ethics have become mostly technical: how to design properly, how to not cut corners, how to serve our clients well. We work hard to prevent failure of the systems we build, but only in relation to what these systems are meant to do, rather than the way they might actually be utilised, or whether they should have been built at all. We are not amoral, far from it; it's just that we have steered ourselves into a place where our morality has a smaller scope.'"
  • Already does. (Score:5, Interesting)

    by jythie ( 914043 ) on Thursday December 05, 2013 @02:07PM (#45609991)
    While it is hard to draw exact parallels, society already holds engineers to similar standards to doctors. The outrage over doctors experimenting on helpless test subjects is pretty similar to, say, when engineers use live subjects for testing weapons.
    • by perpenso ( 1613749 ) on Thursday December 05, 2013 @02:16PM (#45610139)

      While it is hard to draw exact parallels, society already holds engineers to similar standards to doctors. The outrage over doctors experimenting on helpless test subjects is pretty similar to, say, when engineers use live subjects for testing weapons.

      Yeah. The article's author is making a poor analogy. Blaming engineers would be more akin to blaming the scalpel designer for the doctor's experimentation. It's not the scalpel or the gun that is the problem, it is the mind and the intentions behind the hand holding the scalpel or gun. Both can be used for good or bad.

      Short of WMD the issue is not as simple as the author suggests.

      • by bob_super ( 3391281 ) on Thursday December 05, 2013 @02:46PM (#45610607)

        "when the rockets go up,
        who cares where they come down?
        it's not my department,
        says Wernher Von Braun."

        He didn't say that, but that summarizes the lives of thousands of people who make a living designing and building weapons (except a few fanatics who relish the thought that their weapons kill $bad_guy)

        • by perpenso ( 1613749 ) on Thursday December 05, 2013 @03:01PM (#45610821)

          "when the rockets go up, who cares where they come down? it's not my department, says Wernher Von Braun."

          He didn't say that, but that summarizes the lives of thousands of people who make a living designing and building weapons (except a few fanatics who relish the thought that their weapons kill $bad_guy)

          That summarizes perhaps a small minority of those who design and build weapons. Let's consider the M1 Garand rifle of the U.S. Army and Marine Corps. It was designed during peacetime in the 1920s and 30s. It was used to destroy the Third Reich in Europe in the 1940s, and in 1970 it was used by panicked National Guardsmen to kill students at Kent State University in Ohio. Most of the engineers envisioned a use of the "destroy the Third Reich" type, not the Kent State type.

          • My point, which is close to your point, is that the engineer designs a tool. The manufacturer builds a tool. They sell it to someone who the government says will be using it within the framework of the law, or at least in accordance with our interests.

            Since the end of the Korean war, it takes longer than most US wars have lasted to build a _new_ weapon (Afghanistan is the exception that confirms the rule). So you don't start designing a weapon thinking how great it will kill $bad_guy, hoping they're still shoo

      • by TheLink ( 130905 )
        But on a related topic, I think we really should start seriously considering whether we should postpone certain paths of research, instead of just doing things because we can. Too often we are doing things just because the technology is ready. Whether society and laws are ready doesn't even get considered.

        For example: the creation of viable human-animal[1] hybrids may be possible in the future. Same goes for certain mixtures of human, cyborg, animals, AIs etc. But if we are not ready to decide whether t
      • Head in the sand (Score:3, Insightful)

        by sjbe ( 173966 )

        Short of WMD the issue is not as simple as the author suggests.

        Really? Explain to me what purpose an M1 tank or an F-22 fighter has besides killing people. What humanitarian purpose do land mines serve? Assault rifles? (Target shooting? Don't make me laugh.) Hand grenades? Let's not pretend that the engineers working on these products have no idea what they will be used for. Plausible deniability does not apply to a lot of weapons.

        There are many technologies where the line between ethical and not-so-much is fuzzy but you hardly have to go to WMDs to get there.

        • Presumably most of those engineers would assume these technologies would be used to defend their friends/family/people they care about. I'm sure you would also be the first to put these engineers up against the wall if they refused to design anything that could defend against a bolt-action rifle. A single shot is all you need for hunting or target practice, after all; anything more is just to kill people.
      • by Ghostworks ( 991012 ) on Thursday December 05, 2013 @03:45PM (#45611599)

        When doctors or nurses use their knowledge of anatomy in order to torture or conduct medical experiments on helpless subjects, we are rightly outraged. Why doesn't society seem to apply the same standards to engineers?

        Whenever I read something like this, I immediately think of Florman's "Existential Pleasures of Engineering" [amazon.com]. Despite the title, Florman's book is actually a spirited apology for the engineering profession, written in an age when everyone was lamenting all the modern horrors that those damned engineers could have prevented if they had just been more ethical.

        As Florman notes, there has been a large focus for the past half-century on making engineers more ethically aware, and it's mostly pointless. Despite what most people seem to believe, engineers are not philosopher kings, any more than Technology is some sort of self-sufficient, self-empowering beast working counter to the benefit of human society. Both do exactly what the rest of society tells (read: pays, begs, and orders) them to do, and nothing more. And while you don't see many engineers saying this -- because when someone tells them that they run the world and hold the future of all mankind in their hands, people are disinclined to temper their ego and deny it -- we only do what the suits pay us to do, and if we don't do that they fire us and move on to someone else who will.

        Let's ask this another way: why aren't businessmen considering the ethical implications of their investments? Why aren't militaries, bureaucracies, and governments considering the ethical implications of their orders? Why isn't the average person taking five minutes to understand a problem now so he doesn't demand government, the market, and God on high give him an answer that he's going to hate more than the original problem a year from now?

        Every profession has ethical considerations. More ink has been spilled and time spent on the subject of ethics in engineering and the practical sciences than on any discipline save medicine. And yet it does not solve the problem, and will not solve the problem, because that is not where the problem lies.

        • Let's ask this another way: why aren't businessmen considering the ethical implications of their investments? Why aren't militaries, bureaucracies, and governments considering the ethical implications of their orders? Why isn't the average person taking five minutes to understand a problem now so he doesn't demand government, the market, and God on high give him an answer that he's going to hate more than the original problem a year from now?

          TFA presumes that we as engineers are smarter, better humans than everyone else, and that we can and should foresee all of the ethical implications of our work. It's pure hubris, and I call shenanigans. By and large, engineers do what their employers pay them to do, to feed themselves and their families. IMO, every link in the chain should be held to the same standard of moral accountability. We're not exempt, but it's unreasonable to expect us to take any more (or less) responsibility for the bad things that

    • I think the point TFA is attempting to make is that engineers should be ethically prohibited from designing weapons at all.

      (Or, perhaps more relevantly and/or reasonably, from designing technologies that enable the NSA's unconstitutional spying.)

      • Re:Already does. (Score:5, Insightful)

        by TechyImmigrant ( 175943 ) on Thursday December 05, 2013 @02:48PM (#45610629) Homepage Journal

        Some of us choose not to design weapons. It isn't theoretical. I've turned down job offers that turned out to be essentially for improving ways to kill people.

        Some of us choose to design technologies that work against the NSA's unconstitutional spying rather than for it. Again this isn't theoretical. I've been presented with some choices and taking the high road is ultimately easier to live with, even though people don't thank you for it at the time.

        The ethical questions for engineers are far, far simpler than those for doctors or politicians. Safe, good; unsafe, bad. Protects people, good; exposes people, bad. Kills or injures people, bad; saves people, good.

        Maybe in a world with an aggressor and no ready defense technology, the moral landscape would look different. But there is no shortage of military technology. I can choose not to add to it. To add to it is immoral. To not add to it is moral.

        • I fail to see what is inherently immoral or unethical about designing weapons or other technologies for your own country's defense. If you can prevent that missile from landing on an unintended target (civilian) or prevent terrorists from blowing up your fellow citizens, that is also a morally good choice.
        • by Anonymous Coward on Thursday December 05, 2013 @03:22PM (#45611145)

          Some of us choose not to design weapons. It isn't theoretical.

          Actually it is in a way. This strategy only works because you are protected by other engineers who design the weapons that protect you, whether you approve of those weapons or not.

          There is nothing wrong with your moral choice, but let's face facts. Pacifists can only exist in absolute isolation or where they are protected by friendly non-pacifists. In the real world there will be unfriendly non-pacifists who will subjugate, enslave or kill you. Regrettably, this is the way some humans are wired.

          In the book "Guns, Germs, and Steel" a warlike group of Pacific islanders is mentioned. A subgroup colonizes a new island, loses contact with the original group, and in isolation becomes pacifist. When contact is reestablished, the subgroup is enslaved. This was done to blood relatives separated by only a small amount of time (in generational terms), with a common culture, language and religion.

      • Re:Already does. (Score:5, Insightful)

        by jellomizer ( 103300 ) on Thursday December 05, 2013 @02:53PM (#45610701)

        That is what I got out of it.

        Think of the engineers and scientists who made the A-bomb:
        1. Don't help, and you will be the reason for a sustained war costing millions of lives, mostly military personnel.
        2. Make the A-bomb, which will kill tens of thousands of civilians and end the war.

        If I, say, designed a better targeting system, did I...
        1. Make a system more capable of destroying people?
        2. Make a system more capable of not hitting the wrong people?

      • Re:Already does. (Score:4, Insightful)

        by dcw3 ( 649211 ) on Thursday December 05, 2013 @02:56PM (#45610737) Journal

        So basically, no engineer could work for DoD. So, let's take that idea to the extreme. We disarm, and nobody is allowed to work on any defensive weapons, and we all sing Kum ba yah. Make sense? Yeah, I didn't think so either.

    • by Guppy06 ( 410832 )

      A medical doctor who participates in a state-sanctioned execution will still find himself in professional jeopardy at home and typically won't be allowed to practice abroad. The same is not true of engineers involved in the design of devices used in state-sanctioned executions.

    • by Mr D from 63 ( 3395377 ) on Thursday December 05, 2013 @02:51PM (#45610677)
      Engineers should be held to the same strict standards as politicians.
  • War Engines (Score:2, Interesting)

    by Anonymous Coward

    We engineers got our historical start by building WAR engines.

  • Like Radio? (Score:4, Insightful)

    by locust ( 6639 ) on Thursday December 05, 2013 @02:09PM (#45610035)

    In Rwanda the slaughter was committed using nothing more than machetes. However, the people were whipped into a fervor by people on the radio inciting them to violence. Marconi should have seen it coming. He had a responsibility.

    More seriously, to paraphrase Stephenson and others: human beings are at the top of the food chain because we are the most effective and fearsome killing machines currently known. We will find new ways to kill things, and each other, regardless of the intended purpose of a tool and how many safeguards are built into it. It is what we do and why we are where we are.

  • by stewsters ( 1406737 ) on Thursday December 05, 2013 @02:10PM (#45610047)
    Technology is a tool, and a tool can be used as a weapon. You should blame the one who wields the weapon. Do we blame Pasteur for biological warfare? I do not, but without him much of what we know about making bio-weapons would not exist.

    You can study rockets to go to the moon, but eventually someone is going to shoot them at their neighbor.

    You can study a way to get cheap energy for everyone, but eventually someone will make a bomb.

    You can create a large forum for the people that is resistant to attempts to stop you from communicating, but someone will eventually create a global spy system that watches everything you do.

    It is unfortunate, but I would place the blame not on the person who makes the technology, but on the one who decides how to use it. When we complain about doctors helping with torture, we are complaining about the ones there to extend the pain, not the ones who came up with ways to keep people alive.
    • Re: (Score:3, Insightful)

      by mounthood ( 993037 )

      It is unfortunate, but I would place the blame not on the person who makes the technology, but the one who decides how to use it.

      When we design something, we're "the one who decides how to use it"; that's part of designing it. The intentions of the designer matter, and if they're evil the designer should be blamed. Consider: if I make a torture device, can I just shrug my shoulders and say 'they decided to use it the way I designed it, so it's their fault'?

      To make it more relatable, if I make a Friendface website where it's easy to share personal info but hard to protect it, should I deserve any of the blame? Even if the users deserv

    • Chemical/biological weapons for example.

    • by CreatureComfort ( 741652 ) on Thursday December 05, 2013 @03:39PM (#45611477)
      In college, I learned that Aerospace Engineers design weapons. Civil Engineers design targets.

      /B.S.A.E.
  • They do (Score:5, Interesting)

    by phantomfive ( 622387 ) on Thursday December 05, 2013 @02:11PM (#45610067) Journal
    A lot of the engineers I've known who worked on military equipment do consider the ethical implications of their work. They feel they are helping protect our troops (see the beginning of Iron Man 1, where Tony Stark uses a similar justification), or something similar.

    Other guys I know are just happy to have a job. Some people consider it unethical to work in the corporate world at all, so just because you consider something unethical doesn't mean everyone considers it unethical. The NSA has the purpose of catching terrorists, which is a good goal. The reason we don't like them is the abuses, not their goal.
    • Re:They do (Score:5, Insightful)

      by Sarten-X ( 1102295 ) on Thursday December 05, 2013 @02:21PM (#45610231) Homepage

      I've worked on military systems before that were designed to enable killing.

      Mine were more accurate than any predecessors, used better sensors than any predecessors, and had better controls than any predecessors. Sure, it's possible to send it off to kill civilians, but if you're aiming for the bad guy, it will kill only the bad guy, and not the schoolchildren next door.

      To me, that's ethical. It'd be great if we could stop killing each other, but until that happens I'm going to do my best to keep everybody outside the conflict safe.

    • Re:They do (Score:5, Interesting)

      by Anonymous Coward on Thursday December 05, 2013 @02:23PM (#45610277)
      (posting AC due to content)

      I worked on a military project in the early part of the 2000s focused on cracking cell-phone encryption technology. At the time, it was just an interesting problem, with the potential to maybe "help fight terrorism". I didn't really think a lot about the implications of the work, I was just glad to have an interesting job.

      I've now got a bit more perspective. Maybe the technology I worked on helped us catch Osama bin Laden. Or maybe it's helping the NSA listen to American citizens. Maybe both.

      I don't think it's as simple as saying that engineers are responsible for all of the uses of the technologies they build, but I don't think we can ignore all responsibility either. Something I think about a lot these days.
    • Re:They do (Score:4, Insightful)

      by JeanCroix ( 99825 ) on Thursday December 05, 2013 @02:24PM (#45610289) Journal

      There is certainly a sort of "mercenary" ethic amongst many defense engineers. As long as there are soldiers willing to pull triggers, there will be engineers willing to design the guns. As well as simple game-theory type reasoning - "I can take the pay for this job; but if I don't, they'll find someone else who will." I get the feeling the article author doesn't know and didn't really talk with any longtime defense engineers - professors can be quite removed from that world.

      And this is to say nothing of the defense engineers who are actually gung-ho about their work.

    • And most of the money that went into AI and ML research came from military/defense sources, and I know people who switched out of AI because of it.
    • A lot of the engineers I've known who worked on military equipment do consider the ethical implications of their work. They feel they are helping protect our troops ...

      I graduated with a dual BS in Mechanical Engineering and Computer Science. At graduation time, I very much wanted to be an engineer rather than a programmer, but I also didn't want to contribute to war in any capacity; so I narrowly focused my job search on employers who were NOT in the defense sector. Nearly everyone I told about my decision gave me the very same argument as you. My self-imposed restrictions certainly made my job search harder, so I expanded my search to programming where I found a sat

  • by tomhath ( 637240 ) on Thursday December 05, 2013 @02:12PM (#45610079)
    Saying an engineer shouldn't design a better weapon is like saying a doctor shouldn't treat a wounded soldier.
    • by spiffmastercow ( 1001386 ) on Thursday December 05, 2013 @02:20PM (#45610223)

      Saying an engineer shouldn't design a better weapon is like saying a doctor shouldn't treat a wounded soldier.

      No, saying an engineer shouldn't design a better weapon is like saying a doctor shouldn't culture anthrax for the military to use as a weapon. I'm not particularly opposed to designing better weapons for the military (it will happen regardless), but it does seem engineers are held to different ethical standards than medical docs. Not necessarily better or worse standards, mind you, just standards more suited to the job they perform.

    • Saying an engineer shouldn't design a better weapon is like saying a doctor shouldn't treat a wounded soldier.

      Ethical doctors treat the wounded on both sides. Can most (military) weapon designers make the same claim?

      I see nothing unethical about designing or manufacturing weapons for defensive use, so long as you sell them indiscriminately to anyone in need of defense. Knowingly designing or manufacturing weapons for use by an aggressor, on the other hand, would make you complicit—much like selling a weapon to someone knowing that they plan to use it to rob a bank or commit a murder.

      • Please provide a working definition of aggressor that stands up beyond ethical idealism. It can't just be going off to fight on soil that isn't your own, because that makes the US wrong for stepping in to help stop Hitler. On the other end, it can't be any conceivable contrivance, like invading an oil-rich country because your cabinet member owns an oil company and he needs a little extra pressure.
  • "When doctors or nurses use their knowledge of anatomy in order to torture or conduct medical experiments on helpless subjects, we are rightly outraged. Why doesn't society seem to apply the same standards to engineers?" If the doctor uses their knowledge of anatomy to conduct medical experiments, do we blame the person that created the tool, the tool, or do we blame the doctor? When someone uses a weapon to kill someone, do we blame the person that created the weapon, the weapon, or the person using the w
    • They have no moral obligation to prevent it from being used incorrectly.

      How about weapons of mass destruction?
      And consider the asymmetric case, where 1 party has access to this technology but the other(s) do not.

      • How about weapons of mass destruction?

        As others have pointed out, the items are merely 'tools', and the application of the tool is where the morality lies.

        Could you imagine a world where we routinely use nuclear weapons to relieve stresses in the Earth's crust and prevent large earthquakes and their devastating effects? What about stopping large oil spills quickly?

        Using a nuclear weapon on an oil spill. [nytimes.com]

        Unfortunately, many helpful adaptations of large scale explosives are not being utilized due to the political implications, but even a 'weapon

  • by Flozzin ( 626330 ) on Thursday December 05, 2013 @02:19PM (#45610201)
    Biological and chemical weapons were used, and are still used; it is just that the world now decries their use -- look at WWI, for instance. So it wasn't always the case that doctors were persecuted for using their knowledge for war. With engineers creating conventional weapons, that is something accepted by the world. There is no moral outrage (on a large enough scale to matter) against a 500 lb bomb. When it comes to conventional weapons, everyone accepts the risks. We realize we need defense, so they are good. If someone uses them for offense, or evil, instead, then that person is blamed, not the engineer. Should people who create steak knives also think about the ethical implications if someone gets stabbed with their knife? What if a car is used for violence? I realize these aren't the best examples, but my point is intended use. That is what matters. If an engineer creates a single weapon that will destroy the planet, then you will have your outrage; there is no point to such a weapon. Nuclear weapons are close to that, but they have been used to save lives as well (it is estimated that over 1 million Americans would have died invading Japan, along with millions of Japanese). There is nothing wrong with stepping back and asking, "Am I morally OK with what I am creating?" But when it comes to conventional and nuclear weapons, if someone says no, there will be hundreds to take your place. Military technology also trickles down to the general public and improves their lives as well.
  • Lots of gray areas (Score:4, Insightful)

    by MrEricSir ( 398214 ) on Thursday December 05, 2013 @02:20PM (#45610227) Homepage

    Designing a missile system to kill lots of brown people on the other side of the world is not very ethically ambiguous. Thing is, there are plenty of technologies that are.

    For example, DARPA has been doing lots of research on robots. They point out how self-driving cars can save lives, robots can find and defuse bombs and rescue victims, etc. But these technologies can be used for war just as easily.

    • Designing a missile system to kill lots of brown people on the other side of the world is not very ethically ambiguous

      It's also a strawman. People don't design missile systems to kill people just because they are brown; that's not why the Iraq war happened. Either one of them.

  • Even if I could develop a morality engine and install it in every device, system, and process I've ever worked on, I don't think I would. Not only is it too complex a problem, it subverts the morals of the user and substitutes my own. And I know I don't have the far-ranging vision to appreciate the fine points of every potential future situation to evaluate them properly. It is hard enough to do that well in real time, with all or most of the facts and evidence present for examination.

    Any engineer, act

  • by david.emery ( 127135 ) on Thursday December 05, 2013 @02:31PM (#45610373)

    in the US Military-Industrial Complex for most of the last 35 years.

    If that doesn't match your ethics, that's OK.

  • Kevin Smith - Clerks on ethics in contractors [youtube.com]

    BLUE-COLLAR MAN: Excuse me. I don't mean to interrupt, but what were you talking about?
    RANDAL: The ending of Return of the Jedi.
    DANTE: My friend is trying to convince me that any contractors working on the uncompleted Death Star were innocent victims when the space station was destroyed by the rebels.
    BLUE-COLLAR MAN: Well, I'm a contractor myself. I'm a roofer... (digs into pocket and produces business card) Dunn and Reddy Home Improvements. And speaking as a

    • by neminem ( 561346 )

      Interestingly enough, according to canon this actually wasn't true. The Star Wars canon actually provides my favorite (fictional) example of an engineer who completely failed to consider what the project they were working on would be used for. There was a whole super-top-secret space lab, where a bunch of engineers worked on weapons of unimaginably massive destruction such as the Death Star... while being fed bull about how they'd be used for good (I recall the Death Star specifically was supposed to be use

    • I was just about to post this. Beautiful example of the dialogue that made Kevin Smith.

  • by WillAffleckUW ( 858324 ) on Thursday December 05, 2013 @02:39PM (#45610489) Homepage Journal

    As someone going for a PhD in Civil and Environmental Engineering, my basic response to Mr. El-Zein is ...

    Sod off.

    Now stop thinking that the world needs to fix the Middle East or care about your "problems".

    Oil is over. Nobody cares about you anymore.

  • An engineer should just look into the license.txt files that came with the particular technologies he/she used.

    If, for example, it says: "this technology SHALL NOT be used to harm people", then you should either not build weapons with it, or you should search for another technology with a more liberal license.
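
    A minimal sketch of that idea, assuming a hypothetical layout where each dependency lives under a third_party/ directory with its own license file; the keyword list and directory convention are made up for illustration, and real license review would still need a human reading the actual terms:

        # Hypothetical helper: flag dependency licenses that appear to carry
        # usage restrictions (e.g. "shall not be used to harm people", or the
        # JSON license's "shall be used for Good, not Evil").
        # The directory layout and keyword list are assumptions, not a standard.
        from pathlib import Path

        RESTRICTION_HINTS = (
            "shall not be used",
            "may not be used",
            "used for good",
            "military",
        )

        def find_restricted_licenses(deps_dir="third_party"):
            """Yield (dependency name, matching line) for license files that
            look like they restrict how the technology may be used."""
            for license_file in Path(deps_dir).glob("*/[Ll][Ii][Cc][Ee][Nn][Ss][Ee]*"):
                for line in license_file.read_text(errors="replace").splitlines():
                    lowered = line.lower()
                    if any(hint in lowered for hint in RESTRICTION_HINTS):
                        yield license_file.parent.name, line.strip()

        if __name__ == "__main__":
            for dep, clause in find_restricted_licenses():
                print(f"{dep}: {clause}")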

  • I have long thought it was time for OSS licenses to support a morality clause that does not grant a license when the software is used to extinguish life or violate people's rights. This, if applied to Linux, would prohibit the use of Linux in military applications, like that sniper rifle, as well as in a number of drones.

    I have long taken a moral exception to working for defense contractors, especially since 9-11 when we started spying on everyone and killing people with drones. However Linux/OS

  • and not just those who happen to work on today's "hot button" issues. Amazon and its massive JIT warehousing has just as much social impact as a new drone avionics package. And economists and accountants need to have and follow an ethical code too -- is developing a tax dodge that damages poor people OK in his book?
  • by Koreantoast ( 527520 ) on Thursday December 05, 2013 @02:43PM (#45610547)
    Most engineers I've met who work in defense do not wake up every morning thinking about more efficient ways to kill women and children. They wake up, believing that what they do furthers the protection of their families, fellow citizens and their homeland. Doesn't matter if the engineer is an American, Chinese, Russian, Israeli, Iranian, etc., most pretty much think that what they do is going to create a better and safer world for their loved ones. The engineers at the NSA, and I would even argue their most senior leadership, likely believe that what they do is for the benefit of the United States. I think there's plenty of room to argue whether or not their assumptions and ethical standards are correct, but to imply that they're not thinking about this at all or simply creating superweapons for sport with no care about their end uses is overly simplistic.
  • by Okian Warrior ( 537106 ) on Thursday December 05, 2013 @02:46PM (#45610599) Homepage Journal

    When doctors or nurses use their knowledge of anatomy in order to torture or conduct medical experiments on helpless subjects, we are rightly outraged. Why doesn't society seem to apply the same standards to engineers?

    When a doctor tortures a patient there is a direct cause and effect from the doctor's actions to the pain and suffering of the victim.

    When an engineer designs a weapon, he's not actually causing the pain and suffering. Once you get away from "complete responsibility", the rest is easy:

    1) If I don't do it, someone else will
    2) I need to feed myself and my family
    3) It'll only be used on the bad guys
    4) It helps protect my country
    5) It's the user's responsibility, not mine
    6) The boss thinks it's a good idea
    7) It has significant non-evil uses
    8) No one will ever know it was me

    For a concrete example, consider the Collateral Murder [youtube.com] video from a couple of years back. Who was responsible for these deaths?

    The helicopter pilots got the go-ahead from their commanders, the commanders [probably] got the go-ahead from intelligence services, the services made the correct decision based on the information they had, and the information was somehow "wrong".

    Who's to blame for the Collateral Murder incident? Because blame is deftly distributed among many players, it changes from personal responsibility into "a failure of the system" or "a tragic accident".

    For a second example, consider Bush's Iraq war: he was on TV stating that he had convincing evidence of WMDs in Iraq. A couple of years later it came out that the intelligence services had never said this and tried to convince the president of the opposite. Bush's response was: "We [the administration] didn't get the message". (Note the use of "we" in his statement.)

    Who's responsible for the war? The President says he got bad intelligence; the intelligence services say they never gave bad intelligence. It's impossible to lay the blame on anyone; it's a "failure of the system".

    But don't worry, the problem is fixed - it'll never happen again.

    (Epilogue: The Gulf oil spill was largely enabled by failures of the Minerals Management Service, which was responsible for overseeing the safety procedures of offshore drilling. The problems were largely fixed by renaming the service to the Bureau of Ocean Energy Management, Regulation and Enforcement [boemre.gov]. The problem is fixed; now we won't have any more disasters. Sorry about that...)

  • Why shouldn't I work for the N.S.A.? That's a tough one, but I'll take a shot. Say I'm working at N.S.A. Somebody puts a code on my desk, something nobody else can break. Maybe I take a shot at it and maybe I break it. And I'm real happy with myself, 'cause I did my job well. But maybe that code was the location of some rebel army in North Africa or the Middle East. Once they have that location, they bomb the village where the rebels were hiding and fifteen hundred people I never met, never had no problem w

  • I won't go into specifics, but for me a few extra dollars or potential for advancement would *not* compensate for the lifetime of guilt I'd suffer knowing something I built or contributed to was primarily designed to do harm. Likewise, I will lose respect for those in a similar position to me who willingly contribute or design those systems.

    On the other end of the scale, folks struggling to get by have my sympathy when assigned tasks like this. Food on the table and a roof over their family's head may tr
  • I Faced That Dilemma (Score:5, Interesting)

    by Carol Anne Ogdin ( 3404765 ) on Thursday December 05, 2013 @03:18PM (#45611081)
    I was brought in to a government contractor's project as consultant during the Vietnam War. They were having severe problems with building their software system, and expected me to help them identify the root causes. For two weeks, they hemmed-and-hawed, trying to keep from telling me the true purpose of the system. Finally, when it was clear they couldn't understand the root problems themselves, they briefed me on what the system was ACTUALLY intended to accomplish. They did this on a Friday.

    I was appalled that American citizens could dream up such an incredibly horrible intention: I can't say more, but the goal (in part) was to efficiently kill innocent civilians.

    My choice was clear: I packed up, went to the airport, and bought a ticket home. On Monday, I was back at my regular desk. There was simply no way my conscience would allow me to optimize the schedule and effectiveness of such a project. There was never any repercussion, from anyone. I understand the contract was cancelled for non-performance several months later.

    We who understand technology need to make value judgements: Do YOU want to write code that disadvantages fellow citizens? Do YOU want to create systems that transfer wealth from middle-class to rich folks? Do YOU want to write code that has secrets that could harm someone in the future buried inside? Do YOU want to make money by cheating ordinary citizens (think High-frequency "trading")? Do YOU want to see more systems, like NSA's, that violate the constitution, the law, and common decency?

    I didn't, and I don't. Stand up for what you believe.
  • by EMG at MU ( 1194965 ) on Thursday December 05, 2013 @03:20PM (#45611103)
    Does a teller at JP Morgan think about the ethical implications of their work?
    Does a cashier at WalMart think about the ethical implications of their work?

    In both examples, a person who is in no way responsible for the overall direction of the company is facilitating the daily operations of a company that will commit unethical acts.

    We can't just ask engineers to sacrifice their careers because some whiny journalist/engineer is having a moral crisis. Our entire society is unethical. We buy clothes that were made by people working for less than $100 a month, living in concrete rooms smaller than most jail cells, who were forced into that labor by their parents. We shop at companies that are lobbying to suppress workers' rights. We use electronics made by children and by people who would rather kill themselves than continue working at Foxconn.

    Get off of your fucking high horse and stop acting like an engineer is any different from a banker or a CEO or a cashier. We are all players in the same fucked-up game.
  • by cfalcon ( 779563 ) on Thursday December 05, 2013 @04:38PM (#45612233)

    The big difference with medical knowledge being used to torture versus, say, the design of a weapon, is that weapons have moral uses. Defense is inherently moral, and technology makes it safer and better. Technology used to kill is the same as a sword used to kill, is the same as a rock -- if the man behind the tool is working towards good, defending his nation / family / self, then the action, and the weapon, is moral.

    Torture, on the other hand, is always wrong. But that doesn't make scalpels evil, or handguns, or rockets.

  • And then... (Score:5, Interesting)

    by Jiro ( 131519 ) on Thursday December 05, 2013 @04:46PM (#45612315)

    You end up with the engineering equivalent of pharmacists refusing to sell contraceptives because they think that contraceptives are immoral.

    It's amazing how everyone who says "so-and-so profession must consider the ethical implications of their work" always imagines that the person considering the ethics happens to be considering ethics that they agree with. They never think that the person might consider contraceptives, abortion, gay sex, miscegenation, etc. to be unethical.

    We *want* apartment owners to say "If you use that apartment for sodomy, I am not responsible just because I rented you the apartment you used to do it in."

  • by Overzeetop ( 214511 ) on Thursday December 05, 2013 @04:54PM (#45612411) Journal

    ...who cares vere zay come down.
    Zat's not my department,"
    says Werner Von Braun.

  • by EmperorOfCanada ( 1332175 ) on Thursday December 05, 2013 @05:12PM (#45612635)
    I suspect that another key problem with ethics is that many evil things are more fun to do, think about, and tell people about than the boring things. If you tell people that you are an HVAC engineer they will think, "BORING!!!" Yet the reality is that you make many people's lives more comfortable and, with good designs, save energy and create healthy environments. But if you tell people you are the inventor of the Hellfire missile or build nuclear bombs, then they will go, "oooooooh"

    I'm not sure there is any limit to the "coolness" of what is effectively sociopathic behavior. If you tell tech people that you are building a military robot that is designed to hunt and then jump onto the faces of the Taliban (Alien-style) and stuff a GPS tracker/grenade down their throat that forces them to surrender or be blown up from the inside, then you will make headline news. If you develop a way to make cheap home wiring that conducts better than silver, you are back to boring.

    The above evil will get a few people to gasp in horror but most people will want to know more.

    Now normally the defense industry goes through spasms of peace and the engineers face huge layoffs. But this time around the US is effectively running a War on Fear, which will pretty well never end. So if you can invent a tool for annihilating the boogeyman, then you will remain solidly employed. If you are inventing something solar that reduces fossil fuel use, then your employment will be fitful at best.
