
Here is the central part of Florman's argument from analogy, quoted from his article "Moral Blueprints" (Harper's, October 1978, pp. 30-33):

If each person is entitled to medical care and legal representation, is it not equally important that each legitimate business entity, government agency, and citizens' group should have access to expert engineering advice? If so, then it follows that engineers (within the limits of conscience) will sometimes labor on behalf of causes in which they do not believe. Such a tolerant view also makes it easier for engineers to make a living.

What do you think Florman means by "within the limits of conscience"?

Nathaniel Borenstein, a widely respected expert on intelligent systems, found himself in just this kind of situation. A committed pacifist, he assiduously avoided getting involved in military projects, even when asked repeatedly by representatives of the military. But something said to him by one of these military representatives led him to reassess his position. Borenstein was asked to develop a training simulation to teach individuals how to work with the nuclear missile launching system. When he found that it involved "embedded training," he became very concerned. To appreciate the full extent of his concern and the reasons that persuaded him to get involved in this project, it is best to turn to his own words:

Borenstein on embedded training

Embedded training, in particular, struck me as a very poor idea. Training by computer simulation has been around for a long time. Embedded training takes this one step further: it does the simulation and training on the actual command and control computer. To exaggerate slightly, whether or not anyone actually dies when you press the "launch missiles" button depends on whether or not there is a little line at the top of the screen that says "SIMULATION."

Borenstein continues

Such a system seems almost designed to promote an accidental nuclear war, and this thought was what persuaded me to attend the workshop in the first place. One can all too easily imagine human error--"I could have sworn it was in the 'simulation' mode"--as well as frightening technical possibilities. Perhaps, due to some minor programming bug, the word "SIMULATION" might fail to disappear when it was supposed to. Someone approaching the computer would get the wrong idea of what it was safe to type.

These quotes are taken from Nathaniel S. Borenstein, "My Life as a NATO Collaborator," Bulletin of the Atomic Scientists, April 1989: 13-20.
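
To make the hazard Borenstein describes concrete, here is a brief, purely illustrative sketch in Python. It is not taken from any real system; the class and method names (EmbeddedTrainingConsole, switch_to_live, press_launch_button) are hypothetical. It simply shows how, when training runs on the operational computer, a single missed display update can leave the operator reading "SIMULATION" while the launch path is live.

```python
# Hypothetical sketch (not from any real system) of the single point of
# failure Borenstein describes: the operator's only cue that the console
# is in training mode is a status line maintained by the same software
# that controls the launch path.

class EmbeddedTrainingConsole:
    def __init__(self):
        self.simulation_mode = True       # real internal state
        self.status_line = "SIMULATION"   # what the operator reads

    def switch_to_live(self, update_display=True):
        """Leave training mode. The update_display flag stands in for the
        'minor programming bug' in which the SIMULATION banner fails to
        disappear when it should."""
        self.simulation_mode = False
        if update_display:
            self.status_line = ""

    def press_launch_button(self):
        if self.simulation_mode:
            return "Training event logged; nothing launched."
        return "LIVE LAUNCH SEQUENCE INITIATED."


console = EmbeddedTrainingConsole()
console.switch_to_live(update_display=False)  # the banner never clears
print(console.status_line)                    # still reads SIMULATION
print(console.press_launch_button())          # but the launch is real
```

The point of the sketch is not the code itself but the design: the safety of the whole arrangement rests on one line of text staying synchronized with the system's true mode, which is exactly the fragility that alarmed Borenstein.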

A thought exercise

  • Think of Borenstein's concerns and eventual actions in light of Florman's analogy.
  • Does Borenstein have the obligation to set aside his pacifism to work on correcting this training problem?
  • Does Florman's analogy provide the justification for this? Or is Borenstein acting on the basis of a very different set of arguments?
  • Assume that you are a committed pacifist. Was Borenstein right to set aside his beliefs to work on this project? Did he really set aside his beliefs?

Source: OpenStax, Civis project - uprm. OpenStax CNX. Nov 20, 2013. Download for free at http://cnx.org/content/col11359/1.4