News from
PRINCETON UNIVERSITY
Communications and Publications
Stanhope Hall
Princeton, New Jersey 08544-5264
TEL 609/258-3600 FAX 609/258-1301

Release: April 7, 1995
Contact: Tom Krattenmaker (609/258-5748)


Computer Ethics: Princeton's
Helen Nissenbaum Is Helping
Develop Emerging Field

Editors: Helen Nissenbaum, a teacher and writer in the field of
computer ethics, could prove a valuable source for stories on
computing issues and controversies. Contact Princeton's
Communications Office at the above number for an interview or more
information. The following is an article on Nissenbaum that
appears in the Princeton Weekly Bulletin, the university's newspaper.

PRINCETON, N.J. -- A computer controls a piece of medical
equipment administering doses of radiation to patients. Something
goes wrong, and six patients are subjected to massive overdoses.
Three are injured. Three die.

Is it the computer's fault? Who or what is responsible?

Those are among the questions that currently engage Helen
Nissenbaum, a computer ethicist who is associate director of
Princeton University's Center for Human Values. Nissenbaum and
Deborah Johnson of Rensselaer Polytechnic Institute have coedited
a new book, _Computers, Ethics and Social Values_, which pulls
together articles by dozens of scholars and ethicists in a first-
of-its-kind reader for computer ethics courses.

Society, Nissenbaum says, has not gotten around to resolving many
of the moral, legal and ethical questions raised by the rapid
entry of computing into most walks of human life. Accountability
is a prime case in point. ``As a society, we value
accountability,'' Nissenbaum says. ``We want to know who is
responsible when something goes wrong and someone or something is
damaged. Computing, however, can erode accountability in a number
of different ways.''

Take the example of the medical equipment, a true story from the
mid-1980s that Nissenbaum recounts in one of the two articles she
contributed to the new book. The machine -- the Therac-25,
manufactured by Atomic Energy of Canada Ltd. -- had numerous
faults, including at least two significant software coding errors.

It may be tempting to assign responsibility to the people who
wrote the defective software, but that is probably too simple an
answer, Nissenbaum says. What about the machine's designers, who
included inadequate safety checks to protect patients in the event
of computer error? What about hospital technicians who operated
the equipment, apparently oblivious to the damage being inflicted
on patients? What about the hospital's and manufacturer's
executives, those ultimately responsible for their institutions'
procedures, policies and priorities?

``Because no individual was both an obvious causal antecedent and
decision-maker, it was difficult, at least on the face of it, to
identify who should have stepped forward and assumed
responsibility,'' Nissenbaum writes in ``Computing and
Accountability.'' ``We should not, however, confuse the obscuring
of accountability ... with the absence of blameworthiness.''

Computing has peculiarities that make accountability issues
especially difficult, Nissenbaum notes. Besides the fact that many
hands are usually involved in the development of equipment and
software, there is the general acknowledgment and even acceptance
of programming errors, or ``bugs,'' as unavoidable and forgivable.
There is probably no other industry whose products come with
literature announcing the possibility of defects and disclaiming
any responsibility for the harm they cause.

Also setting computers apart from other technologies is their
capacity to make decisions. ``The computer is making decisions
that humans would otherwise make and for which humans would be held
accountable,'' Nissenbaum says. ``So if something goes wrong, who
is accountable? Unless you figure out some way of reallocating
responsibility, we may find ourselves with a vacuum.''

Episodes like the Therac-25 failure have not always taught us as
much as they could, or as much as we might like, Nissenbaum notes.
Because software developers prefer not to publicize problems with
their products, many lawsuits involving faulty computing are
settled out of court. Society thus loses an opportunity to explore
these important issues of accountability. (Fortunately for
patients treated with the Therac-25, the design and software
glitches were corrected.)

Recent headlines have furnished more examples of computing gone
awry, with serious consequences for society. In March, Intuit Inc.
revealed that its MacInTax tax preparation software provided
information that would enable users with modems to access other
people's electronic tax returns, stored in a central computer
system. As a result, thousands of users' returns were exposed to
unauthorized viewing, tampering or deletion.

That revelation came barely a month after the disclosure of bugs
in MacInTax and Intuit's other tax preparation software, TurboTax.
The company acknowledged that the bugs could cause erroneous tax
returns. Nissenbaum praises Intuit for announcing the MacInTax
flaw, but she says the true test could come later. ``How much
responsibility will they be willing to take if it's discovered
that serious damage was done or that there were widespread
violations of confidentiality?'' she wonders.

How should responsibility be assigned in these and other cases of
faulty computing? Nissenbaum says it is not the role of the
ethicist to provide pat answers, but rather to pose the question
clearly and to draw on a combination of ethical theory and matters
of fact to begin to answer it. This merging of theory with real-
world cases has become her forte in the decade since she began
teaching and writing in the field.

After earning a PhD in philosophy at Stanford in 1983, Nissenbaum
helped develop and teach a computer ethics course in the late '80s
in Stanford's symbolic systems program, which took an
interdisciplinary approach to computer science, philosophy and
linguistics. At Princeton since 1991, she first taught that course
here last year.

_Computers, Ethics and Social Values_ addresses a persistent
problem Nissenbaum encountered each time she prepared to
teach the course. Working in a field too immature to have produced
any textbooks, she found herself chasing down sources in numerous
disciplines to compile reading lists for her students. ``One has
to look all over for relevant material -- in sociology, law,
philosophy and the professional computing literature,'' she
says. ``Each time I taught the course, I found myself putting
together a huge collection. So we decided to publish this book.''