MSc-IT Study Material
June 2010 Edition

Computer Science Department, University of Cape Town

Diffusion of Responsibility

When creating a computer system, responsibility for its behaviour is often diffused across many parties, as the following examples illustrate:

Example: THERAC-25

THERAC-25 was a computer-controlled machine that delivered radiation treatments to patients. Several patients received massive overdoses, resulting in at least three fatalities. It was difficult to establish who was responsible, but the manufacturer was held liable and paid compensation to the victims' families. Eventually it was found that the operators' actions had triggered the accidents. However, it was also found that if an operator entered an incorrect mode, noticed this, and corrected the error within the eight seconds specified, the system still malfunctioned and the patient was killed as a result. Thus, while the operators caused the accidents, they were not to blame.

The error was traced back to a previous version of the software, where it had caused no problems because that version had only one mode. Was the designer at fault, or the tester (whose job is to find errors and non-conformities to the requirements)? The Therac case also illustrates the difficulty of testing real-time systems. Should the system have included a feature limiting the radiation dose to some maximum, irrespective of everything else? If so, should the clients have specified this? Are they also partially to blame?
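The actual Therac-25 flaw was a race condition in hand-written PDP-11 assembly; the sketch below is a hypothetical simplification (all class and method names are invented for illustration) of the check-then-act pattern described above: the machine latches its parameters as soon as the operator's entry begins, so a correction typed within the eight-second setup window changes what the screen shows but not what the machine delivers.

```python
SETUP_WINDOW = 8.0  # assumed: seconds the machine takes to configure the beam

class FlawedTreatmentConsole:
    """Hypothetical model of the check-then-act flaw, not the real code."""

    def __init__(self):
        self.entered_mode = None  # what the operator's screen shows
        self.latched_mode = None  # what the hardware will actually use
        self.entry_time = None

    def enter_mode(self, mode, t):
        """Operator types a mode at time t; setup starts immediately."""
        self.entered_mode = mode
        self.entry_time = t
        self.latched_mode = mode   # bug: latched before edits have settled

    def correct_mode(self, mode, t):
        """Operator notices the mistake and retypes the mode at time t."""
        self.entered_mode = mode
        if t - self.entry_time >= SETUP_WINDOW:
            self.latched_mode = mode  # only edits after the window take effect

    def fire(self):
        """The beam fires with whatever mode the hardware latched."""
        return self.latched_mode

console = FlawedTreatmentConsole()
console.enter_mode("X-RAY", t=0.0)       # wrong mode entered by mistake
console.correct_mode("ELECTRON", t=5.0)  # corrected within eight seconds
print(console.entered_mode)  # ELECTRON: what the operator sees
print(console.fire())        # X-RAY: what the machine delivers
```

The point of the sketch is that the operator did everything the interface appeared to allow, yet the display and the hardware disagree, which is why blame and causal responsibility come apart in this case.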

Example: ISP Responsibility

Recall the scenario in which a service provider supplies Milo with chat-room and forum facilities. Milo is defamed by another user and has a right of recourse. The obvious route is to hold the individual responsible; the problem is that the individual is anonymous.

Stratton Oakmont versus Prodigy (1995)

Prodigy was sued over content posted in its online forums, over which Prodigy had advertised that it exercised editorial control. The court applied the law governing newspapers and found Prodigy liable. Prodigy, however, argued that it was more like a telephone company than a newspaper. The following year, the US Communications Decency Act reversed this position on the basis that ISPs are closer to telephone companies. The Act also introduced "Good Samaritan" immunity, under which a provider can exercise editorial control without incurring liability, and is encouraged to do so.

Example: Virtual Action

Consider the first scenario involving rape in cyberspace. Bungle raped no real person, but he did upset the other participants. Moreover, he abused a new form of expression to expose other people to pornography and violence without their consent. He violated an unwritten rule of the MUD game, rather like stealing money in a game of Monopoly.

It is tempting to think that because it is a game, his behaviour is not real. This thinking may be appropriate when considering flight simulators, but when the actions involve real people, one should consider:

  • The consequences (financial, physical and psychological) for others.

  • That one's actions have effects on others.

  • That those actions can be mediated by technology.

  • That virtual environments are not morally neutral – one's behaviour in a virtual environment is real.

In effect, Bungle:

  • Violated his role responsibility as a participant in the game.

  • Is causally responsible for other participants being exposed to violence, etc.

The issues of blameworthiness and liability are complex. Bungle broke the rules of the environment without seeking anyone's consent. While virtual environments allow people to do real and useful things, for example in education and medicine, they can also create complex questions of responsibility, as the virtual-rape scenario shows. Consider a medical doctor performing surgery via remote access: midway through a complex procedure, the link is lost and the patient dies. Who should be held responsible? How would you begin investigating such an issue? The key is to start by identifying the role players and their responsibilities, and then to consider blameworthiness and liability.