Learning from Disasters: 30 Years After the USS Vincennes

Thirty years ago, on 3 July 1988, the USS Vincennes shot down an ascending domestic airliner, Iran Air Flight 655, mistaking it for a military aircraft descending toward the ship. My colleagues and I grouped this information disaster with a number of others in the hope of learning lessons from such incidents. Time has passed since our 1995 paper and the forum on which it was based, but I call attention to it again because it illustrates the need to study and learn from mistakes. Our analysis of these information disasters, entitled Computer Power and Human Limits: Learning from IT and Telecommunication Disasters, is available online at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3103433

A revised and published version of this paper is also available: Peltu, M., MacKenzie, D., Shapiro, S., and Dutton, W. H. (1996), ‘Computer Power and Human Limits’, pp. 177-95 in Dutton, W. H. (ed.), Information and Communication Technologies – Visions and Realities, Oxford: Oxford University Press.

Killings Can Be Information (or Procedural) Disasters

In the aftermath of a rash of murders captured on smartphones, and of mass shootings of civilians and police officers, debate has focused on assigning blame. Smartphone videos provide some of the evidence fueling debate over who should be held responsible for the killing of a civilian or police officer. These discussions most often move into broader debates over major societal issues, such as institutional racism or mental healthcare, and policy issues, such as gun control. All of these debates can be valuable and often constructive, and they must take place. However, I seldom, if ever, hear discussion of the procedural problems that led to what might be called an ‘information’ or ‘procedural’ disaster – that is, the misinformation, lack of information, or practices that enabled the disaster (the killing) to unfold as it did.

Think back to airline hijackings. These can be viewed broadly, around issues of international relations and terrorism, but the analysis of such events can also focus on the procedures at airports and on airlines that minimize the potential for a hijacking to take place. The changes in the information gathered, and in the procedures at airports and on planes, following earlier hijackings and 9/11 are well known, and they have arguably had a cumulative impact on reducing risks. But I do not hear analogous discussions of mass shootings and other killings, even when there is video evidence, however limited, and in some cases many eyewitnesses. Perhaps the analysis of procedures is going on behind the scenes, but unbeknownst to me.

This comes to mind because of earlier research I explored around what we called ‘information disasters’.* We originally defined these disasters around the use of information technologies and telecommunications, such as when the USS Vincennes shot down the domestic Iran Air Flight 655 as it ascended over the Persian Gulf on 3 July 1988, mistaking it for an Iranian F-14 fighter descending towards the ship.

What most impressed me about the study of such disasters was the meticulous investigation of the unfolding events that led to each one. These studies often yielded lessons that could be learned, such as practices or procedures that could be changed.

This kind of study is not new. Our discussions often referred back to a long history of efforts to investigate accidents involving trains. Every train wreck, for example, is examined in great detail to determine what procedures, technical changes, or training could be implemented to avoid a similar disaster, not only in the same location but system-wide. Train wrecks still occur, often with horrific consequences, but each incident can lead to changes that make the next incident less likely to occur.

http://abcnews.go.com/International/dead-dozens-injured-head-train-collision-italy/story?id=40509997

It might well be possible to study the particular circumstances surrounding each killing or mass shooting with a greater focus on the lessons to be learned about obtaining better and more timely information, or about instituting new procedures or practices, that would prevent a repeat of the sequence of events that led to a particular disaster. One thing we learned from our review of a number of well-known information disasters was that they usually entailed many things going wrong. This does not mean that solutions are hopeless. To the contrary, had some of those problems been fixed, many of these disasters might not have occurred.

I would certainly encourage more discussion of these issues, as it might prove more successful than focusing only on bigger and more long-term changes in society. Apologies if this is blindingly obvious, but I am not seeing the discussion that should be taking place.

*References

Dutton, W. H., MacKenzie, D., Shapiro, S., and Peltu, M. (1995), Computer Power and Human Limits: Learning from IT and Telecommunication Disasters. Policy Research Paper No. 33. Uxbridge: PICT, Brunel University.

Peltu, M., MacKenzie, D., Shapiro, S., and Dutton, W. H. (1996), ‘Computer Power and Human Limits’, pp. 177-95 in Dutton, W. H. (ed.), Information and Communication Technologies – Visions and Realities, Oxford: Oxford University Press.

Escalators to Disasters: Lessons from the Flint Water Crisis?

Over twenty years ago, my colleagues and I organized a forum on disasters related to information and communication technologies (ICT), in which lives may have depended on the safe and effective operation of computer systems (Dutton et al 1994; Peltu et al 1996). As I, along with many others, try to sort through the facts and the timeline of decisions leading to the Flint Water Crisis, I am reminded of one powerful lesson learned from our study of ICT disasters.

Namely, organizations and their leadership have a tendency to get on a decision-making escalator. In making a decision, such as shifting the source of water to the Flint River, an organization is in some ways stepping onto a metaphorical escalator. The further it rides the escalator, the more difficult the psychological and practical problems of jumping off become, even when it comes to see the decision as a bad one. This is not an excuse for persisting on the wrong track, but it is a lesson that might be exemplified by the Flint Water Crisis.

Of course, another theme that emerged from our discussions of ICT disasters was the degree to which each disaster we studied was over-determined. That is, there was seldom one specific, determining reason behind any particular disaster. Most often, disasters occurred as the result of a large number of mistakes, including failures to follow good practice and use common sense, such as trialing a change before going live.

I agree with many who argue for a focus on solving the water problems in Flint, rather than dwelling on the reasons for the disaster. However, the study of such crises, and of failures to get off the escalator, can help avoid similar disasters in other cities across the US that are also financially stressed and dependent on old infrastructure.

[Image: Demonstration on Flint water contamination]

References

Dutton, W., MacKenzie, D., Shapiro, S., and Peltu, M. (1994), Computer Power and Human Limits: Learning from IT and Telecommunication Disasters, PICT Policy Research Paper 33, Uxbridge, UK: Brunel University, Programme on Information and Communication Technologies.

Peltu, M., MacKenzie, D., Shapiro, S., and Dutton, W. H. (1996), ‘Computer Power and Human Limits’, pp. 177-95 in Dutton, W. H. (ed.), Information and Communication Technologies – Visions and Realities, Oxford: Oxford University Press.

Malaysian Airlines MH17: Studies of Information Disasters

Evidence is only beginning to develop about what led to the disaster that beset Malaysian Airlines Flight MH17 over eastern Ukraine. However, it is likely to be compared with other military and large technical system disasters, such as when the USS Vincennes accidentally shot down the domestic airliner Iran Air Flight 655 on 3 July 1988. These have been called ‘information disasters’ by my colleagues and me, who have looked at studies of this and other related cases. See our chapter: Peltu, M., MacKenzie, D., Shapiro, S., and Dutton, W. H. (1996), ‘Computer Power and Human Limits’, pp. 177-95 in Dutton, W. H. (ed.), Information and Communication Technologies – Visions and Realities, Oxford: Oxford University Press. Specific treatment of the USS Vincennes is provided by Rochlin, G. (1991), ‘Iran Air Flight 655 and the USS Vincennes: Complex, Large-Scale Military Systems and the Failure of Control’, pp. 99-125 in La Porte, T. (ed.), Social Responses to Large Technical Systems, Dordrecht: Kluwer Academic Publishers.

In the case of MH17, there is mounting evidence that it was shot down by mistake: a civilian airliner was not the intended target. However, debate is intense over who shot the plane down and who supplied the weapons. Needless to say, the analysis of such cases often deals with more than the specific mistake at the heart of the information disaster, such as, in the earlier case, why the domestic Iran Air Flight 655 came to be perceived as a military aircraft descending toward the USS Vincennes when it was actually climbing. In this respect, such studies do not always deal adequately with the broader political and military issues over responsibility. These broader questions have been the primary and immediate focus of debate over MH17. Rather than trying to understand why MH17 was shot down, people worldwide are wondering who was responsible for putting particular weapons into the hands of the Russian separatists widely suspected of firing the missile that took down MH17.* But academics can and should devote their own talents to seeing whether lessons can be learned from such disasters at any level of analysis.

*See the Financial Times: http://www.ft.com/cms/s/0/a1dcc628-1010-11e4-90c7-00144feabdc0.html#axzz386tsBcsR