In the aftermath of a rash of killings captured on smartphones, and of mass shootings of civilians and police officers, debate has focused on assigning blame. Smartphone videos provide some of the evidence fueling debate over who should be held responsible for the killing of a civilian or police officer. These discussions most often move into broader debates over major societal issues, such as institutional racism or mental healthcare, and policy issues, such as gun control. All of these debates can be valuable, often constructive, and must take place. However, I seldom, if ever, hear discussion of the procedural problems that led to what might be called an ‘information’ or ‘procedural’ disaster – that is, the misinformation, lack of information, or flawed practices that enabled the disaster (the killing) to unfold as it did.
Think back to airline hijackings. These can be viewed broadly, around issues of international relations and terrorism, but the analysis of these events can also focus on procedures at airports and on airlines that minimize the potential for a hijacking to take place. The changes in the information gathered, and in the procedures at airports and on planes, after hijacking episodes and after 9/11 are well known, and arguably have had a cumulative impact on reducing risks. But I don’t hear analogous discussions of mass shootings and other killings, even when there is video evidence, however limited, and in some cases many eyewitnesses. Perhaps the analysis of procedures is going on behind the scenes, unbeknownst to me.
This comes to mind because of earlier research I explored around what we called ‘information disasters’.* We originally defined these disasters around the use of information technologies and telecommunications, such as when the USS Vincennes shot down the civilian Iran Air Flight 655 as it ascended over the Persian Gulf on 3 July 1988, mistaking it for an Iranian F-14 fighter descending towards the ship.
What most impressed me about the study of such disasters was the meticulous investigation of the events that led to each one. These studies often yielded lessons, such as how practices or procedures could be changed.
This kind of study is not new. Our discussions often referred back to a long history of efforts to investigate train accidents. Every train wreck, for example, is examined in great detail to determine what procedures, technical changes, or training could be implemented to avoid a similar disaster, not only in the same location but system-wide. Train wrecks still occur, often with horrific consequences, but each incident can lead to changes that make the next one less likely to occur.
It might well be possible to study the unique circumstances surrounding each killing or mass shooting with a greater focus on lessons learned about obtaining better and more timely information, or about instituting new procedures or practices that would prevent a repeat of the sequence of events that led to a particular disaster. One thing we learned from our review of a number of well-known information disasters was that they usually entailed many things going wrong. This does not mean that solutions are hopeless. To the contrary, if even some of those problems had been fixed, many of these disasters might not have occurred.
I would certainly encourage more discussion of these issues, as it might prove more successful than focusing on bigger, longer-term changes in society. Apologies if this is blindingly obvious, but I am not seeing the discussion that should be taking place.
Dutton, W. H., MacKenzie, D., Shapiro, S., and Peltu, M. (1995), Computer Power and Human Limits: Learning from IT and Telecommunication Disasters. Policy Research Paper No. 33. Uxbridge: PICT, Brunel University.
Peltu, M., MacKenzie, D., Shapiro, S., and Dutton, W. H. (1996), ‘Computer Power and Human Limits’, pp. 177–95 in Dutton, W. H. (ed), Information and Communication Technologies – Visions and Realities. Oxford: Oxford University Press.