Engaging Academia in Cybersecurity Research

Across most academic fields, researchers are increasingly focused on outreach to relevant practitioner and policy communities. Such engagement can sharpen their sense of the key questions and enable their research to have greater application and impact. In contrast, within the field of cybersecurity, policy makers and practitioners from government, non-governmental organizations (NGOs) such as the World Bank, and business and industry are more dominant in the production of research, while academic researchers play a relatively less active role. Yet research on cybersecurity could be greatly enhanced if a larger and more multidisciplinary collection of academic researchers could be engaged to focus on issues of cybersecurity and build collaborative relationships with the policy and practitioner communities. 

Why is this the case, and what could be done to correct it? 


The Dynamics Limiting Academia’s Role in Cybersecurity

I am but one of a growing set of multidisciplinary researchers with a focus on cybersecurity. The field is clearly engaging some top researchers and scholars from a variety of fields, evidenced by colleagues and centers at prominent universities, a growing number of journals and publications, and a dizzying number of events and conferences on topics within the field. Stellar academics, such as Professor David Clark at MIT, Professor Sadie Creese at Oxford University, and Bruce Schneier, a Fellow at the Berkman Center at Harvard, are strong examples. I would add Gabriella Coleman, a chaired professor at McGill University, and Professor Patrick Burkart at Texas A&M, to the list, even though they might not identify themselves as cybersecurity researchers. Many others could be added.  

Nevertheless, compared with other fields, cybersecurity research appears to be dominated more by the practitioner and policy communities. Cybersecurity is not a discipline but a multidisciplinary field of study. Yet it remains less multidisciplinary and more anchored within the computer sciences than some related fields, such as Internet studies, one comparator with which I am familiar. A number of possible explanations for the different multidisciplinary balance of this field come to mind. 

First, it is a relatively new field of academic research. It was preceded by studies of computer security, which were more computer science centric, as they focused on technical advances in security systems. The development of shared computing systems, and the Internet in particular, greatly expanded the range of users and devices linked to computer systems, reaching over 4 billion Internet users in 2020. In many respects, the Internet drove the transition from computer security to cybersecurity research, and the field is therefore understandably young in relation to other academic fields of study. 

Secondly, the concept of cybersecurity carries some of the baggage of its early stages. While the characterisations evoked by concepts are often crude, the term often conjures up images of men in suits employed by large institutions trying to keep young boys out of their systems. My MSU colleague, Ruth Shillair, reminded me of the 1983 movie War Games, which centers on a young hacker getting into the backdoor of a major military computer system in ways that threatened to launch a world war, but which left the audience cheering for the young hacker.

Today, big mainframe computers are less central than the billions of devices in households, businesses, industries, and governments across the world. Malicious users, rather than a child accidentally entering the backdoor of a military complex, are the norm. Yet cybersecurity carries some of this off-putting imagery from its early days into the present. 

Thirdly, it is an incredibly important field of research for which there is great demand. Many rising academics in the field of cybersecurity are snapped up for lucrative positions by headhunters from business, industry, and government rather than staying in academia. 

These are only a few of many reasons for the relative lack of a stronger multidisciplinary research community. Initiatives that enhance the field's multidisciplinary make-up could also bring more academics, as well as more academic disciplines, into the study of cybersecurity. How could this be changed?

What Needs to Be Done?

First, academics involved with research on cybersecurity need to do more to network among themselves. This is somewhat of a chicken-and-egg problem: when there are relatively few academics in a field, it seems less important to network with each other. Yet until these researchers come together to better define the field and its priorities for research, it will be harder for the field to flourish. Similarly, there are so many pulls to work with practitioners and the policy communities in this area that academic collaboration may seem like a distraction. It is not; it is essential for cybersecurity to mature as an academic field of study. 

Secondly, the field needs to identify and promote academic research on cybersecurity that addresses big questions with major implications for policy and practice. On this point, some of the research at Oxford’s Global Cyber Security Capacity Centre (GCSCC) has made a difference for nations across the world. For example, the research demonstrates that nations that have enhanced their cybersecurity capacity building efforts have seen serious improvements in the experiences of their Internet users.[1] And this is only one of many examples of work that is meeting needs in this new area of technological and organizational advances. 

Thirdly, national governments need to place a greater priority on building this field of academia along with building their own cybersecurity capacities. Arguably, in the long run, a stronger academic field in cybersecurity will help nations advance cybersecurity capacity, such as by creating a larger pool of expertise and thought leadership in this area. 

This would be possible through a number of initiatives, from simply taking a leadership role in identifying the importance of the field to encouraging public research councils and other funding bodies to develop grant support for multidisciplinary research on cybersecurity.

For example, the UK’s Economic and Social Research Council (ESRC) generated early funding for what became the Programme on Information and Communication Technologies (PICT). The establishment of PICT helped to draw leading researchers, such as the late Roger Silverstone, into the study of the social aspects of information and communication technologies. Such pump-priming helped put the UK in an early strategic international position in research on the societal aspects of the Internet and related digital media. 

What factors are constraining the more rapid and widespread development of this field? What could be done to accelerate and deepen its development?

There are a host of other issues around whether policy makers and practitioners would value collaboration with academics, given that their time scales and methodologies can be so dramatically different.[2] That is for another blog, but in the interim, I’d value your thoughts on whether you agree on the need and approaches to further develop the multidisciplinary study of cybersecurity within academia.

Notes


[1] See: Creese, S., Shillair, R., Bada, M., Reisdorf, B.C., Roberts, T., and Dutton, W. H. (2019), ‘The Cybersecurity Capacity of Nations’, pp. 165-179 in Graham, M., and Dutton, W. H. (eds), Society and the Internet: How Networks of Information and Communication are Changing our Lives, 2nd Edition. Oxford: Oxford University Press.

[2] My thanks to Caroline Weisser Harris for suggesting a focus on this question of why practitioners and policy makers might or might not value collaboration with academia.