In the preface to his book Politics (1936: v), Harold D. Lasswell famously defined politics as the study of ‘who gets what, when, and how’ – also the subtitle of the book. He went on to argue that influence is central to politics and that “[c]oncepts for the study of influence must be changed or invented when influence is sought by novel means or under changed circumstances.”
The second world war was an occasion for Lasswell, the since legendary American political scientist and communication scholar, and other leading sociologists and political scientists to focus their research on the study of influence, initially in response to the Nazi Party’s then novel use of radio as a central medium for propaganda. In the twenty-first century, the internet, social media, and related digital technologies have become channels for new approaches to influence, such as the use of computational propaganda (Woolley and Howard 2018). Since the Russian invasion of Ukraine, Russia’s military efforts have been accompanied by concerted efforts to exploit all media – new and old – to support its influence operations through war propaganda. Have these Russian influence operations changed the conditions surrounding the reception of information and communication, and its influence, worldwide?
In many ways, the propaganda of the second world war provides insights into Russian propaganda about its 2022 invasion. There are many points of continuity and change from the 1930s to the 2020s, which I will discuss in a separate piece on propaganda. One remarkable change, however, seems to be a move away from changing the agendas, attitudes, and opinions of the public and toward changing the public’s beliefs – beliefs about the invasion and the roles of different actors. Russian propagandists seem to be focusing on shaping what is perceived to be the facts and assumptions underpinning the case – the truth. A NATO (2023) report refers to this focus as ‘cognitive warfare’, which the authors define as “activities conducted in synchronization with other instruments of power, to affect attitudes and behaviours by influencing, protecting, and/or disrupting individual and group cognitions to gain an advantage”.
NATO’s notion of cognitive warfare captures what may be a subtle yet transformative shift in the information and communication strategies followed by key actors to influence other actors, from a nation’s citizens to the public and elites of other nations. For example, rather than develop the case for or against its invasion, the government of Russia sought to define its actions not as an invasion or a war but as a ‘special military operation’ (SMO). More than a year into the invasion, as Russia continued to promote this SMO narrative, drone strikes on Moscow brought the facts of war home to the city’s residents, in ways similar to social media documentation of war crimes. These are part of the dynamics of cognitive politics or warfare.
This shift toward shaping cognitions resonates not only with many of the Kremlin leadership’s communication strategies around its SMO but also with discussions of the ways in which communication in toxic domestic political settings across the world appears increasingly to challenge the truth from a variety of perspectives. As this pattern – if valid – seems to apply to some cases of domestic as well as international politics, and to war propaganda, it might be useful to refer to it more generally as ‘cognitive politics’. The war propaganda surrounding the Russian invasion might well provide an extreme case study of how this can be done and with what consequences.
Cognitions as a Target versus an Explanation of Attitude Change
Over decades, sociological, psychological, and neuroscience models have provided explanations for why people adopt and change their attitudes or opinions about politicians, political parties, issues, or symbols. Theories of selective perception, confirmation biases, and spirals of silence are among these social-psychological explanations. These remain important to theories or models of attitude change, but the role of cognitive processes in attitude change is different from cognitions being the target of propaganda or influence operations.
A famous example is the notion of 2 + 2 = 5. George Orwell (1949) used this mathematically incorrect expression in his dystopian novel Nineteen Eighty-Four. If the interrogator, O’Brien, could influence Winston Smith to eventually believe that this mathematically incorrect statement is true, when it is self-evidently false, then O’Brien would have succeeded in his efforts in the Ministry of Love to gain Smith’s compliance.
Orwell had worked for the BBC during the second world war and used this 2 + 2 = 5 anti-intellectual meme in an earlier essay to counter Nazi propaganda, which routinely denied developments that were self-evidently true. But the expression 2 + 2 = 5 predated Orwell, appearing for instance in a collection of stories by Alphonse Allais entitled Two Plus Two Make Five, published in 1895. So cognitive politics is not new at all, but it is being resurrected in the twenty-first century.
The Kremlin calling the invasion of Ukraine a ‘Special Military Operation’ (SMO) is a good example. Elites and citizens in Russia were not permitted to call it a war even though it is self-evidently a war to much of the rest of the world. The SMO is a political truth rather than a self-evident truth.
Challenging Any Truth
That said, O’Brien, the interrogator, went beyond convincing Winston Smith that 2 + 2 = 5, arguing that the answer could be three, four, or five – any answer at all. There was no single truth. The rise of social media and of information wars in domestic and international arenas has revived the significance of operations that fill public communication with multiple perspectives on the truth. Troll farms and bots deluging internet users with overwhelming amounts of often contradictory information can confuse readers, leaving them feeling that they cannot trust anyone or any answer (Pomerantsev 2015; 2019). Instead of the internet and social media overcoming a lack of information, the use of computational propaganda in orchestrated and often decentralized information operations can leave the public even less informed and less trusting of what they are being told (Woolley and Howard 2018).
The Need for Case Studies: Grounding Notions of Cognitive Politics
General concern over information or cyber warfare is genuine, but its actual use and effects can be under- or over-estimated. Much hype surrounds information wars, but that does not mean they are unimportant. There does seem to be a war on information, or on the use of information such as in propaganda, to fight domestic, partisan, and international battles. Many questions arise around this concept of cognitive politics. Consider the following:
- Does war propaganda reflect continuity over the decades, or are there new aspects beyond the new media? Has cognitive politics become more central to propaganda?
- How have new media like the internet and related digital media changed the production or reception of propaganda and influence campaigns?
- Are notions of misinformation and disinformation too narrow? A focus on getting the facts right seems to miss efforts to make psychological appeals to patriotism, nationalism, and ‘us versus them’ frameworks.
- Are old models, such as the spiral of silence or the confirmation bias, more rapidly and effectively implemented in the online context? Do they provide useful models for the study of cognitive politics?
- How is social engineering, such as the re-education and socialization of children – clearly a form of cognitive politics – designed in relation to broader information and media campaigns?
- Most importantly, how can cognitive politics be identified and countered more effectively? What awareness raising, education, or technical initiatives might be useful in countering cognitive politics? Alternatively, must everyone engage more explicitly in cognitive politics?
- Will the internet and social media enable Fifth Estate (Dutton 2023) efforts to hold political truths to account – making actual developments more self-evident?
As Lasswell (1936: v) argued, a renewed focus on the study of influence is a valuable response “when influence is sought by novel means or under changed circumstances.” The information war over Ukraine is just such a moment.
I am working with colleagues on a case study of the use of all media by actors in the Russian invasion of Ukraine. With some initial support from the Portulans Institute, and in collaboration with colleagues in centres at Oxford and other universities, we plan to use the influence operations around the Ukraine war to provide insights into propaganda in the digital age. See our project page on the Ukraine Case Studies at: https://portulansinstitute.org/case-studies/
Dutton, W. H. (2023), The Fifth Estate: The Power Shift of the Digital Age. New York: Oxford University Press.
Lasswell, H. D. (1936), Politics: Who Gets What, When, and How. New York: McGraw-Hill Book Company.
NATO (2023), North Atlantic Treaty Organization, ‘Cognitive Warfare: Strengthening and Defending the Mind’, ACT, 5 April: https://www.act.nato.int/articles/cognitive-warfare-strengthening-and-defending-mind
Orwell, George (1949), Nineteen Eighty-Four. New York: Harcourt, Brace and Company.
Pomerantsev, Peter (2015), Nothing is True and Everything is Possible. London: Faber & Faber.
Pomerantsev, Peter (2019), This is Not Propaganda: Adventures in the War Against Reality. London: Faber & Faber.
Woolley, Samuel C., and Howard, Philip N. (2018) (eds), Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media. New York: Oxford University Press.