Resilience Against the Dark Arts : A Comparative Study of British and Swedish Government Strategies Combatting Disinformation

Academic year: 2021

Peace and Conflict Studies III Bachelor Thesis

12.0 hp Spring 2021

Supervisor: Kristian Steiner

Word count: 12606

RESILIENCE AGAINST THE DARK ARTS

A COMPARATIVE STUDY OF BRITISH AND SWEDISH GOVERNMENT

STRATEGIES COMBATTING DISINFORMATION


Abstract

Western liberal democracies currently face a significant challenge from the growing proliferation of disinformation. With research suggesting that disinformation increases the risk of violence and intergroup conflict, this thesis sought to understand precisely what is being done by states to decrease the likelihood of this happening—specifically, how the United Kingdom compares with and differs from Sweden in the types of resilience strategies employed to combat disinformation. To answer this question, this thesis conducted a qualitative comparative content analysis of government communications, identifying, codifying, and describing the different types of resilience strategies combatting disinformation as practised by the United Kingdom and Sweden, to serve as a repository aid in future intervention planning. Utilising a bespoke analytical framework to make sense of resilience strategies of differing scales, a micro–macro perspective was adopted to capture (1) bottom-up strategies, which seek to enhance an individual’s ability to independently evaluate the accuracy of the information they consume, and (2) top-down strategies, which seek to reduce societal exposure to disinformation through structural interventions. This thesis demonstrates that the United Kingdom and Sweden share approximately two-thirds of their disinformation resilience strategies with one another. From 472 items sourced from British and Swedish government communications, this study uncovered 15 micro strategies and 59 macro strategies in total—which, at face value, suggests a genuine bias in favour of a macro strategic resilience approach. Whether this emphasis is suitable for effective societal resilience against disinformation remains inconclusive and warrants further research.

Keywords: Disinformation, Resilience, Qualitative, Comparative, Content Analysis


List of Tables

Table 1 Logic of a Micro Strategy... 13

Table 2 Logic of a Macro Strategy... 14

Table 3 Content Selection Rules... 17

Table 4 Strategy Identification Rules... 17

Table 5 Example of a Macro Strategy... 18

Table 6 British Media and Information Literacy sub-strategies... 20

Table 7 Swedish Media and Information Literacy sub-strategies... 21

Table 8 British Digital Educational Material sub-strategies... 22

Table 9 Swedish Digital Educational Material sub-strategies... 23

Table 10 British Independent Journalism and Media sub-strategies... 24

Table 11 Swedish Independent Journalism and Media sub-strategies... 26

Table 12 British Strategic Communication sub-strategies... 27

Table 13 Swedish Strategic Communication sub-strategies... 30

Table 14 British Situational Awareness sub-strategies... 31

Table 15 Swedish Situational Awareness sub-strategies... 31

Table 16 British Civil Society sub-strategies... 32

Table 17 Swedish Civil Society sub-strategies... 33

Table 18 British Technological sub-strategies... 34

Table 19 Swedish Psychological Defence sub-strategies... 35

Table 20 British Legislative sub-strategies... 36

Table 21 Components of the Micro Strategy... 37


Table of Contents

1. Problematisation ... 1

1.1. Introduction ... 1

1.2. Purpose and Problem Formulation ... 1

2. Literature Review ... 3

2.1. The Contested Information Environment ... 3

2.2. Disinformation, Definitions and Epistemology ... 7

2.3. Previous Peace Research on Disinformation ... 8

3. Analytical Framework ... 10

3.1. Introduction to Resilience ... 10

3.2. Reflections on Resilience ... 12

3.3. Resilience Framework ... 12

4. Methodology ... 15

4.1. Design ... 15

4.2. Source Discussion ... 15

4.3. Data Collection Process ... 16

4.4. Analytical Discussion ... 19

4.5. Reliability ... 19

5. Analysis ... 20

5.1. Micro Strategies ... 20

5.2. Macro Strategies ... 24

5.3. Discussion ... 37

6. Conclusion ... 39

References ... 40

Literature ... 40

Data collection ... 51


1. Problematisation

“War today is in the process of undergoing another evolution in response to social and political conditions, namely the speed and interconnectivity associated with contemporary globalisation and the information revolution.”

—Emile Simpson, War from the Ground Up

1.1. Introduction

The use of information as an instrument of power, in times of war and peace alike, is as old as civilisation itself, but with over half the world’s population now connected to an instantly responsive global communications network, a considerable number of novel challenges have emerged concerning the rise of subversive influences (World Economic Forum, 2021, pp. 53–54). Governments, political campaigns, corporations and regular citizens from all over the world are increasingly utilising human resources and digital technologies to carry out large-scale manipulation in a deliberate attempt to shape social life (Bradshaw et al., 2020; Woolley & Howard, 2019b). Of particular note are sophisticated information campaigns that precede and augment military operations, deliberately targeting civil society with misleading content in support of political objectives—as witnessed during the Russia-Ukraine conflict in 2014 (Mejias & Vokuev, 2017). This sort of informational weaponry is commonly referred to as disinformation—deliberately misleading information that has been fabricated to exploit a target audience to attain some vested interest (Lanoszka, 2019). What is more, research suggests that disinformation increases the risk of intergroup conflict and violence (Arayankalam & Krishnan, 2021; Ward & Beyer, 2018). To make matters more complex, state actors are increasingly outsourcing disinformation activities to a multitude of third-party actors (e.g. private strategic communications firms) to reap the advantages of plausible deniability (Brookings, 2021). This erosion of the distinction between the public and the private concerning the attribution of disinformation further complicates conventional understandings of war and peace. In light of these developments, European states are now taking political action to raise awareness and strengthen societal resilience against disinformation (European Commission, 2018, p. 5).

1.2. Purpose and Problem Formulation

One of the primary goals of conflict resolution is to get in front of a potential conflict before it begins to “increase the range of situations where violence is not a possibility” (Ramsbotham et al., 2017b, p. 146). In consideration of disinformation’s proliferation and the evidence suggesting an increased risk of intergroup conflict and violence, this thesis seeks to understand what strategies are being employed by state actors to mitigate the likelihood of violence arising from disinformation, for the benefit of future intervention planning.

The United Kingdom (UK) and Sweden both offer promise as two worthwhile cases to study for the purposes of this task for several reasons. Firstly, the UK and Sweden are two state actors who have been targeted by state-sanctioned disinformation attacks (DIIS, 2020; Internetkunskap, 2021). Secondly, these two states have taken deliberate measures to safeguard their societies against disinformation (MSB, 2019; UK Government Communication Service, 2019). Lastly, the researcher is acquainted with the languages and national security strategies of these two states—which, in principle, ought to lead to more compelling and productive work on behalf of the researcher.

The ambition of this study is not to provide an exhaustive account of all counter-disinformation strategies that are presently employed within the UK and Sweden, but rather, to illuminate the different dimensions of strategy that are being employed to combat disinformation as identified within government communications. By analysing the various types of strategies that are being employed by the British and Swedish governments to strengthen societal resilience against disinformation, I aim to illuminate tried and tested practices to serve as a repository aid in future intervention planning.

To accomplish these aims, this thesis seeks to answer the following research question:

1. How does the United Kingdom compare with and differ from Sweden in the type of disinformation resilience strategy it employs?

To answer this research question, the following sub-questions will be analysed:

I. What strategies have the United Kingdom and Sweden employed to strengthen the individual’s ability to evaluate the information they are exposed to?

II. What strategies have the United Kingdom and Sweden employed to reduce societal disinformation exposure?

2. Literature Review

Our diffuse and complex digital information ecology makes the study of disinformation a highly challenging task for any researcher or analyst. Therefore, the following literature review aims to synthesise perspectives from across a wide variety of fields for the purpose of clarifying the wider social and technological backdrop that underpins subversive communications and the emergence of disinformation as a contemporary security threat. Additionally, the conceptual and epistemological debate around the term ‘disinformation’ will be discussed, before concluding with a discussion concerning previous research about disinformation within the field of Peace and Conflict Studies (PACS).

2.1. The Contested Information Environment

Kornienko et al. (2015) claim that the development of our modern communicative information society has transformed the very nature of global power relations. Traditional and non-traditional actors alike are becoming increasingly empowered by emerging information technologies to influence rules, institutions and social outcomes (Padovani & Pavan, 2011, pp. 553–554). Furthermore, research shows that greater access to information, social networks, and collective opportunities proves to be a significant enabler of increased civic engagement and political activity (Vaccari, 2017), not least among formerly underprivileged groups, who can now participate in political affairs at levels previously unimaginable (Singh, 2002, 2013). What is more, the traditional gatekeepers and information brokers of the twentieth century no longer hold the same influence over their targeted audiences now that knowledge curation has effectively become ‘democratised’ (Hussain, 2012; Waltzman, 2017, p. 2). Simply put, a growing multiplicity of actors now have the opportunity to communicate with far-reaching audiences all over the globe at extremely low cost—which includes the possibilities of enhanced autonomy, anonymity and the chance to be whoever, whatever and wherever one wishes (Vartapetiance & Gillam, 2014).

Following these substantial social changes are profound structural transformations across the global information environment. In 2018, the International Data Corporation (IDC) released a report predicting that the global datasphere will increase fivefold over seven years—from 33 zettabytes (ZB) in 2018 to 175 ZB in 2025 (Rainsel et al., 2018). This surge in data growth follows the rising number of people and devices connecting to the internet each year, with projections estimating that 5.3 billion people will be online in 2023, up from the 3.9 billion people that were reportedly online in 2018 (Cisco, 2020, p. 5). Likewise, networked devices are predicted to reach 29.3 billion globally in 2023, up from 18.4 billion in 2018 (Cisco, 2020, p. 29). This amplification of global communications activity has amassed alongside the proliferation of algorithms and bots, which is profoundly altering human information exposure and consumption habits through the enhanced automation of content delivery (Sarts, 2021; Woolley & Howard, 2019b). As the abundance of information intensifies, some scholars have noted how people’s motivation and ability to attentively process information diminishes (Petty & Cacioppo, 1986, p. 128). In the words of Weng et al., “the abundance of information to which we are exposed through online social networks and other socio-technical systems is exceeding our capacity to consume it” (Weng et al., 2012, p. 1). It is against this background that Lin (2019) observes subversive influences competing to capture our scarce attention within an increasingly vast and contested information environment. Each and every day, people the world over are confronted with more and more information that is designed to bolster or challenge previously held beliefs and values (Taber & Lodge, 2006). Jack (2017) highlights the many different forms of subversive influence that now target audiences to achieve these ends—like advertising, public relations and information operations.

Of grave concern here, and the focus of this thesis, is the growing prevalence of actors who deliberately misrepresent information to achieve their desired aims (Woolley & Howard, 2019b), a concern shared by many people throughout the world. A comprehensive 2020 survey found that 56% of sampled respondents (across 40 countries worldwide) are now concerned over the accuracy of the news they read online (Newman et al., 2020). Elsewhere, some of the world’s most powerful states now openly acknowledge that information technology is being harnessed by a wide range of actors to target and manipulate the perceptions of their domestic populations for political ends (FMPRC, 2020; Library of Congress, 2020; Official Journal of the European Union, 2018). Moreover, mounting evidence suggests that disinformation is becoming more and more prevalent within election interference efforts globally (Baines & Jones, 2018; Freedom House, 2017). According to Whyte and Etudo (2020, p. 125), the bulk of this effort is accomplished via direct audience engagement—using trolls, bots and an array of ‘useful idiots’ to disseminate fabricated information. Simply put, the ever-growing global information environment has given rise to an increased ‘attack surface’ of vulnerabilities that are open to exploitation by a multiplicity of actors (Whyte et al., 2020, p. 3). As Dr Kello puts it, it is now possible to cause “significant harm to a nation’s political, economic and social life without firing a single gun” (Select Committee on International Relations, 2018, p. 29).


One notable case that illuminates the grave implications of disinformation is Russia’s sustained information campaign during the Russia-Ukraine conflict in 2014. Khaldarova and Pantti (2016, pp. 3–4) describe how Russian state-owned media outlets were used by the government to disseminate ‘strategic narratives’ throughout the Ukrainian crisis to shape the perceptions and actions of the domestic population and compatriots in Russia’s near abroad. One such narrative centred on a rising existential fascist threat in Ukraine that leaned heavily on pre-existing enemy symbolism from World War II. Other narratives comprised strong identity and ethnic themes, like a ‘Russian Slavic Orthodox Civilization’ standing in defiant resistance to a ‘decadent’ Europe (NATO StratCom, 2015, p. 4). Such influence tactics are said to prey upon pre-existing social divisions and anxieties, using deliberately misleading information to sow distrust and division amongst communities—so as to divide and control the targeted audience (Karlsen, 2019). A more evident effect of disinformation during the Russia-Ukraine conflict was demonstrated in February 2014, when unmarked Russian military units surfaced across Ukraine’s Crimean Peninsula amidst the political turmoil—seizing airfields, key administrative buildings and other strategic points, including Crimea’s parliament building, which would promptly facilitate the snap referendum that brought about Russia’s annexation of Crimea (Grant, 2015). Responding to these unfolding developments at the time, Russian state-owned media espoused what would soon be understood as another government ‘strategic narrative’—describing the unmarked armed units as “similarly dressed and equipped to the local ethnic Russian ‘self-defence squads’” (The Guardian, 2014). Likewise, responding to direct questioning on the origins of these units in Crimea, Russian president Vladimir Putin made the state’s position firmly clear: “those were local self-defence units” (Kremlin Russia, 2014). It would not be until almost one year after Russia’s annexation of Crimea that Vladimir Putin would publicly lay bare Russia’s role there:

“In order to block and disarm 20,000 well-armed [Ukrainian soldiers], you need a specific set of personnel. And not just in numbers, but with skill. We needed specialists who know how to do it”

“That’s why I gave orders to the Defense Ministry -- why hide it? -- to deploy special forces of the GRU (military intelligence) as well as marines and commandos there under the guise of reinforcing security for our military facilities in Crimea” (RFE/RL, 2019).

By partaking in informational manoeuvring of this sort to create ambiguity and produce bandwidth challenges for onlooking states, actors can remain beneath the threshold of war and bypass potential military confrontation with other states to achieve favourable political outcomes (Deibler, 2020, p. 136). This is arguably why a number of states now seek to achieve ‘discourse dominance’ or ‘information advantage’ in pursuit of political advantage through the shaping of audience perceptions (Kania, 2020; UK Ministry of Defence, 2018). Moreover, because of this rising competitive pressure within the global information environment, states are co-evolving and updating their national security strategies to compete accordingly, with inevitable cascading consequences (Ruhmann & Bernhardt, 2019). Copious military doctrines now reveal how this information contest stretches deep into the civilian realm (UK Ministry of Defence, 2018, pp. 1–2; U.S. Department of Defense, 2016, p. 2), eroding distinctions between military and civilian—war and peace—and perhaps amounting to what Dr Kello refers to as “unpeace” (Select Committee on International Relations, 2018, p. 29). It is no exaggeration to say that state-coordinated information operations now extend well beyond conflict zones and deep into the heart of public life.

The growing prevalence of subversive influences resulting from sophisticated strategic manipulation is now widely studied across a wide array of disciplines—from international relations (Wohlforth, 2020), journalism studies (Waisbord, 2018) and media and communications (Taddicken & Wolff, 2020) to security studies (Carter & Carter, 2021). Likewise, a number of different terms are being used to capture and describe the challenges emerging from the prevalence of false information in our digitalised communications society—like computational propaganda (Woolley & Howard, 2019a), disinformation (Lanoszka, 2019), fake news (Vargo et al., 2018), information disorder (Wardle & Derakhshan, 2017), information warfare (Klein, 2018), misinformation (Vraga & Bode, 2020), political warfare (Smith, 1989), post-truth (Haack, 2019) and truth decay (Kavanagh & Rich, 2018).

Owing to the far-reaching nature of this phenomenon and the copious terms being used to describe the driving forces behind it, it is necessary to alleviate the conceptual saturation around this study’s focus area and justify the appropriate terminology for the purposes of this research. As mentioned earlier, this thesis is exclusively concerned with the type of information that contains deliberately misleading elements—that is, information that is misleading by intent, rather than by accident. There are a number of terms that are used interchangeably to describe misleading information that is spread under the guise of factual information—these are ‘propaganda’, ‘misinformation’, and ‘disinformation’ (Guess & Lyons, 2020). The term propaganda remains vague and ill-defined; however, whilst most scholars accept that it constitutes a deliberate process to persuade an audience to further some agenda, albeit at times unethically, many would agree that propaganda still has the capacity to be truthful (Guess & Lyons, 2020). Misinformation and disinformation, by contrast, both unequivocally refer to inaccurate information, with the underlying intentions of the message being the key differentiator between the two (Wu et al., 2019). Quite simply, misinformation can be an unintentional affair, where individuals with benign motivations spread inaccurate information within their networks because they genuinely believe it to be true, despite facts to the contrary. In contrast, disinformation is intentionally misleading, which makes this concept the primary focus of this thesis’s investigation.

2.2. Disinformation, Definitions and Epistemology

Fallis (2015) defines disinformation as “misleading information that has the function of misleading” (Fallis, 2015, p. 413). This definition shares parallels with the Swedish understanding of disinformation, which defers to the Swedish Civil Contingencies Agency (MSB) definition: “disinformation refers to incorrect or manipulated information that is deliberately disseminated for the purpose of misleading” (MSB, 2018, p. 25). However, in the strictest sense, these definitions of disinformation would capture harmless things, like jokes and satire, which function to mislead their targeted audience, albeit temporarily, to achieve humorous ends. But as some scholars point out, satire and humour are typically absent from accusations of disinformation on account of the short-lived nature of the deception and the humorous intentions of the communicator (Lanoszka, 2019, p. 3; Meinert et al., 2018, p. 486). The implication of this suggests the need to probe deeper into the nature of the intentions underpinning the message.

Wardle and Derakhshan (2017) provide a definition for disinformation that accounts for the hostile intentions that can underpin a message: “information that is false and deliberately created to harm a person, social group, organization or country” (Wardle & Derakhshan, 2017, p. 20). This definition would exclude lighthearted things like jokes, satire, and sarcasm—but it would also exclude things like marketing campaigns that seek to deliberately mislead audiences for financial gain. For example, a guerrilla marketing campaign that seeks to influence the audience’s purchasing behaviour using deliberately misleading information (Rtec Auto Design, 2016), or a public relations company that is paid to pump out misleading positive narratives for a would-be political candidate during election time (Buzzfeed News, 2020). In both these cases, one would find it extremely difficult to demonstrate harmful intentions; however, misleading schemes of this sort are still arguably counterproductive to preserving societies’ ability to communicate in a peaceful and legitimate manner. In contrast, if we examine how the UK’s Government Communication Service defines disinformation, we find a much more comprehensive definition that captures a wider spectrum of interests that may underpin disinformation campaigns: “disinformation is the deliberate creation and dissemination of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain” (UK Government Communication Service, 2019, p. 6).

No less important than unpacking disinformation’s conceptual variance is the need to clarify what exactly is required to make an allegation of disinformation in the first place. In other words, how does one know disinformation when one sees it? After all, it is all too easy to assume that disinformation merely relates to some objective truth—which it does, if thought of in strictly rational terms. In practice, however, how does one measure the intentions behind a message? Or measure the truthfulness of a claim? Particularly in political matters, where knowledge is continually contested and in flux? With respect to the latter point, Vraga and Bode (2020) provide an invaluable assessment of the epistemological considerations that underpin the confidence behind misinformation allegations, particularly in proving that information is inaccurate—a criterion that is equally found within the concept of disinformation. Vraga and Bode’s research demonstrates that consensus among experts and the levels of evidence available within the information environment are the key determining factors that permit allegations pertaining to an information’s inaccuracy. This raises some significant questions: who qualifies as an expert? What level of agreement must experts satisfy to reach expert consensus? What qualifies as the best evidence for emerging issues? Circling back to the first epistemological challenge concerning disinformation: how can one practically determine the intentions underpinning information? Wanless and Pamment (2019, p. 4) highlight the inherent immeasurability of such a requirement, particularly at scale, which proves to be extremely problematic for researchers and analysts looking to capture instances of disinformation, in addition to the difficulties in discerning between misinformation and disinformation respectively.
This conundrum was even acknowledged within a recent Swedish government bill, which recognised how external information influence campaigns can be extremely difficult to separate from legitimate domestic opinion (Försvarsdepartementet, 2020, p. 61). The epistemological challenges outlined here serve as a warning to conflict scholars and analysts that a cautionary approach to their methods in the study of disinformation is vital.

2.3. Previous Peace Research on Disinformation


(Ramsbotham et al., 2017a, p. 425). Mass communications are a significant area of study for scholars of PACS on account of their influence over how societies organise themselves (Ramsbotham et al., 2017a, p. 420). The impact of mass communications upon successful conflict resolution is typically expressed as a double-edged sword. On the one hand, communications inform, educate and unite people—but on the other hand, they also mislead, divide and turn people against one another (Gallacher et al., 2021). The latter occurs all too naturally amid the circulation of disinformation (Ward & Beyer, 2018). Not least since information distortion via disinformation can lead to dangerous misperceptions about another actor’s intentions—which may have devastating consequences in the event of a miscalculation (Dreze, 2000, p. 1177).

The study of malign communications within PACS generally falls under the purview of ‘Peace Journalism’ (Lynch & McGoldrick, 2007). In the 2007 Handbook of Peace and Conflict Studies, disinformation is briefly mentioned as a distinct military technique that plays a function in the shaping of misleading news representations about a conflict (Lynch & McGoldrick, 2007, pp. 248–249). However, disinformation’s significance within Peace Journalism is typically regarded as a subsidiary element that shapes news representations of war, not as a phenomenon of particular significance in and of itself, much less during peacetime. Perhaps this is why there appears to be a distinct absence of research on the topic of disinformation within the prevailing PACS scholarship. The Journal of Peace Research (JPR), the Journal of Conflict Resolution (JCR) and the journal Conflict Management and Peace Science (CMPS) hold zero publications that contain the word ‘disinformation’ within the title of their works. The Peace Review journal returned one publication from 1993, titled ‘DisInformation, DatInformation’, a five-page article that discusses U.S. CIA disinformation campaigns in South America during the Cold War (Sharkey, 1993). Quite simply, there is a distinct lack of research and debate within PACS on the phenomenon of disinformation as a discernible risk to intergroup conflict and violence. In view of this aperture within the field of PACS, this thesis seeks to provide a modest offering that describes functioning resilience strategies against disinformation to encourage further debate and contribute towards future intervention planning.


3. Analytical Framework

Here this study presents a bespoke analytical framework for understanding the types of strategies currently employed by the United Kingdom and Sweden to reduce the threat of disinformation, in consideration of the increased risk of violence and intergroup conflict that disinformation poses (Arayankalam & Krishnan, 2021; Ward & Beyer, 2018). The framework is developed from insights derived from the academic literature and research on resilience. In particular, this thesis draws on insights that enable it to adequately make sense of the forthcoming resilience strategies for the purposes of classification.

3.1. Introduction to Resilience

As a form of non-kinetic attack that does not quite meet conventional understandings of aggression, let alone meet the threshold of war, disinformation is proving extremely problematic to deter within Western liberal democracies—not least due to the difficulties in sourcing attribution (EU DisinfoLab, 2020). Such challenges are contributing to a growing perception within state policy of a more risky, complex and uncertain world (Joseph, 2016). In light of these developments, state decision-making is seen as too fragmented and weakly coordinated to be capable of dealing with the rapid pace of technical and ecological change (Duit et al., 2010). It is for this reason that there is a growing recognition that responding to each national security threat is now realistically beyond the scope of the state’s security apparatus (Braw, 2020). This has led to increased calls for the strengthening of societal resilience—a concept that emphasises personal responsibility, awareness and self-regulation in the face of increasing shocks and uncertainty (Joseph, 2016, p. 14). Western states, multilateral organisations and NGOs are increasingly advocating for societal resilience to combat and mitigate the sort of challenges posed by threats such as disinformation (Atlantic Council, 2019; Council of the European Union, 2020; NATO, 2020). “Governing complexity is thereby understood to be a process whereby failures or unintended outcomes can be seen as an inevitable part of that process and the key aspect is how failure is reflected upon to shape future policy-making” (Chandler, 2014, p. 12).

Etymologically, the term ‘resilience’ originates from the Latin verb resilire (to rebound or recoil), which entered the English language sometime in the early 17th century—undergoing several evolutions in its form: the capacity of an ecosystem to tolerate shock and recover, the ability of children and adults alike to tolerate traumatic and adverse situations, and the ability of a system to withstand and overcome crises in a complex world (McAslan, 2010). Today, however, particularly in the field of international security, resilience generally refers to society’s ability (encompassing all of its individuals) to respond to and recover from the impact of shocks—including society’s ability to adapt through learning, changing and re-organising to better cope with future threats (Cutter et al., 2008, p. 599). Fundamentally, resilience relates to a system’s capacity and not some outcome, a point that is conceptually vital for understanding the concept’s underlying essence—which is primarily about improving the individual’s ability to make decisions over their own lives in order to build more resilient societies (Béné et al., 2016, p. 125). In the words of Chandler (2012), “the resilient subject (at both individual and collective levels) is never conceived as passive or as lacking agency, but is conceived only as an active agent, capable of achieving self-transformation” (Chandler, 2012, p. 217). Chandler recognises that resilience is a normative concept, on account of its relative quality: if one wants to adequately measure it, resilience must be placed in relation to some preconceived outcome, since it is not directly observable in and of itself (Sturgess, 2015, p. 7). So in practical terms, resilience is context-specific and necessitates tailor-made solutions to address specific risks (European Commission, 2017, p. 23).

Since resilience is ultimately a normative, capacity-enabling endeavour, the question then becomes a matter of determining the appropriate means to build it. In the case of strengthening resilience against disinformation, practitioners generally adopt a systems-oriented approach—focusing on exogenous influences and proposing top-down initiatives to shape them, such as creating fact-checking institutions or communicating positive narratives to the targeted audience (European Commission, 2017, p. 16). Such interventions are what Lazer et al. (2018, p. 1095) refer to as ‘structural’ interventions—whereby practitioners strive to shape external influences to prevent individuals from being exposed to disinformation in the first place. But there are clear risks in leaning too much on this approach, as Lazer et al. note: “fact checking might even be counterproductive under certain circumstances” (Lazer et al., 2018, p. 1095). By way of example, a 2014 randomised trial that set out to increase child vaccination rates by testing the effectiveness of counterattitudinal persuasive interventions proved markedly detrimental to the desired outcome (Nyhan et al., 2014). The trial sought to reduce misunderstandings surrounding vaccine scepticism by using factual communications to clear up general misconceptions, an endeavour that ended up having no impact on increasing parental intent to vaccinate a future child. On top of that, parents who had the least favourable vaccine attitudes going into the study were reportedly even less likely to vaccinate their children after these deliberate interventions. These findings suggest that for distrustful people who are likely to resist persuasion, the targeted attitude often becomes significantly stronger after attempted counterattitudinal persuasive interventions, like fact-checking—a conclusion that is supported by the psychological literature (Tormala & Petty, 2004). Likewise, these findings may indicate that concentrating one’s intervention strategy against disinformation on shaping exogenous factors, at the expense of improving the individual’s cognition, may be counterproductive to building effective resilience. In the words of Julia Koller, a lead developer for learning solutions: “information is only as reliable as the people who are receiving it. If readers do not change or improve their ability to seek out and identify reliable information sources, the information environment will not improve” (Pew Research Center, 2017).

3.2. Reflections on Resilience

Against this background, it seems clear that certain resilience strategies will prove unsatisfactory, if not at times harmful, to the very capacity that resilience advocates seek whenever individual cognition is overlooked. In contrast to focusing on exogenous influences, an approach that focuses on the individual—such as an education initiative that seeks to strengthen a person’s capacity to critically evaluate the information they consume, rather than telling them what is true—places the agency of the individual most in need of help front and centre (Chandler, 2012, p. 216). Research suggests that such bottom-up interventions, even on a small scale, are rather effective at assisting an individual’s ability to perceive the accuracy of false information (Guess et al., 2020). If we recall Chandler’s understanding of resilience mentioned earlier, resilience can be understood as the symbiosis of a group of individual capacities coming together. Reasoning from this, societal resilience is simply individual capacity at scale. To suggest that resilience transcends the individual in some way, as some scholars do (Humprecht et al., 2020), is arguably to commit a fallacy of misplaced concreteness: mistaking something quite abstract for the way things are in actuality. Likewise, to prioritise shaping exogenous factors within a resilience strategy in lieu of building individual capacity is to potentially undermine the very endeavour of resilience building entirely.

3.3. Resilience Framework

Since this thesis aims to identify and analyse the different types of resilience strategies being employed by the UK and Sweden, the following analytical framework provides a predetermined structure for classifying the different forms of strategy that will be identified in the forthcoming analysis. In view of the preceding considerations concerning individual-focused and structural-focused interventions, inspired by Lazer et al.’s (2018) dichotomy of interventions against fake news, the following analytical framework is designed to describe a state’s resilience strategy through two distinct strategy types. The first type of strategy focuses on strengthening an individual’s ability to make decisions over their own lives, whilst the second focuses on exercising top-down initiatives that aim to shape exogenous influences. To identify these distinct strategies within a body of text, there ought to be clear indicators for each type of resilience strategy to guide the forthcoming analysis.

The first type of resilience strategy of interest to this study is specifically concerned with efforts to strengthen the individual’s ability to evaluate the information they are exposed to, which will be referred to as a micro strategy for the purposes of this study (see table 1). To locate a micro strategy within a piece of content, the researcher will search for the combination of (1) a learning effect—constituting some desired effect that is believed to enhance an individual’s ability to independently evaluate the accuracy of information and (2) the necessary means—constituting the specified solution to achieve the desired learning effect in question.

Table 1

Logic of a Micro Strategy

Element | Indicator | Identifier
Learning Effect | A sentence constituting some desired effect that is believed to increase an individual’s capacity to independently evaluate the accuracy of information. | L
Necessary Means | A sentence constituting the solution to achieve L. | N

The second type of resilience strategy that this study is interested in finding is specifically concerned with state efforts to influence exogenous factors for the purposes of reducing societal disinformation exposure, which will be referred to as a macro strategy for the purposes of this study (see table 2). To locate a macro strategy within a piece of content, the researcher will search for (1) a structural effect—constituting some desired effect that is believed to reduce the risk of disinformation in the environment and (2) a policy approach—the presumed solution to achieve the desired structural effect.


Table 2

Logic of a Macro Strategy

Element | Indicator | Identifier
Structural Effect | A sentence constituting some desired effect that is believed to reduce the risk of disinformation in the environment. | S
Policy Approach | A sentence constituting the presumed solution to achieve S. | P


4. Methodology

This chapter presents the methodological strategy that guided the forthcoming analysis to ensure that this study’s research question was reliably answered to the best of the researcher’s ability.

4.1. Design

This thesis chose to conduct a qualitative comparative content analysis for the purpose of answering the research question—how does the United Kingdom compare with/differ from Sweden in the type of disinformation resilience strategy it employs? A qualitative approach is ideally suited for the investigation of meanings in context, that is, it permits the researcher to act as the research instrument in order to gather and interpret the underlying meanings behind the gathered data (Merriam & Tisdell, 2015, p. 2). Given that the UK and Sweden’s resilience strategies are scattered throughout copious publications, absent of classification or grouping, a qualitative approach seems best placed to capture, codify and describe the different dimensions of strategy being employed to combat disinformation. The comparative element constitutes a systematic, rule-based approach for the analysis of “informational contents of textual data” (Forman & Damschroder, 2007, p. 39). It serves two mutually reinforcing purposes for this study: (1) contextual description—enabling the researcher to know what the UK and Sweden’s resilience strategies are like and (2) classification—reducing the complexity of the UK and Sweden’s resilience strategies by placing empirical evidence into relevant ‘data containers’ (Landman, 2008, p. 4).

4.2. Source Discussion

This study sought to understand the different types of resilience strategies currently being practised by the UK and Swedish governments and to compare them. The official government websites of the UK and Sweden were therefore chosen as the source of the primary data for this analysis, on account of their direct representation of each government’s official policies. The researcher recognises how the chosen sources may diminish the study’s content validity—which refers to whether the researcher is measuring all the things that ought to be measured in relation to the intended construct (Creswell & Creswell, 2018, p. 215). One could argue that the choice to source this study’s data exclusively from the official government websites is inadequate to faithfully represent the full extent of the UK and Sweden’s resilience strategies; different organs of state and their respective publications may well have provided a richer understanding of each state’s resilience strategy than focusing on a single source. Acknowledging this consideration, an abridged representation of the UK and Sweden’s resilience strategies is likely the best that can be achieved within the scope of this study and its chosen sources. Likewise, the study’s exclusive focus on state policy clearly omits the role that civil society plays in strengthening societal resilience against disinformation, which may have afforded a richer understanding of resilience had it been taken into consideration (Aslama, 2019, 'Meso Level: Public Media and Collaborations'). However, there was simply not enough time to incorporate additional sources that may have been relevant to the purposes of this study.

4.3. Data Collection Process

The study adopted a multi-stage purposeful sampling approach for selecting the data for this study’s analysis—an approach commonly used for the selection of cases related to information-rich phenomena (Palinkas et al., 2015). Purposeful sampling refers to when the researcher has selected their data “based on a specific purpose rather than randomly” (Tashakkori & Teddlie, 2003, p. 713). The study sourced all data pertaining to the UK from gov.uk—the official website of the UK government. Likewise, the researcher sourced all data pertaining to Sweden from regeringen.se—the official website of the government of Sweden. It must be noted that all data sourced from regeringen.se is in Swedish and required translation into English by the researcher for the purposes of this analysis. Hence, all representations of Sweden’s resilience strategy presented within this study are the product of the researcher’s best attempts to faithfully represent the position of the Swedish government in the English language.

4.3.1. Part One: Content Selection

In the first stage of the data collection process, the researcher searched for the term ‘disinformation’ (UK) and ‘desinformation’ (Sweden) using the respective search functions of both websites. The search process identified 241 results from gov.uk (UK) and 231 results from regeringen.se (Sweden). It should be noted that the results often contained additional resources that generally represented the main focus of the content (e.g. an attached PDF report). In such instances, all attached publications were analysed in addition to the parent item, so long as they fulfilled the appropriate selection criteria (see table 3).


Table 3

Content Selection Rules

Rule | Criteria
Search terms | ‘disinformation’ (UK) & ‘desinformation’ (Sweden)
Eligible content types | Web page, PDF
Eligible date range | January 1st, 2014 to May 1st, 2021
Languages | English (UK) and Swedish (Sweden)
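As a rough illustration only, the selection rules in table 3 amount to a simple filter over search results. The record fields and sample data below are hypothetical and do not come from the actual corpus; the selection itself was performed manually by the researcher.

```python
from datetime import date

# Selection criteria from table 3 (field names are illustrative assumptions).
ELIGIBLE_TYPES = {"web page", "pdf"}
DATE_RANGE = (date(2014, 1, 1), date(2021, 5, 1))
LANGUAGES = {"gov.uk": "en", "regeringen.se": "sv"}  # expected language per site

def is_eligible(item: dict) -> bool:
    """Apply the content selection rules from table 3 to a single search result."""
    start, end = DATE_RANGE
    return (
        item["content_type"].lower() in ELIGIBLE_TYPES
        and start <= item["published"] <= end
        and item["language"] == LANGUAGES[item["site"]]
    )

# Three hypothetical search results: only the first satisfies every rule.
results = [
    {"site": "gov.uk", "content_type": "Web page",
     "published": date(2019, 7, 4), "language": "en"},
    {"site": "gov.uk", "content_type": "Video",          # ineligible content type
     "published": date(2019, 7, 4), "language": "en"},
    {"site": "regeringen.se", "content_type": "PDF",
     "published": date(2013, 3, 1), "language": "sv"},   # outside the date range
]
selected = [r for r in results if is_eligible(r)]
```

Each rule is a conjunct, so an item is retained only when every criterion holds—mirroring how a result failing any one rule in table 3 was excluded from the analysis.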

4.3.2. Part Two: Strategy Identification

The next stage of the data collection process involved the analysis of all 241 results from gov.uk (UK) and all 231 results from regeringen.se (Sweden). The analytical framework outlined in the previous chapter provided the basis for the following sub-questions to guide this process:

1. What strategies have the United Kingdom and Sweden employed to strengthen the individual’s ability to evaluate the information they are exposed to? (Micro)

2. What strategies have the United Kingdom and Sweden employed to reduce societal disinformation exposure? (Macro)

The unit of analysis within the chosen dataset is the sentence, which was coded accordingly. Using the analytical framework outlined in the previous chapter, all text from the 241 results from gov.uk (UK) and 231 results from regeringen.se (Sweden) were examined to detect the combination of any two elements that qualify as a micro strategy or macro strategy respectively (see table 4).

Table 4

Strategy Identification Rules

Type | Element | Indicator | Identifier
Micro Strategy (individual) | Learning Effect | A sentence constituting some desired effect that is believed to increase an individual’s capacity to independently evaluate the accuracy of information. | L
Micro Strategy (individual) | Necessary Means | A sentence constituting the solution to achieve L. | N
Macro Strategy (exogenous) | Structural Effect | A sentence constituting some desired effect that is believed to reduce the risk of disinformation in the environment. | S
Macro Strategy (exogenous) | Policy Approach | A sentence constituting the presumed solution to achieve S. | P

The particular constellation of words and statements within a sentence that qualifies as an element by meeting the aforementioned criteria was referred to as a meaning unit for the purposes of coding (see table 5). A meaning unit refers to a sentence that corresponds to the same central meaning, whether that is a learning effect, the necessary means, a structural effect, or a policy approach (Graneheim & Lundman, 2004, p. 106). It should be noted that the discovery of a single element within a publication (e.g. a learning effect) without its corresponding supporting element (e.g. the necessary means) fails to qualify as a complete strategy and was disregarded accordingly. On the other hand, content that contained more than one corresponding supporting element (e.g. the necessary means or a policy approach) for a desired effect was aggregated to the strategy in question. Lastly, any duplicate strategies identified throughout the course of the analysis were discarded for the purposes of clarity.

Each strategy identified within the analysis was given a unique three-digit identification number—with the letters ‘MI’ placed beforehand to denote a micro strategy (e.g. MI001) and the letters ‘MA’ placed beforehand to denote a macro strategy (e.g. MA001). Likewise, each piece of content that revealed a strategy during the analysis was provided a unique three-digit identification number for the purposes of ‘source’ classification—with the letters ‘UK’ placed beforehand to denote the item’s British origins (e.g. UK001) and the letters ‘SE’ placed beforehand to denote the item’s Swedish origins (e.g. SE001).
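The pairing, de-duplication and labelling rules described above can be sketched as follows. The data structures and sample entries are purely illustrative assumptions—the actual coding was performed manually by the researcher, not computationally.

```python
from dataclasses import dataclass, field

@dataclass
class Strategy:
    """One candidate strategy: a desired effect plus its supporting element(s)."""
    kind: str                 # "MI" (micro) or "MA" (macro)
    source_id: str            # content identifier, e.g. "UK001" or "SE001"
    effect: str               # learning effect (L) or structural effect (S)
    means: list = field(default_factory=list)  # necessary means (N) / policy approaches (P)

def assign_ids(candidates):
    """Discard incomplete and duplicate strategies, then assign sequential
    identifiers such as MI001 / MA001, mirroring the scheme described above."""
    counters = {"MI": 0, "MA": 0}
    seen = set()
    coded = {}
    for s in candidates:
        key = (s.kind, s.effect)           # duplicates share the same central meaning
        if not s.means or key in seen:     # incomplete or duplicate: disregarded
            continue
        seen.add(key)
        counters[s.kind] += 1
        coded[f"{s.kind}{counters[s.kind]:03d}"] = s
    return coded

coded = assign_ids([
    Strategy("MA", "UK001", "empower independent media",
             ["building capacity and raising the professionalism of journalists"]),
    Strategy("MA", "UK001", "empower independent media",   # duplicate: discarded
             ["providing higher quality products to local audiences"]),
    Strategy("MI", "SE008", "develop critical thinking", []),  # no means: discarded
])
```

Under these assumptions, only the first candidate survives the filtering and receives the identifier MA001, in line with the example shown in table 5.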

Table 5

Example of a Macro Strategy

ID | Source | Element | Meaning Units
MA001 | UK001 | S | “empower independent media”
| | P | “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”


Any supplementary publications (e.g. an attached PDF report) found within a piece of content that revealed a strategy were given the same unique three-digit identification as their parent content holder. Content that bore no strategies or produced duplicate strategies was discarded and omitted from the classification process. Finally, all content that produced at least one strategy can be found in the references chapter under ‘data collection’.

4.4. Analytical Discussion

Upon completion of the data collection process, the analysis progressed with the development of subcategories—a core feature of any qualitative content analysis (Graneheim & Lundman, 2004, p. 107). Subcategories were determined via pattern discovery between strategies that share a high degree of commonality (Krippendorff, 2004, p. 50). Krippendorff notes that these categories must be mutually exclusive of one another, which is what enables the data of a content analysis to be “informative” (Krippendorff, 2004, p. 155). Likewise, since the comparative aspect of this study sought to understand the differences between the actors under investigation, the use of mutually exclusive categories provides the logic that substantiates any differences between the UK and Sweden’s resilience strategies, should they arise.

4.5. Reliability

Reliability generally refers to the consistency of a particular research tool, procedure or approach in different circumstances, granting that nothing else has changed (Roberts et al., 2006, p. 41). With respect to the qualitative research approach undertaken in this study, reliability refers to the consistency of the researcher’s analytical procedures (Noble & Smith, 2015, p. 34). Since the researcher is acting as the instrument, would the chosen procedures of this study provide consistent results if different researchers applied the same procedures in different research settings? One way to have mitigated this risk would have been to perform an intercoder reliability test to ensure that the coding procedures would provide consistent results across different researchers (Chambliss & Schutt, 2019, p. 98). However, this study was unable to perform these diligence checks on account of the time constraints surrounding the research. Therefore, this study acknowledges the diminished reliability of the implemented approach.


5. Analysis

From the 472 pieces of content examined in this analysis (241 items from gov.uk and 231 items from regeringen.se), 43 British and 17 Swedish pieces of content yielded 74 strategies admissible under this study’s analytical framework.

5.1. Micro Strategies

From the 74 strategies discovered in this study, 15 micro strategies were identified that sought to strengthen the individual’s ability to evaluate the information they are exposed to—eight via British sources and seven via Swedish sources. From these 15 micro strategies, two distinct sub-strategies were conceived in accordance with the commonalities shared between them.

5.1.1. Media and Information Literacy

The first sub-strategy derived from the collection of micro strategies is the media and information literacy sub-strategy. This type of micro strategy seeks to strengthen an individual’s critical thinking skills, with a particular emphasis on improving their competence in navigating the digital environment. Educational practices to evaluate, use and create information are advocated to fulfil this aim within this type of strategy. Both the UK and Sweden practised this form of sub-strategy, which was identified on five occasions via UK sources and on five occasions via Swedish sources.

5.1.1.1. United Kingdom Results:

Table 6

British Media and Information Literacy sub-strategies

ID Source Element Meaning Units

MI001 UK006 L “help [individuals] think critically about things they might come across online, like disinformation”

N “online media literacy strategy”

MI002 UK014 L “help increase user awareness of, and resilience to, disinformation and misinformation online”

N “The [online harms] regulatory framework will build on Ofcom’s existing duties to promote media literacy”


MI003 UK014 L “enabling people to critically assess, appraise and challenge information online”

N “The forthcoming online media literacy strategy will set out more action to improve and strengthen audience resilience”

MI004 UK016 L “develop people’s knowledge and confidence in navigating the online world and the information promulgated through it”

N “promoting work already in train across library services, and [the DCMS Libraries Team's] important role in education”

MI005 UK032 L “enabling Ukrainian youth to better discern fact from fiction in the media and social media space and to make informed decisions as to which information they consume, share and produce”

N “UK programme assistance to Ukraine”

N “media literacy and critical thinking for secondary schools”

5.1.1.2. Sweden Results:

Table 7

Swedish Media and Information Literacy sub-strategies

ID Source Element Meaning Units

MI006 SE006 L “A strong knowledge base can also contribute to science-based policy and practice and build resilience to disinformation”

N “The Government therefore considers that scientific publications, which are the result of research funded by public funds, shall be immediately available with effect from 2021”

MI007 SE008 L “develop critical thinking”

N “The school is the social institution that has the task of systematically and over time imparting knowledge and values to all children and young people in Sweden”

N “In the case of pre-school, compulsory school and upper secondary school, parts of the education can strengthen children and young people’s information literacy and resilience against disinformation, propaganda and online hate”


MI008 SE008 L “increase awareness of disinformation, propaganda and online hate and to spread knowledge about media and information literacy and other resistance-building methods to as many people as possible”

N “an important starting point for the implementation of the external work has been to cooperate with and meet actors who pass on knowledge to others, so-called intermediaries”

N “Intermediaries can be organisations or individuals who, in their business, have the potential to reach out to many people”

MI009 SE011 L “promote more digital competence within the general public, including in MIK issues”

N “The Royal Library has received SEK 25 million annually 2018–2020 for an investment called ‘Digital first’”

MI010 SE013 L “strengthen digital competence and media and information literacy”

N “The Government instructs the Swedish Media Council to develop frameworks for a strengthened collaboration of media and information literacy (MIK) initiatives as of the 1st of October 2018”

5.1.2. Digital Educational Material

The second and final sub-strategy derived from the collection of micro strategies is the digital educational material sub-strategy. This type of micro strategy aims to strengthen an individual’s critical thinking skills, with a particular emphasis on raising disinformation awareness. Simple online checklists, governmental guidance publications and digital classroom material are the methods generally employed to achieve this aim within this type of strategy. Both the UK and Sweden practised this form of sub-strategy, which was identified on three occasions via UK sources and on two occasions via Swedish sources.

5.1.2.1. United Kingdom Results:

Table 8

British Digital Educational Material sub-strategies

ID Source Element Meaning Units


N “use the SHARE checklist to make sure you’re not contributing to the spread of harmful content: Source - make sure information comes from a trusted source Headline - always read beyond the headline Analyse - check the facts Retouched - does the image or video look as though it has been doctored? Error - look out for bad grammar and spelling”

MI012 UK035 L “identify, assess and respond to disinformation”

N “the ‘RESIST’ toolkit, which enables organisations to develop a strategic counter-disinformation capability”

MI013 UK035 L “increase the audience’s ability to spot disinformation”

N “providing them with straightforward advice to help them check whether content is likely to be false or intentionally misleading”

N “behaviour change campaign [S.H.A.R.E]”

5.1.2.2. Sweden Results:

Table 9

Swedish Digital Educational Material sub-strategies

ID Source Element Meaning Units

MI014 SE007 L “stimulates critical thinking and source criticism”

N “The Living History Forum has produced the digital classroom material ‘Propaganda - Risk of Influence’, which explains the mechanisms of propaganda”

MI015 SE016 L “to distinguish facts and independent reporting from fake news and disinformation”

N “The government therefore decided earlier this year on strengthened digital competence in both curricula and degree objectives and individual course and subject plans”

N “The State Media Council has, amongst other things, produced and developed the digital education material ‘MIK for me’ and educational material about propaganda and the power of images, for children and young people”


5.2. Macro Strategies

From the 74 strategies discovered in this study, 59 macro strategies were identified that sought to reduce societal disinformation exposure—46 via British sources and 13 via Swedish sources. From these 59 macro strategies, seven distinct sub-strategies were conceived in accordance with the commonalities shared between them.

5.2.1. Independent Journalism and Media

The first sub-strategy derived from the collection of macro strategies is the independent journalism and media sub-strategy. This type of macro strategy aims to create a pluralistic media landscape in and around a disinformation threat actor. In the case of the UK and Sweden, both states sought to cultivate a pluralistic media landscape in and around Russia’s near abroad, on account of both states recognising Russia as the biggest threat actor in the dissemination of disinformation (Swedish Government, 2017; UK Government, 2021, p. 75). Policies aiming to create, support and promote independent journalism and media are typically employed to facilitate this end. Both the UK and Sweden practised this form of sub-strategy, which was identified on 14 occasions via UK sources and on two occasions via Swedish sources.

5.2.1.1. United Kingdom Results:

Table 10

British Independent Journalism and Media sub-strategies

ID Source Element Meaning Units

MA001 UK002 S “empower independent media”

P “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”

MA002 UK003 S “to support independent media, especially in Russia’s near abroad”

P “the Counter Disinformation and Media Development programme will fund initiatives to understand and expose the disinformation threat”

MA003 UK018 S “increasing capacity and professionalism of [Kyrgyzstan’s] journalists”

P “the British Embassy Bishkek is looking to support two projects for activity before March 2021”


P “The UK has announced it is doubling its support to independent media, human rights organisations and community groups in Belarus, with an extra £1.5m for projects over the next two years”

MA005 UK021 S “enhance knowledge and understanding of emerging foreign and security policy issues among Czech communities through activities to support quality independent journalism on [hybrid threats, including countering disinformation, promoting media literacy and cyber security]”

P “The Prague Programme Fund is a small and short-term funding mechanism, which allows British Embassy Prague to support local organisations seeking to deliver real and measurable outcomes”

MA006 UK022 S “Egypt: a project training journalists on countering disinformation and fake news”

P “The UK ran a major international campaign in 2019 on Media Freedom”

MA007 UK031 S “empower independent media [in the Eastern Europe and Central Asia (EECA) region]”

P “building capacity and raising the professionalism of journalists, as well as providing higher quality products to local audiences, helping counter disinformation”

MA008 UK033 S “counter disinformation across Eastern Europe and strengthen independent media in the Western Balkans”

P “£18 million over 3 years will support freedom of expression and strengthen independent media”

P “The funding from the Conflict, Stability and Security Fund (CSSF) will support freedom of expression and independent local voices in the Western Balkans to boost the creation of balanced, non-biased content”

P “The funding for Eastern Europe and Central Asia is part of a £100 million, 5-year commitment to counter disinformation and support independent media”

MA009 UK034 S “support independent media in Ukraine”

P “£9 million project which will strengthen societal resilience to disinformation and help increase Government accountability by developing independent sources of information in Ukraine and across the Eastern Partnership countries”

MA010 UK037 S “building the capacity of independent media outlets in Ukraine to hold power accountable and enable more informed and active citizens”

P “Support to National Anti-corruption Bureau of Ukraine with development of an information security management system (ISMS) strategy”


MA011 UK037 S “support peace-building efforts by helping establish an independent media space for people in the region to engage with decision makers and civil society”

P “supports the civic radio broadcaster, Hromadske Radio, to provide unbiased, factual reporting and news to the east of Ukraine (including the conflict affected areas), where access to independent media is limited and Russian disinformation is readily available”

MA012 UK038 S “build resilience to Russian disinformation and build plurality and balance across media landscapes”

P “joint actions [UK and Poland] aimed at supporting independent media in Eastern Partnership countries”

MA013 UK039 S “countering Russian disinformation”

P “more investment in public service and independent media operating in the Russian language, both through projects in the Baltic States, Ukraine, Moldova and Georgia”

P “And through reinvigorating the BBC Russia Service as an independent source of news for Russian speakers”

MA014 UK041 S “support trust- and peace-building efforts by offering an alternative media space for the people in the region [Donetsk and Luhansk regions] to engage with decision makers and civil society in the news and views format”

P “Funding: £350,000”

P “Supports a newly established civic radio broadcaster in becoming an independent and trustworthy information provider to the region, where trust in the central government is low, access to independent media limited and impact of Russian disinformation significant”

P “funding supports installation of 16 FM transmitters in Ukraine-controlled areas along the contact line and “grey zone” in the east, reaching an estimated audience of 2 million people”

5.2.1.2. Sweden Results:

Table 11

Swedish Independent Journalism and Media sub-strategies

ID Source Element Meaning Units

MA015 SE010 S “A pluralistic media landscape [in Russia]”

P “support for independent media in a broad sense, both traditional and new media”


P “In this area, cooperation with the EU and the European External Action Service 'East StratCom Task Force' is important”

MA016 SE017 S “Support free and independent media in the Baltic countries, Ukraine and in the Eastern Partnership”

P “Through the Swedish Institute and the Nordic Council of Ministers, for example, we have supported the new independent Russian-speaking public service channel in Estonia”

5.2.2. Strategic Communication

The second sub-strategy derived from the collection of macro strategies is the strategic communication sub-strategy. This type of macro strategy aims to shape the information environment in order to command the strategic narrative. Fact-checking, targeted messaging, communicative capacity building and information dissemination were the policy approaches commonly favoured to achieve this end. Both the UK and Sweden practised this form of sub-strategy, which was identified on 17 occasions via UK sources and on five occasions via Swedish sources.

5.2.2.1. United Kingdom Results:

Table 12

British Strategic Communication sub-strategies

ID Source Element Meaning Units

MA017 UK001 S “tackle harmful disinformation and inaccurate reporting around the world”

P “£8 million of funding for BBC World Service”

MA018 UK003 S “improve our response to disinformation campaigns”

P “Investment in the Government’s behavioural science expertise, horizon scanning and strategic communications”

MA019 UK004 S “tackle mis- and disinformation among ethnic minorities”

P “the government is regularly producing myth-busting content and utilising trusted platforms and messengers within communities and taking specific targeting approaches on social media channels (such as Facebook and Instagram which allows for better targeting)”

P “We are also using native language publisher sites and targeting specific media outlets (Asian Voice, Leader, The Nation, JC and Desi Express) as part of ongoing partnership work”


MA020 UK005 S “mythbust false information about COVID-19 and the vaccine”

P “A cross-government Counter Disinformation Unit”

MA021 UK007 S “address vaccine disinformation”

P “engagement at local level via trusted religious and community leaders, sharing examples of what is known to work well in nearby areas, and encouraging community-led efforts to address vaccine disinformation”

MA022 UK008 S “dispel any vaccine myths and disinformation”

P “established a network of 'Community Champions'”

P “the champions' role developed to become 'vaccine champions' to ensure as many residents as possible are vaccinated, whilst at the same time helping dispel any vaccine myths and disinformation”

P “By the end of 2020, there were 600 community champions”

MA023 UK011 S “countering vaccine disinformation”

P “transparency, openness and proactive and positive communications”

MA024 UK020 S “tackle disinformation about this revolutionary mobile technology [5G]”

P “new guidance on the safety and benefits of 5G so councils can give people the facts”

MA025 UK022 S “reduce the impact of Russian disinformation across wider Europe”

P “From 2017/18 until 2019/20, we have spent £62 million, to reduce the impact of Russian disinformation across wider Europe, through our Counter Disinformation and Media Development programme, funded by the Conflict, Stability and Security Fund (CSSF)”

MA026 UK023 S “combatting a range of harmful online narratives”

P “A small team from the Ministry of Defence, including members of 77 Brigade, is supporting the Cabinet Office’s Rapid Response Unit in its efforts to tackle disinformation”

MA027 UK023 S “combat the spread of harmful, false and misleading narratives”

P “deploying two British Army experts to NATO’s new COVID-19 Communications Hub, where they are helping to lead the fight against disinformation”
