
CHECKPOINT

A case study of a verification project during the 2019 Indian election

By: Linus Svensson

Supervisor: Walid Al-Saqaf

Södertörn University | School of Social Sciences
Bachelor’s essay, 15 credits
Journalism and Multimedia | Spring semester 2019


Abstract

This thesis examines the Checkpoint research project and verification initiative that was introduced to address misinformation in private messaging applications during the 2019 Indian general election.

Over two months, throughout the seven phases of the election, a team of analysts verified election-related misinformation spread on the closed messaging network WhatsApp. Building on new automated technology, the project introduced a WhatsApp tipline that allowed users of the application to submit content to a team of analysts, who verified user-generated content in an unprecedented way. The thesis presents a detailed ethnographic account of the implementation of the verification project. Ethnographic fieldwork has been combined with a series of semi-structured interviews in which analysts underline the challenges they faced throughout the project.

Among the challenges, this study found that India’s legal framework limited the scope of the project, so that the organisers had to change their approach from an editorial project to a research-based one. Another problem concerned the methodology of verification. Analysts perceived the use of online verification tools as a limiting factor when verifying content, as they experienced a need for more traditional journalistic verification methods. Technology was also a limiting factor. The tipline was quickly flooded with verification requests, the majority of which were unverifiable, and the team had to sort the queries manually. Existing technology, such as image-match checking, could be implemented further to deal more efficiently with multiple queries in future projects.

Keywords: verification, collaboration, fact-checking, misinformation, India


This study was made possible by funding from the Swedish International Development Cooperation Agency, SIDA, through the Minor Field Studies program.


Table of Contents

1 Introduction ... 1

1.1 Purpose of study ... 2

2 Background ... 4

2.1 The ‘WhatsApp murders’ ... 4

2.2 Internet penetration and connectivity in India ... 5

2.3 Political propaganda and disinformation ... 6

2.4 Response to the misinformation epidemic ... 7

3 Theoretical Framework & Literature overview ... 9

3.1 Journalism as a discipline of verification ... 9

3.2 The fact-checking movement ... 12

3.2.1 Terminology around fake news ... 13

3.3 The Indian context ... 14

3.3.1 Motivation for spreading misinformation ... 14

4 Methodology ... 16

4.1 Participant observation ... 16

4.1.1 A regular day ... 17

4.2 Semi-structured interviews ... 18

5 Findings and Discussion ... 20

5.1 Stakeholders ... 20

5.1.1 Pop-Up Newsroom ... 20

5.1.2 PROTO ... 21

5.2 Laying the ground for Checkpoint ... 22

5.3 The Checkpoint team ... 24

5.4 Launching Checkpoint ... 25

5.5 The verification procedure ... 26

5.6 Crowdsourcing messages from the WhatsApp tipline ... 27

5.7 Sorting user requests ... 30

5.7.1 Deciding what to verify ... 30

5.8 Monitoring social media ... 34

5.9 Methodology of verification ... 35

5.9.1 Use of official sources ... 35

5.9.2 A deviation from methodology ... 45

5.9.3 Setting a verdict ... 47

5.10 Evaluation ... 49

5.10.1 A gradually improved verification process ... 51

5.10.2 Limitations of online verification tools ... 52

5.10.3 Lack of clarity in the research process ... 54

5.10.4 Role of Facebook – too little too late? ... 56


6 Conclusion ... 58

References ... 60


Table of Figures

Figure 1. Screenshot of a tweet received through the tipline (Check). ... 30

Figure 2. A meme received through the tipline (Check). ... 31

Figure 3. Screenshot of a manipulated image received via the tipline. The text “NaMo again!” has been added to the boy’s t-shirt (Check). ... 37

Figure 4. Screenshot of the Check verification task list. Analysts followed the task list and checked each box upon completion of the verification step (Check). ... 38

Figure 5. A screenshot of a tweet received via the tipline. The tweet could be traced to Narendra Modi’s official Twitter handle and proved to be authentic (Check). ... 39

Figure 6. A manipulated image depicting candidate Kanhaiya Kumar (Communist Party of India, CPI) as standing in front of a distorted map (Check). ... 42

Figure 7. Screenshot of a Facebook post. In the meme, it is argued that the Gandhi family enriched themselves whilst the ISRO was being underfunded. ... 43

Figure 8. A photo of Abhinandan’s doppelganger (Check). ... 47


1 Introduction

In November 2018, ahead of the 2019 general election, fact-checkers and journalists from across the industry met in New Delhi to attend a workshop seeking to define some of the key challenges that information disorder poses to the industry and to society at large. The workshop was organised by Pop-Up Newsroom, an organisation founded by media innovators Dig Deeper Media and Meedan, and hosted by civic media start-up Proto. Participants reached a consensus that rumours and misinformation spread on encrypted platforms1, such as the messaging network WhatsApp (acquired by Facebook in 2014), are among the biggest challenges faced by fact-checkers and journalists alike and a serious threat to Indian democracy. Participants discussed how a collaborative project could address this challenge (Bell, F., personal communication, May 23, 2019).

The workshop resulted in the Checkpoint research project, commissioned by Facebook. Proto, a partner of the International Center for Journalists, ran the operation on the ground from its office in New Delhi. The organisational framework was designed by Dig Deeper Media.

Checkpoint sought to map the misinformation ecosystem on encrypted platforms and to identify election-related misinformation patterns. For this purpose, it introduced a WhatsApp tipline, building on new automated technology, which allowed a team of analysts to gather and verify user-generated content in an unprecedented way. This was made possible thanks to technological assistance from Meedan and WhatsApp (Proto, 2019).

Misinformation would be crowdsourced from regular WhatsApp users, who were encouraged to submit “suspicious” claims they encountered on the private messaging app. Beyond just collecting data, Checkpoint analysts were to verify these claims and send verification reports back to users (ibid.).

Over the past few years, Dig Deeper Media and Meedan have organised a series of so-called Pop-Up Newsrooms – temporary, collaborative reporting initiatives, often focused on fact-checking – in countries all over the world (see Electionland, 2016; Martínez-Carrillo & Tamul, 2019; WAN-IFRA, 2019). The Pop-Up Newsroom concept can be summarised under the slogan ‘innovation through collaboration’. By building joint projects involving actors from the media industry and beyond, they hope to generate insights and find solutions to the key challenges that the media industry faces today (see Pop-Up Newsroom, n.d.).

The phenomenon could be seen in the light of a rising global fact-checking movement, one that “widens boundaries of fact-checking as journalistic practice” (Graves 2018, p. 617) by transcending national borders and different disciplinary fields such as civil society, academia and the technology sector.

Although Checkpoint was not a pure fact-checking initiative like previous Pop-Up Newsrooms, it still dealt with a core aspect of fact-checking: the discipline of verification.

The project was also designed based on workflows, technology and key insights from previous projects. It thus carried some of the significant traits of the Pop-Up Newsroom concept, adjusted to the Indian context.

1.1 Purpose of study

This study examines how the Checkpoint project crowdsourced and verified user-generated content from WhatsApp during the 2019 Indian general election. At a time when user-generated content has become an integral part of journalism, new demands are raised on verification, as exemplified by the BBC’s UGC Hub (see BBC, 2017).

Verification is a central task in fact-checking and journalism but, as we shall see, it is not equivalent to fact-checking. The study examines the methodology of verification as adopted by Checkpoint, and how it was implemented during the verification effort.

The study also seeks to examine a trend of international collaborative media projects led by Pop-Up Newsroom. Checkpoint serves as a case study for understanding how the pop-up concept travels across borders and adjusts to unique circumstances, in this case the context of the Indian election. I thereby strive to answer Graves’ (2018) call for more research on how “institutional ties beyond journalism” affect practice (p. 627).

Furthermore, I hope to shed light on a notable gap in the research on fact-checking and misinformation in India. Previous studies have examined political fact-checking processes and misinformation primarily in an American context (see Graves, 2013), but there is a lack of research focusing on the Indian subcontinent.

For these purposes, the study poses the following research questions:

RQ1: How was the Checkpoint project implemented to tackle mis/disinformation during the 2019 Indian elections?

RQ2: What obstacles and challenges did the project face during its implementation?

RQ3: How did the team members perceive the successes and failures of the project?

RQ1 seeks to lay the foundation of this thesis by presenting how the frameworks and workflows were implemented in the project and by examining how analysts crowdsourced and verified user-generated content from the WhatsApp tipline. RQ2 examines the challenges its stakeholders faced while implementing the project. RQ3 seeks to evaluate the project by giving emphasis to the experiences of the involved team members.

The project went on for four months, spanning the whole election in two phases. First, the data collection phase sought to collect crowdsourced data from the official WhatsApp tipline. I will also refer to this phase as the verification phase, since the verification effort ran simultaneously. The verification phase is the focus of this study, which builds on some 300 hours of ethnographic fieldwork in the workplace combined with semi-structured interviews with the team members of the Checkpoint project.

The post-election data analysis phase saw analysts from the team conducting a content analysis of the amassed data. This subsequent phase is out of scope for this study, as I was not present during that time. The findings of the Checkpoint team will be published by the International Center for Journalists in a separate report, independent of this thesis.


2 Background

This chapter illustrates the impact of information disorder in India. It examines the technological context, in which recent years’ developments have created the conditions for a thriving misinformation ecosystem where WhatsApp has become an important communication tool and carrier of misinformation. It also examines how political parties have contributed to that ecosystem. Lastly, I present an overview of some of the measures that different stakeholders have taken to contain the spread of misinformation. It shows that Facebook has taken a more proactive stance in its fight against misinformation, with the Checkpoint project being only one of several responses that Facebook has initiated.

2.1 The ‘WhatsApp murders’

On July 13, 2018 Mohammad Salman and his friend Mohammad Azam were attacked and killed by a lynch mob in a small village in Karnataka. The mob claimed that the two were part of a child abduction ring. Mr Salman barely escaped and survived the beatings, albeit with severe injuries. He last saw his friend, Mr Azam, dragged away by the mob with a noose around his neck. Mr Azam later died from his injuries, according to media reports (Satish, 2018).

The mob attacked the two men after rumours, sparked by a viral video, had circulated in local WhatsApp groups. In the video, two men on a motorcycle can be seen abducting a child on a street. The video warned Indians of a child abduction ring operating in the country, with the intent to kidnap children and harvest their organs. However, the video proved to be fake. Not only was the video shot in neighbouring Pakistan – the sequence had in fact been cut out of a Pakistani kidnap awareness video (Elliott, 2018).

Still, the video and its resulting rumours gained traction all over the country, resulting in a series of attacks on innocent victims. The incidents linked to the child abduction rumours form part of the notorious ‘WhatsApp murders’, as dubbed by some media outlets, in which at least 33 people were reportedly killed by lynch mobs as a result of misinformation spread on the platform between January 2017 and July 2018 (IndiaSpend, 2018; Chaudhuri & Jha, 2019; Safi, 2018).


2.2 Internet penetration and connectivity in India

India, with its 390 million internet users, has the second-largest online population after China. Although internet penetration in the country is low – some 30 per cent of the Indian population is connected to the internet – it is increasing rapidly. In 2015, the number of connected users grew by 40 per cent to 277 million people, up from the previous year’s growth rate of 33 per cent (Kaur & Nair, 2018).

This development is largely due to decreasing mobile data rates and the greater availability of affordable smartphones. The entry of Indian telecom firm Reliance Jio into the Internet service provider market resulted in lower prices and affordable data plans (Kaur & Nair, p. 2). As of 2019, India offers mobile data at the cheapest rate in the world (Cable, 2019).

With some 430 million smartphone users, India is the second-largest market for smartphones, after China (Livemint, 2019).

From August 2013 to February 2017, the number of users on the messaging platform WhatsApp rose from 30 million to 200 million (Statista, 2019), making India the platform’s biggest global market (Iyengar, 2019). An annual report published by the Reuters Institute for the Study of Journalism suggests that a majority of Indians consume news on their smartphones, as claimed by 68 % of its respondents. The report revealed that WhatsApp is the biggest platform in India, used by 82 % of respondents, while 52 % said they got news from the messaging application (Aneez et al., 2019).

WhatsApp, like other social media networks, has made it easier for people to share news and information with each other. It also facilitates consuming and creating multimedia content, which is particularly effective in a country like India where the literacy level is relatively low. The wide use of groups within the app, paired with the forward function, which allows users to spread information at the click of a button, makes WhatsApp a “potent medium for reaching out to masses” (Farooq, p. 107).

The debate remains unsettled among scholars as to whether the technological development and the surge of social media have enhanced political participation. Some scholars argue that the technological development enhanced online mobilization around […] “voters continued to participate online, while poorer and less educated citizens were unable to participate effectively due to limited knowledge and technological access” (Chadha & Guha 2016, p. 4390). Yet, the rise of Internet connectivity has prompted the political parties to change their approach to communicating with the electorate (see Chadha & Guha).

2.3 Political propaganda and disinformation

The social media wings of the political parties, more commonly referred to as ‘it-cells’, have embraced social media as a tool for political campaigning. The governing Bharatiya Janata Party, or BJP, was an early adopter (see Chadha & Guha, 2015).

The party’s use of social media to spread its political message is often mentioned as a key factor in its success in the 2014 Lok Sabha elections2, when turnout reached 66.4 % of registered voters and the BJP became the first party to win an absolute majority in parliament since 1984 (Chadha & Guha, 2015). Using a grass-roots approach, in which voters and volunteers were reached via social media channels, the party saw an “unprecedented involvement of ordinary citizens”, who took to social networks to “engage potential supporters by sharing campaign-related materials such as videos and memes and encouraging them to mobilize others to volunteer and donate as well” (Chadha & Guha 2015).

This led to the creation of “hundreds of small cells” all over India. According to Chopra (2014), their objective was to “pick the news, put up pictures and articles that criticize the ruling Congress party and praise Modi or the BJP. They are the online crusaders who actively counter anti-Modi coverage” (p. 56).

In their interviews with party volunteers, Chadha & Guha (2014) found that ready-made campaign material was distributed from the top to the grass-roots level. The material consisted of “a variety of images, posters, charts, and infographics that highlighted successes in BJP-ruled states” (p. 4399). Many of the memes and hashtags that were shared by volunteers were also mandated from the top level, such as the trending hashtag #AbkibaarModiSarkaar (“this time a Modi government”). The interviewees expressed that they were instructed to actively avoid “polarizing issues such as religion” (p. 4400).

2 The Indian general elections.

However, media reports suggest that disinformation often originates from the it-cells. According to Bloomberg (2018), 300 workers were hired by the BJP it-cell to “inflame sectarian differences, malign the Muslim minority, and portray Modi as saviour of the Hindus”. Another report, published by Newslaundry, claims that BJP it-cell workers in Uttar Pradesh, India’s most populous state, were mandated to spread propagandistic or factually incorrect messages in WhatsApp groups to woo voters during the 2017 Legislative Assembly election (Bhardwaj 2017).

Due to a lack of transparency, it is difficult to hold party officials liable for disinformation spread on social media networks and closed messaging applications. As Campbell-Smith & Bradshaw (2019) put it, “relying on volunteers and paid workers allows the blurring of boundaries between campaigning, trolling and propaganda” (p. 5). This makes it hard to distinguish disinformation spread by unpaid volunteers, acting on their own mandate, from that spread by workers hired by party it-cells.

At times, misinformation on social networks has seeped through verification filters at mainstream media outlets. The terrorist attack by the Pakistan-based terrorist organisation Jaish-e-Muhammad in Kashmir, in which 40 Indian soldiers were killed, triggered a wave of online disinformation. Mainstream channels in India and Pakistan published news stories that amplified rumours and misinformation about the attack (Campbell-Smith & Bradshaw 2019, p. 1). In 2017, fact-checker Alt News identified a number of “fake news stories” that were published by reputable news outlets such as Zee News, India Today and The Hindu (Jawed 2018).

2.4 Response to the misinformation epidemic

In December 2016, Facebook launched its fact-checking program. Independent fact-checking partners, verified through the International Fact-Checking Network (IFCN), fact-check and rate posts on the platform3 submitted by users. After fact-checkers have rated a post as false, Facebook places it lower in the news feed, reducing future views by over 80 % on average. Pages that frequently distribute content rated as false by partners get their distribution reduced on the platform (Lyons, 2018).

In February 2019, ahead of the Indian general elections, Facebook announced that it was expanding the fact-checking program in the country, adding five more partners to the network. Fact-checkers such as India Today Group, Factly and Fact Crescendo joined the list of partners (PTI, 2019), increasing their number to a total of eight organisations (Facebook, n.d.).

On April 1, 2019, Facebook took down 687 pages and accounts for engaging in “coordinated inauthentic behavior” on the platform. The pages and accounts were linked to individuals associated with an Indian National Congress4, INC, it-cell. From August 2014 until March 2019, the accounts had spent a total of 39,000 US dollars on Facebook ads (Gleicher 2019).

Another 15 pages, linked to the Indian IT firm Silver Touch, were taken down. Silver Touch has been associated with the BJP, for whom it developed the NaMo app, an app featuring pro-BJP news (Patel & Chaudhuri 2019). The pages spent a total of 70,000 US dollars on ads from June 2014 to February 2019.

WhatsApp has been pressured by the Indian government to counter the spread of misinformation on its platform. In July 2018, the IT Ministry issued a statement containing a stern warning: “If they [WhatsApp] remain mute spectators they are liable to be treated as abettors and thereafter face consequent legal action” (PIB, 2018).

WhatsApp has since introduced new features on its platform, such as limiting forwarding to five chats per forwarded message and labelling such messages with a “Forwarded” tag (WhatsApp, 2018a; WhatsApp, 2018b). In August 2019, it presented the “Frequently Forwarded” label to alert its Indian users to messages that have been forwarded five or more times (Carlsen, 2019).

3 In August 2019, Facebook expanded its fact-checking program in the US to cover Instagram for its American audience (Tardáguila, 2019).

4 Indian National Congress is the political party that has governed the Indian republic for most of its history.


The Indian government itself has taken measures to curb misinformation spread on social media, imposing Internet shutdowns in affected areas. According to a report by Freedom House (2018), the country “leads the world in the number of internet shutdowns, with over 100 reported incidents in 2018 alone.” The report concludes that this strategy is a “blunt instrument”, as it interrupts not only the spread of disinformation but also the use of regular online services (Shahbaz 2018). Anecdotal evidence also suggests that the spread of misinformation continues in spite of internet shutdowns (Funke et al., 2019).

Legal measures have also been taken. The controversial Section 66A of the Information Technology Act criminalised the distribution of “offensive content” online but was deemed unconstitutional by the Supreme Court in March 2015. Still, several people have since been arrested and charged under Section 66A (Johari 2019). On May 9, 2019, BJP worker Priyanka Sharma was arrested after she shared a political meme on Facebook targeting West Bengal chief minister Mamata Banerjee. The charge was later dropped, and the Supreme Court ordered the immediate release of Sharma on the condition that she make a public apology (Anand Choudhary 2019). The event sparked a debate about how legislation encroaches on freedom of speech.

3 Theoretical Framework & Literature overview

3.1 Journalism as a discipline of verification

The “correspondence” theory of truth views truth as something that “corresponds to the facts of reality”. Facts, indisputable in their nature, exist outside of systems of value and are not subject to interpretation (David, in Graves, 2017, p. 520). In the nineteenth century, journalists saw themselves as purveyors of truth. They unearthed these facts and presented them to their audiences – news reflected reality. Schudson calls this “naïve empiricism” (Schudson, 2001, in Graves, 2017). Kovach & Rosenstiel note a similar school of thought among journalists: the concept of realism. Realism is the perception that truth is graspable in the form of facts – facts that speak for themselves, and by simply collecting and presenting them, journalists could purvey the truth to their audience. In the first half of the twentieth century, journalists began to worry about the naivete of realism, as they developed a greater […] prejudices. The influential American journalist Walter Lippmann called for a new method, in line with “the scientific spirit”, which did not ignore human subjectivity but used certain mechanisms to minimize it and in such a way get at the truth. This laid the ground for the modern objectivity ideal: “The call for objectivity was an appeal for journalists to develop a consistent method of testing information – a transparent approach to evidence – precisely so that personal and cultural biases would not undermine the accuracy of their work” (p. 101).

In a democratic system, the core of journalism is to give citizens the information they need to make informed decisions. Journalism’s first obligation is therefore to the truth, as Kovach & Rosenstiel (2014) write in The Elements of Journalism. The truth-seeking in journalism is what differentiates it from propaganda, entertainment, fiction or art. Kovach & Rosenstiel define this primary function of journalism as a ‘Journalism of Verification’. However, with the rise of the 24/7 news cycle – fuelled by the twenty-first century’s rapid digitalisation, the growth of the Internet and the fragmentation of audiences – factors such as speed and competition have been given precedence over verification.

The development has pushed journalism in other directions. Kovach & Rosenstiel distinguish several veins of journalism that have changed the logic of media production. The authors note a shift from a journalism of verification to a ‘Journalism of Affirmation’. As the digitalised media landscape, revolutionised by the Internet, fragmented audiences, a new type of journalism arose in which audiences were reached through reassurance and “the affirming of preconceptions” (Kovach & Rosenstiel 2014, p. 64). The ‘Journalism of Aggregation’ refers to the new platforms that aggregate content from media outlets without verifying it themselves and, through recommendations or algorithms, make the news readily available for others.

The conception of these new strains of journalism places higher demands on the audience, as “The burden of verification has been passed incrementally from the news deliverer to the consumer” (Kovach & Rosenstiel 2014, p. 65).

Despite these changes, the media commonly claim objectivity by emphasizing their impartiality. This is usually done through the narrative of a “neutral voice”. A story is balanced by including different points of view, and can thus achieve an appearance of fairness due to the sole fact that two sides are presented equally. There are always many sides to a story, but fairness and balance should never be invoked for their own sake or as the goal of journalism, the authors argue (p. 109).

For instance, if there is a consensus among scientists that the effects of global warming are real, it would be a disservice to truthfulness and to the audience if journalists gave equal space to both sides of the debate in the name of impartiality.

Balance is not always a means to get at the truth, but can be used by the media to claim impartiality: “a veneer atop something hollow”.

Years before, Tuchman (1972) noted the same phenomenon. She saw objectivity among ‘newspapermen’ as a strategic ritual to defend their work from public criticism. The practice of objectivity, as claimed by journalists, consists of different procedures. Through the presentation of conflicting possibilities (what Kovach & Rosenstiel call “balancing a story”), multiple statements by differing sides in a conflict are presented. These statements are treated as equally valid truth-claims, although the facts might not have been verified, or perhaps are not verifiable. The ‘newspaperman’ claims objectivity by presenting both sides of the conflict, leaving it to the reader to evaluate both truth-claims.

Another such procedure is the judicious use of quotation marks, whereby the journalist removes his or her presence from the story by citing interviewees or statements from others, telling the story through quotes rather than through the voice of the reporter. In fact, the reporting might still be subject to selection bias, as the journalist may mask his or her own opinion behind citations aligned with his or her sympathies.

Such procedures can at most be said to be tools for obtaining objectivity; they cannot, however, amount to a truly objective practice, according to Tuchman.

Tuchman further elaborates on the objectivity ideal in Making News (1978). Journalism can never truly reflect reality, since journalism cannot be truly objective.

News is a window on the world. Through its frame, Americans learn of themselves and others, of their own institutions, their leaders, and life styles, and those of other nations and their peoples […] But, like any frame that delineates a world, the news frame may be considered problematic. The view from a window depends upon whether the window is large or small, has many panes or few, whether the glass is opaque or clear, whether the window faces a street or a backyard. The unfolding scene also depends upon where one stands, far or near, craning one’s neck to the side, or gazing straight ahead, eyes parallel to the wall in which the window is encased (Tuchman, 1978, p. 1).

3.2 The fact-checking movement

The fact-checking movement emerged as a “reformer’s critique of conventional journalism” (Graves, 2013, p. 127), seeking to “revitalize the ‘truth-seeking’ tradition in the field” (Graves, 2017). Graves (2013), much like Tuchman, saw the problem of journalism using objectivity as a blanket cover. Graves noted that journalists are more concerned with including multiple statements from differing parties than with actually verifying those statements. He refers to this as “he said, she said” reporting.

Fact-checking as a practice first emerged in the U.S. during the early nineties, with newspapers fact-checking deceptive advertisements in presidential races (pp. 130-131). But it was not until the beginning of the new millennium that dedicated fact-checking entities emerged. In 2003, FactCheck.org was launched, followed by PolitiFact and the Washington Post’s Fact Checker column in 2007.

Fact-checking should be seen as “a practical truth-seeking endeavor” (Graves, 2017, p. 523). It is defined by Graves as the practice of “assessing the truth of public claims” made by public figures, e.g. politicians or pundits (Graves, 2013). Graves wrote his dissertation in 2013, before alternative facts and fake news entered the common vocabulary5. Arguably, Graves’ definition of fact-checking has become less applicable today, as it does not reflect the challenges that fact-checkers face when misinformation and disinformation spread on social networks. Neither does it fully reflect the reality of practice in today’s fact-checking movement. For instance, Facebook’s fact-checking program exclusively targets disinformation and misinformation spread on its social platforms (see Facebook, n.d.). As the misinformation ecosystem evolves, and new efforts are introduced to address it, more research is needed.

5 The two terms are problematic. Fake news implies that news can be true or fake, when news by definition has to be factual; if it is not, it is not news but rather dis/misinformation or propaganda. Likewise, the term alternative facts implies that facts are disputable, when by definition the word fact is used to assert indisputability.

3.2.1 Terminology around fake news

“Fake News” was named word of the year by the American Dialect Society in 2017. Ben Zimmer, chair of the American Dialect Society’s New Words Committee, motivated the choice as follows:

When President Trump latched on to fake news early in 2017, he often used it as a rhetorical bludgeon to disparage any news report that he happened to disagree with. That obscured the earlier use of fake news for misinformation or disinformation spread online, as was seen on social media during the 2016 presidential campaign (American Dialect Society, 2018).

Fake news is historically not a new phenomenon, but the term became popularised during the 2016 American presidential campaign. It arose to describe fake news articles spread by illegitimate news sites, disguised as reputable news outlets, with the intent to mislead (see Allcott & Gentzkow, 2017). However, fake news also comes in other formats. In a country like India, disinformation is commonly spread in the shape of memes and messages on the private messaging platform WhatsApp (BBC, 2018).

Fake news is arguably a rather blunt and obscure term for reflecting the reality of disinformation today. This, paired with the fact that its use has been transformed into a “rhetorical bludgeon”, calls for its replacement by more specific terms.

A more useful approach is to define false information by the intent with which it is spread. Throughout this thesis, I will use the terms disinformation and misinformation. The terms have been defined by Dr. Claire Wardle, a research fellow specialised in information disorder, as follows.

Disinformation is false information that is deliberately created or disseminated with the express purpose to cause harm. Producers of disinformation typically have political, financial, psychological, or social motivations.

[…]

Misinformation is information that is false, but not intended to cause harm.

For example, individuals who don’t know a piece of information is false may spread it on social media in an attempt to be helpful. (Wardle, 2018).

3.3 The Indian context

Despite the emergent misinformation situation in India, and a growing number of fact-checking initiatives, there is a gap in research examining this context. The Indian context imposes new challenges, unknown to the American tradition of fact-checking, such as dealing with content in a wide array of languages. Other notable differences in the misinformation landscape are the relative absence of textual misinformation and the prevalence of visual misinformation in the form of memes (BBC 2018, p. 15). The spread of misinformation on the end-to-end encrypted messaging service WhatsApp also poses different challenges and requires a different approach.

3.3.1 Motivation for spreading misinformation

In a report conducted by the BBC, researchers analysed a sample of ‘fake news messages’ spread on WhatsApp and interviewed Indian citizens to find out their reasons for sharing information (and potentially misinformation) on social media networks.

The report found that among the reasons behind sharing behaviour, “sharing as a civic duty” was one of the most important: spreading a message that the sharer deemed to be in the public interest (BBC, 2018, p. 44). The findings align with the results of a survey conducted by Indian fact-checker Factly, in which 48.5 % of the respondents gave “It might benefit others” as their main reason for sharing information (Pratima & Dubbudu, 2019, p. 44).

The massive amount of information that Indians are encountering seems to have blurred the lines between what is traditionally seen as news – information disseminated by newspapers, TV, and radio stations – and other competing sources. The researchers call the phenomenon ‘the digital deluge’ – when different types of information are available in the same space. Traditional news is mixed with news about familiar and personal matters, in the Facebook ‘news feed’ as well as in WhatsApp, where users are often part of several groups dedicated to family members, colleagues and politics (BBC 2018, p. 23). Since “every type of ‘news’ is in the same space, ‘fake news’ too can be hosted there” (p. 40).

They conclude that WhatsApp works in part as an echo chamber, where “usage is about validation of one’s beliefs and identities through the sharing of news and information” in groups closely associated with one’s political, cultural and social beliefs (p. 36).

A sample of ‘fake news messages’ spread on WhatsApp suggested that a majority of the misinformation was not directly political. The researchers found that 36.5 % of the fake news messages consisted of content that could be categorised as “Scares and scams”, while only 22.4 % could be categorised as “Domestic news and politics”. A further 29.9 % of the messages were categorised as “National myths” (BBC 2018, p. 43).

The sample of fake news that the researchers looked at suggested that misinformation among the Right was united in pro-Hindu sentiments. It usually revolved around Hindutva ideology, or Hindu nationalism, anti-minority sentiments directed toward Muslims, and support for prime minister Narendra Modi (pp. 64–72). Among the Left, the fake news messages were not as strongly tied to an agenda, but when they were, they usually disfavoured the ruling Bharatiya Janata Party, BJP, and Narendra Modi (pp. 72–75). In total, the data sample suggested that a larger share of the fake news messages was found among the Right. However, as the researchers point out, other statistical measures would have to be taken to confirm this.


4 Methodology

4.1 Participant observation

This study is based on data that I collected as a participant-observer within the Checkpoint team, drawing upon two months – some 300 hours – of ethnographic fieldwork. As a participant-observer, I have gathered information about workflows and methodology by observing everyday work, unexpected events and informal conversations between team members. These observations have been noted on a daily basis.

Participant observation gives the researcher a unique opportunity to study editorial processes and decisions made in the workplace. Rather than only analysing the output, the researcher gets a “behind the scenes” approach to follow processes and intra-organisational forces behind the resulting output, thus making the “invisible visible”. Furthermore, it offers the researcher a possibility to observe material that never made it to the production, or was later discarded, as well as the discussions that lead up to that decision (Cottle 2009, p. 10).

Nonetheless, participant observation, like every other method, has its downsides. By focusing too much on newsroom practices, the researcher might miss extra-organisational forces such as economic, technological or political pressure and how they affect the work environment (Cottle 2009, p. 13). It is up to the researcher to make a conscious effort to correlate professional practices and organisational tendencies with such extra-organisational forces.

A problematic situation can arise if the participant-observer becomes too involved in the work, leaving the observation behind and becoming a fully engaged participant. As the researcher participates in the work, he runs the risk of influencing the workflow, changing the professional practices in the newsroom and thus compromising the reliability of the study. It is important that the researcher is conscious of how his presence and activities influence the workplace, as well as how they can compromise his role as an observer.

However, shifting to a more participatory stance, when balanced, can be beneficial. In order to understand the workflow and the professional practices, it is often necessary for the researcher to dedicate some time to gaining hands-on experience by doing the same tasks as everyone else. Personal relations with other participants can also work to the advantage of the researcher: he can be seen as one of the team, whereas from a strictly observing approach he can be seen as an outsider.

I entered the Checkpoint team as a participant-observer on the condition that I would help with some tasks where help was needed. I agreed to this arrangement, provided that the Checkpoint leadership would not interfere in my work as a researcher. I did not see this arrangement as compromising my role as a participant-observer; participating in some of the tasks, I found, was absolutely essential. I needed to spend time working on daily tasks in order to get an understanding of the methodology, the tools and the software used. Participating in the everyday work did not mean that I left my role as a researcher behind, as I continuously took notes about my involvement in all tasks.

The leadership proved very understanding of the fact that my primary task at the project was to do independent research, and thus I could balance my time between helping with tasks and conducting interviews or observing as I saw fit.

4.1.1 A regular day

Every day started with a morning meeting, where I would take notes to summarise what was said and by whom. As the day went by, I would walk from desk to desk and ask the team members questions about their tasks. These were informal conversations, in which I enquired about the piece of content they were working with at that particular moment. Sometimes I chose to stay with an analyst as they proceeded with verification. This was done in a subjective manner: whenever I deemed something interesting, I stayed with that person to observe the verification process – what steps were taken to reach a verdict, what decisions were made and what challenges the analyst faced.

Later during the day I would follow up with the team member to see how their work had progressed. Every time an analyst had completed a piece, one of the team leaders – in practice editors – would evaluate the analyst’s work before a verification report card was sent out to the original user. The analyst and the editor would have a short conversation, and if the editor thought that the verification report needed changes or additional information, the analyst would make these according to the instructions from the editor, who had the final say.


Throughout the two months that Checkpoint was operating the team received thousands of queries. Because of the massive inflow of user requests, I could not observe each and every item. I would personally, using my own judgement, decide which items were of interest for my research and select them accordingly.

By the end of each day, I would review my notes and add personal reflections. These reflections touched on any matter of interest and were intended for use in the analysis and discussion of this thesis.

4.2 Semi-structured interviews

To complement the ethnographic fieldwork, I have conducted a series of semi-structured interviews, fifteen in total. Nearly all team members have been interviewed, including analysts, team leaders and the founders of PROTO. I have also interviewed Fergus Bell, a consultant from Dig Deeper Media who helped design the framework of the project. Two team members were not interviewed: one was an intern who joined the project later in the process, and the other because a language barrier prevented a meaningful dialogue.

The interviews were conducted in English, which was the main language of communication in the work environment, and lasted between 30 and 45 minutes each. The first eight interviews were conducted in April, the first month of the project. As the verification phase came to its end in late May, another seven interviews were conducted. Some of these were follow-up interviews with previous interviewees.

Prior to the interviews, the respondents were informed about the purpose of the study and gave their consent to participate as interviewees – what Brinkman & Kvale (2014) call informed consent. The analysts were offered confidentiality, whereas those in senior positions were not. The latter were offered transcripts of the interviews prior to the publication of this thesis, since their names and the quotes attributed to them would be public. Although none disputed the collected information, they were given the opportunity to do so.


The qualitative interview seeks to understand the world from the point of view of its participants and to draw meaning from their experiences (Kvale & Brinkman, 2014). In this study, interviews were centred around a series of topics ranging from the methodology of verification, evaluation of the project and opinions on misinformation and measures to tackle it, to the roles of the involved stakeholders.

Each interview has been dealt with on a case-by-case basis. Interviews were personalised, and in each case questions have been added or omitted depending on the seniority level or specialisation of the interviewee. Since the interviews were conducted over a period of two months, adjustments have been made over time to correspond with real-time events in the workplace, addressing challenging situations faced by the participants or important decisions that impacted their work.

The semi-structured interviews have been used to triangulate and complement ethnographic observation, seeking to extract information that has not been directly observable in the workplace environment. By cross-referencing observations with interviews, the researcher can also discover discrepancies or continuity between statements made by the interviewees and their observed practices in the work space (Cottle 2009, p. 11).

The scientific utility of qualitative interviews, or lack thereof, has received a fair share of critique in the social sciences. A common objection is that the qualitative interview is not scientific since it reflects a common-sense worldview expressed by the interviewee. It is argued that the interview is subjective rather than objective and builds its result upon the biases of the interviewee. The nature of the interview is personal, since it builds upon relations between the interviewer and the interviewee and requires some degree of flexibility, which in turn compromises the rigorousness of the methodological framework. Studies based on qualitative interviews often draw upon a low number of interviews, rendering results with low generalisability (Brinkman & Kvale, 2014, pp. 210–213).

However, the authors point out that there is no authoritative definition of science according to which the interview can be categorised as scientific or unscientific. Many of the weaknesses attributed to qualitative interviews can rather be seen as strengths in a qualitative study. Interviews give the researcher unique access to the world of the interviewee, and it is precisely the subjective nature of the interview, with its differences in personal perspective, that lets the researcher enhance the qualitative understanding of a certain phenomenon (ibid.).

5 Findings and Discussion

5.1 Stakeholders

Checkpoint was conducted by Proto, an Indian media skilling start-up. The framework of the project was designed by Pop-Up Newsroom, a joint project between Dig Deeper Media and Meedan. Meedan provided technological assistance to set up the tipline and Dig Deeper Media offered consultancy for the local team. Facebook provided funding for the project through its affiliate WhatsApp.

5.1.1 Pop-Up Newsroom

Pop-Up Newsroom was founded in 2017 by Fergus Bell of Dig Deeper Media and Tom Trewinnard of Meedan. It strives to nurture newsroom innovation by initiating collaborative reporting efforts in different countries and contexts, connecting journalists and fact-checkers within the media industry and putting them in the same room as technologists and academics.

A series of such Pop-Up Newsrooms have been conducted in the past – mainly, but not exclusively, focusing on curbing misinformation. These projects include Electionland – a virtual newsroom that covered polling-related issues on election day of the 2016 American presidential election (see Electionland, 2016) – and Verificado, a collaborative fact-checking initiative spanning two months during the Mexican election (see Martínez-Carrillo & Tamul, 2019; WAN-IFRA, 2019). The former involved some 1,100 journalists across the United States and the latter some 100 journalists from 60 media partners (ibid.).

At times, Bell and Trewinnard have organised Pop-Up Newsrooms involving students in similar projects. In September 2018, students from three Swedish journalism schools set up a newsroom – Riksdagsvalet 2018 – seeking to verify misinformation spread on social networks ahead of the election (see Mattsson & Parthasarathi, 2018). I personally took part in this project as an undergraduate student journalist.


From January until August 2019, Bell and Trewinnard were involved in various projects centred around curbing misinformation – from Tsek.ph in the Philippines and CekFakta in Indonesia to the target of this case study, the research project Checkpoint.

Each new project builds on experience and key insights from the last, but allows for adaptation to every unique context.

“The reason we need something like Pop-Up Newsroom is that it allows us to iterate and build on the previous version rather than everyone starting from scratch. And that allows us to innovate faster and to move the journalism industry forward” (Fergus Bell, personal communication, May 23, 2019).

More projects have been planned for the remainder of 2019, such as Reverso, a fact-checking initiative in Argentina, and Election Exchange, which will be rolled out during the US 2020 election campaign (see Reverso, n.d.; Marrelli, 2019).

5.1.2 PROTO

Proto is a civic media start-up that was founded in 2018 by ICFJ Knight fellows Nasr ul Hadi and Ritvvij Parrikh. The Knight Fellowship program, run by the International Center for Journalists (ICFJ), is “designed to instill a culture of news innovation and experimentation worldwide” and, through collaboration with the news industry, to “seed new ideas and services that deepen coverage, expand news delivery and engage citizens, with the ultimate goal to improve people’s lives” (ICFJ). The team in India focuses on reinventing news production and strengthening reporting on areas such as “health, gender and development issues” (ICFJ).

The primary approach for Proto is community-based co-learning. Just like Pop-Up Newsroom, the concept behind Proto is driven by the idea of innovation through collaboration. Nasr ul Hadi and Ritvvij Parrikh believe that by bringing people from the news industry together, they can build meaningful relationships and learn from each other. To this end, they organise weekly meet-ups and bootcamps at their office in New Delhi, centred on pressing issues faced by the industry.

“We are not going to be able to go back to grad school and take a career pick and go and learn stuff that is new and cutting edge. The way to learn is going to come to these peer-to-peer learning environments and showcase each other’s work and learning from hands-on sessions,” ul Hadi said (ul Hadi, N., personal communication, May 30, 2019).

Proto directs its work at what ul Hadi calls three “crises” in media: credibility, adaptability and sustainability. The credibility crisis is defined as the media’s struggle to stay credible in a landscape of information disorder, adaptability is the struggle to keep up with the technological challenges imposed on the industry, and sustainability is about finding sustainable business models as media organisations see their ad revenue decrease (ibid.). Checkpoint was a data-driven project corresponding to the credibility crisis.

5.2 Laying the ground for Checkpoint

In November 2018, Pop-Up Newsroom hosted a workshop together with PROTO at the latter’s premises in New Delhi. The workshop was attended by representatives of different disciplines, from fact-checkers and journalists to technologists and academics. Among the domestic fact-checkers, Factly and Alt News were present; among the journalists were representatives from outlets such as the Times of India, The Indian Express, The Quint and the Deccan Herald (F. Bell, personal communication, 2019, May 23).

The agenda of the workshop was to identify the key challenges that information disorder imposes on the media, and to build a framework for potential solutions to curb the spread of misinformation and prevent its impact on the 2019 Indian Lok Sabha elections.

After having defined a mission statement – much focused on targeting communal rumours mainly spread through WhatsApp – the goal was to seek financial support to set up a joint fact-checking initiative, involving multiple stakeholders in the spirit of previous pop-up newsrooms (ibid.).

However, the original vision of such a collaborative fact-checking effort could not be realised. Under the Foreign Contribution (Regulation) Act of 2010, non-Indian companies are prohibited from funding domestic media organisations or media projects in the country (FCRA, 2010). As Facebook – an American company – came to be the sole funder of the project, there was no way of initiating a media project with an editorial output communicated through broadcasting or other journalistic means and platforms.


Consequently, the resulting outcome departed considerably from what was first envisioned. After months of discussions with Facebook, the involved parties had finally redefined how they could operate a project addressing the problem area as defined during the workshop. The result was Checkpoint – a research project commissioned by Facebook, executed by Proto with technological assistance from Pop-Up Newsroom co-founder Meedan and consultancy on framework design from Dig Deeper Media (PROTO, 2019).

According to Bell, a research project, although a deviation from what was first envisioned, would still “achieve a lot of the same goals” without an editorial output (F. Bell, personal communication, 2019, May 23). Instead of actively fighting misinformation on WhatsApp as a pre-emptive measure, the project would gather unique data to better understand the type of misinformation that spreads in closed messaging networks during the election, generating insights for stakeholders in future projects.

The purpose of the research was to “map the misinformation landscape in India, especially misinformation related to the general election” and to generate “insights on misinformation that will be useful for journalists addressing civic issues in India” (Shalini Joshi, personal communication, April 17, 2019).

For a period of two months, spanning all seven phases of the election, Checkpoint would crowdsource data from its WhatsApp tipline. By encouraging users to share “suspicious” content encountered on the encrypted platform, the team aimed to build a database of misinformation and rumours that would “otherwise not be accessible” due to the encrypted nature of the messaging application (PROTO). By amassing this unique data, the team strived to map out misinformation patterns on WhatsApp.

Anyone could send a verification request to the Checkpoint team – in the form of a link, text, photo or video – and the team would assess the request by verifying the authenticity of the claim or media file. However, verification was a secondary priority, dealt with according to the team’s capacity.


5.3 The Checkpoint team

At the point of peak team capacity, the team consisted of ten members, among them one intern. Eight were analysts, dealing with verification and sorting data. They were led by two team leaders, whose roles were similar to that of an editor.

The team members come from a variety of Indian states from all over the country e.g.

Uttarakhand, West Bengal, Bihar, Kerala, Telangana and Delhi. Most of them have a background in journalism, working for local, regional and national newspapers as well as broadcasters distributing news in English and regional languages. Two team members had a background in media training and research.

The tipline considered requests in five languages: Bengali, English, Hindi, Malayalam and Telugu. English is considered an urban language, spoken mostly in the cities, while the others are regional languages. To deal with multilingual verification requests, staff were hired on the basis of linguistic ability, so that at least one language specialist was assigned to cover queries in each respective language. One language specialist dealt with user requests in Malayalam, another dealt with Telugu, and a third was responsible for Bengali content.

Everyone spoke English, which was also the language used for communication in the workplace. Most could speak, read and write in Hindi – a northern Indian language – although with varying ability, since the analysts came from different regions of India where Hindi is not the main language. Hindi was the second most spoken language in the workspace after English (Author’s field notes, April 15, 2019).

Prior to the launch of the project, the team went through basic training on some of the tools used, such as RevEye Reverse Image Search and InVID for videos. Since the analysts were not very experienced in using online verification tools, “there was a lot of learning that people had to do very quickly” (Joshi, S., personal communication, May 28, 2019).


5.4 Launching Checkpoint

On April 2, 2019, Meedan announced the launch of the Checkpoint tipline (Meedan, 2019).

The announcement was amplified by some of the biggest media organisations in India and abroad (see Ravikumar & Rocha, 2019; Bhargava, 2019; Ganjoo, 2019; Purnell, 2019).

Despite the press release stating the objective of the project as research, it was widely framed as an effort to fight misinformation during the Lok Sabha elections. The media attention soon shifted to critique, and Checkpoint was caught in the media crossfire after several online newspapers decided to inquire into the effectiveness of the tipline by sending verification requests – without receiving any response (see Haskins, 2019; Mac & Dixit, 2019). PROTO answered by issuing an FAQ on its website, underlining that:

The Checkpoint tipline is primarily used to gather data for research, and is not a helpline that will be able to provide a response to every user. The information provided by users helps us understand potential misinformation in a particular message, and when possible, we will send back a message to users (PROTO, 2019).

Subsequently, The Economic Times concluded that the tipline was “of no use when it comes to spot and remove misinformation in the upcoming general elections” (The Economic Times, 2019).

Nasr ul Hadi, founder of PROTO, responded to the critique:

Even though the press announcement clearly said that this was a research project to understand how misinformation works during the elections within closed [messaging] networks, people understood it to basically mean that this is a helpline, if we send something in we will get a response back. That was beyond the scope and the bandwidth of the project (ul Hadi, N., personal communication, May 30, 2019).

The verification process – a time-consuming task that occupied nearly the first two months of the project – was mainly done as a means to gather data. By sending out verification reports, the team hoped to encourage users to “participate in this research as ‘listening posts’ and send more signals for analysis” (PROTO, 2019). As Shalini Joshi, co-team leader, pointed out: “People would not send us queries if we would not send out verification reports back. And so we’ll never know what is trending or what people want us to respond to if we don’t send out these verification reports” (Joshi, S., personal communication, April 17, 2019).

5.5 The verification procedure

Based on ethnographic observation and interviews with the Checkpoint team, I present an account of the verification process comprising the following four steps, which will be covered in depth in the following sections.

I. Input: Crowdsourcing

The automated process during which verification requests were crowdsourced from users through WhatsApp and gathered in a database.

II. Sorting

User requests were evaluated by analysts, who separated verifiable queries from those unverifiable or otherwise out of scope. Verifiable queries were flagged and forwarded to one of two team leaders for review. Flagged verification requests were evaluated by the team leaders and, if deemed verifiable, assigned to analysts for verification.

III. Verification

Analysts proceeded to verify items following a defined task list. Upon verification, items were graded on the following scale: “True”, “False”, “Misleading”, “Disputed” or “Inconclusive”, then sent to a team leader for approval.

IV. Output: Verification report

The team leader reviewed the verification steps taken to reach a verdict. If approved, a final verdict was set and a report card was automatically sent via the WhatsApp tipline to the user who submitted the initial verification request. If a verdict lacked supporting evidence, the analyst responsible for the item was asked to look for additional evidence.
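For illustration, the four steps above can be read as a simple pipeline: an item is sorted, verified against the five-point scale, and released only after editorial approval. The following Python sketch models that flow under my own assumptions; it is not the Check software, and all class and function names (`Tip`, `sort_tip`, `approve_and_report`) are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional

class Verdict(Enum):
    """The five-point grading scale used in step III."""
    TRUE = "True"
    FALSE = "False"
    MISLEADING = "Misleading"
    DISPUTED = "Disputed"
    INCONCLUSIVE = "Inconclusive"

@dataclass
class Tip:
    """A crowdsourced verification request (step I)."""
    content: str
    flagged: bool = False            # set during sorting (step II)
    verdict: Optional[Verdict] = None  # set during verification (step III)

def sort_tip(tip: Tip, in_scope: Callable[[str], bool]) -> bool:
    """Step II: flag verifiable queries; the rest are marked out of scope."""
    tip.flagged = in_scope(tip.content)
    return tip.flagged

def approve_and_report(tip: Tip, verdict: Verdict, evidence_ok: bool) -> Optional[str]:
    """Steps III-IV: a team leader approves the verdict before the report
    card goes back to the user; without sufficient evidence the analyst
    is asked to keep looking (modelled here as returning None)."""
    if not evidence_ok:
        return None
    tip.verdict = verdict
    return f"Report card: item rated {verdict.value}"
```

The point of the sketch is the gatekeeping order: no report card is generated until both the sorting flag and the editor's evidence check have passed.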


5.6 Crowdsourcing messages from the WhatsApp tipline

Indian fact-checkers commonly crowdsource claims from their audiences. Alt-News and Boom Live operate helplines on WhatsApp, from which they encourage users to forward misinformation spread in their private networks. For similar purposes, international fact- checking collaborations such as Comprova and Verificado used tiplines during the Brazilian and Mexican elections in 2018 (see Wardle, Pimenta, Conter, Dias, & Burgos, 2019; Owen, 2018).

The leads they get from their tiplines go through journalistic news-valuing processes in which criteria such as the potential for virality are considered. Fact-checkers then decide if resources should be spent on verifying a claim. These tiplines are used as a complement to other methods of sourcing input, such as monitoring social networks for viral claims. Since fact-checkers are driven by journalistic principles, they want to create an editorial output exposed to a large audience. This usually involves sharing a debunked or verified claim in their social media channels to gain maximum traction (see Wardle, Pimenta, Conter, Dias & Burgos, 2019).

The logic behind Checkpoint was different from that of conventional fact-checkers. Since it was primarily a research project, it had no editorial output available to the public, nor any intention to present its verification reports to a large audience. Its purpose was to examine and analyse crowdsourced messages from the WhatsApp tipline, and the verification process was limited to those messages.

What made the tipline unique was that it built on new technology which allowed some level of automation in handling user requests. Previous Pop-Up Newsroom initiatives such as Verificado demanded that fact-checkers manually respond to received queries (Joshi, S., personal communication, April 17, 2019), whereas during the Checkpoint project verification reports were automatically sent to WhatsApp users upon verification.

Meedan provided technological assistance for Checkpoint to make the tipline possible.

Together with WhatsApp, they built an interface that integrates the WhatsApp Business API (an application programming interface) with Meedan’s platform Check. The connection between the WhatsApp Business API and Check was handled through Smooch – an omnichannel conversation API (Author’s field notes, April 5, 2019; see also Facebook, n.d.; Smooch, n.d.).

Any user could add the tipline's number in their phone book and send a message to the Checkpoint team – including text, an image or a link to a video (Proto, 2019). All messages entered a database on Check where analysts could review received queries. The tipline was semi-automated, operated by a chatbot that interpreted received messages and responded to them according to a template of standardised responses. After having sent a message to the number, the user was asked to confirm whether s/he wished to request the team to verify it. Upon confirmation, the API prompted the message to enter Check. The tipline maintained the end-to-end encryption, and the user was completely anonymized in the process. No personal metadata, such as the user's location or phone number, was stored.

Messages appeared as items in the Check database, which the team analysts would review. They would manually sort through the collected queries. Using a Google spreadsheet with a set of general guidelines, the 'Standard Operating Procedures', analysts sorted and flagged verifiable queries and separated them from those that were unverifiable by the methodological standards. User requests were evaluated against criteria such as polling-related issues and separated from those that were not suited for verification, such as spam, opinion or satire.

After an item had been marked as out of scope, a message was automatically sent out to inform the end user that their message would not be verified.
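The intake flow described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the actual Check/Smooch code: the function names, the wording of the automatic replies, and the use of a session identifier are all assumptions made for illustration.

```python
# Hypothetical sketch of the semi-automated tipline flow: the chatbot asks the
# user to confirm the verification request, stores the confirmed item without
# personal metadata, and sends an automatic reply once an analyst marks the
# item as out of scope.

CONFIRM_PROMPT = "Do you want us to verify this message? Reply YES to confirm."
OUT_OF_SCOPE_REPLY = ("Thank you. This message falls outside the scope of the "
                      "project and will not be verified.")

database = []   # stands in for the Check item database
pending = {}    # session_id -> message awaiting user confirmation

def on_incoming(session_id, message):
    """First contact: ask the user to confirm the verification request."""
    pending[session_id] = message
    return CONFIRM_PROMPT

def on_confirmation(session_id, reply):
    """On YES, store the item; no phone number or location is kept."""
    if reply.strip().upper() == "YES" and session_id in pending:
        database.append({"content": pending.pop(session_id),
                         "status": "unsorted"})
        return "Your request has been received."
    return None

def mark_out_of_scope(item):
    """Analyst decision: flag the item and trigger the automatic reply."""
    item["status"] = "out_of_scope"
    return OUT_OF_SCOPE_REPLY
```

The sketch shows why the process was only semi-automated: the machine handles confirmation and replies, but the sorting decision in `mark_out_of_scope` still requires a human analyst.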

When the tipline was launched on April 2, the Checkpoint team received an "overwhelming" amount of user requests, as expressed by one team leader, with hundreds of WhatsApp messages coming in within the first couple of hours (Author's field notes, April 5, 2019). By the end of the day, that number had increased to some 25,000 items (Check). For a team of eight analysts, this inevitably led to a time-consuming sorting process, since analysts had to sort each item manually. The majority of incoming items were not verifiable – a large portion was considered to be spam or otherwise falling out of scope for verification. This meant that a lot of resources had to be focused on filtering out thousands of unverifiable items. At least one analyst was occupied almost full time by this laborious task (Author's field notes, April 8, 2019).


Hence, one of the drawbacks of the project was the lack of automation, as Nasr ul Hadi, co-founder of PROTO, put it:

Technology was not ready for a lot of what the project required… and so that basically meant that a lot of what we would have done [had we had the time] would have been dealt with by the machine side of it before the humans got involved. We ended up having to throw people at these problems, and that was not a very productive use of our time or motivation or headspace (ul Hadi, N., personal communication, May 30, 2019).

For some of the received queries there were dozens of duplicates, but Check had no identification feature within the software that could automatically cluster them. Analysts had to go through identical items and manually cluster them under a parent file, copying the qualities from the parent file to each child file.

Identifying duplicated and related queries coming into the tipline […] We are miles away [from] doing that effectively but it's simply because I don't think not enough people have explored it or have been given enough time and resources to be able to do it. It's not because it's not possible. We now know what kind [of] measures would need to be used to enable that clustering better […] for instance, a traditional approach might have been to find a way to look at related keywords, but in a project where most of your queries are not keywords-based – they're visuals-based – we have to start with image match check which Meedan has already figured out as a problem which they want to be able to solve6. (ul Hadi, N., personal communication, May 30, 2019).

6 Meedan has since improved this feature. According to media reports, clustering has now been improved so that

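The clustering that analysts performed by hand could, in principle, be automated along the lines ul Hadi describes. The sketch below is an illustrative assumption, not Check's actual implementation: exact text duplicates are grouped under a parent item by a normalised fingerprint, and for visuals-based queries a perceptual image hash (e.g. pHash) would take the place of the text fingerprint.

```python
# Sketch of automatic duplicate clustering: trivially varying copies of the
# same text query are grouped together, with the first occurrence acting as
# the parent item. Image queries would need a perceptual hash instead.
import hashlib

def fingerprint(text):
    """Normalise case and whitespace so trivial variants match."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def cluster(items):
    """Group items by fingerprint; each group is parent plus duplicates."""
    clusters = {}
    for item in items:
        clusters.setdefault(fingerprint(item), []).append(item)
    return clusters

queries = ["Rumour about polling dates",
           "rumour  about polling DATES",   # duplicate with minor variation
           "A different claim"]
grouped = cluster(queries)
```

As the quote notes, exact matching of this kind only covers the easy cases; near-duplicates with reworded text or re-encoded images require fuzzier matching, which is why the problem remained unsolved during the project.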

5.7 Sorting user requests

5.7.1 Deciding what to verify

Whereas misinformation touches all matters, from the trivial and mundane to the political, Checkpoint targeted election-related misinformation surrounding issues of national interest – misinformation that could impact polling, law and order, or be likely to incite violence. Claims about non-election-related matters, e.g. rumours about Bollywood celebrities, sports or other entertainment sectors, were considered out of scope together with opinion-related claims, conspiracy theories and satire.

Opinionated claims are generally avoided by fact-checkers, Graves (2017) notes, because "value-laden claims cannot be tested for their correspondence to reality" (p. 520). Political opinions are rooted in different sets of values and, being subjective in nature, do not lend themselves to verification; they are based on interpretations of facts. For instance, the following screenshot of a tweet criticizes Prime Minister Modi's performance as a politician, accusing him of having to resort to drastic measures to gain votes (see Figure 1).

Each and every claim could be verified: did he release a movie? Did he start his own channel? But the context of these claims is highly opinionated, and their link to the performance of the prime minister cannot be verified.

Figure 1. Screenshot of a tweet received through the tipline (Check).


When it was announced that the BJP had won the election, the following meme reached the tipline (see Figure 2). Just like the tweet, it was unverifiable in nature as all claims were opinionated.

Claims that the “Results Were Not At All Surprising” or that the biggest lesson of the election was that the “Opposition Should Try To Improve Themselves rather Than Just Hating The Ruling Party In Next 5 Years” were not factual claims, but claims based on values.

Figure 2. A meme received through the tipline (Check).

As previously mentioned, a majority of the queries were considered to be spam or otherwise not suited for verification, e.g. obscene or pornographic material, sponsored content, job advertisements, and even threats (Author's field notes, April 5, 2019; Standard Operating Procedures). Any claim in a language other than the five languages covered by the project was not dealt with (Author's field notes, April 8, 2019; PROTO, 2019).

Alleged violations of the Model Code of Conduct – a set of rules and guidelines regulating the conduct of political parties to ensure fair and free elections – served as a key area of interest to guide the team in the sorting process (Author’s field notes, April 5, 2019). As the Election
