STOPPING FAKE NEWS
The work practices of peer-to-peer counter propaganda

Maria Haigh, Thomas Haigh, and Nadine I. Kozak

When faced with a state-sponsored fake news campaign propagated over social media, in a process we dub “peer-to-peer propaganda,” a group of volunteer Ukrainian journalistic activists turned fact checking into a counter-propaganda weapon. We document the history of StopFake, describe its work practices, and situate them within the literatures on fact checking and online news practices. Our study of its work practices shows that StopFake employs the online media monitoring characteristic of modern journalism, but rather than imitating news stories it applies media literacy techniques to screen out fake news and inhibit its spread. StopFake evaluates news stories for signs of falsified evidence, such as manipulated or misrepresented images and quotes, whereas traditional fact-checking sites evaluate nuanced political claims but assume the accuracy of reporting. Drawing on work from science studies, we argue that attention of this kind to social processes demonstrates that scholars can acknowledge that narratives are socially constructed without having to treat all narratives as interchangeable.

KEYWORDS fact checking; fake news; propaganda; Russia; trolling; Ukraine; virtual teams

Introduction

Russia responded quickly when a popular revolt unseated Ukraine’s pro-Russian president, Viktor Yanukovych, in February 2014. Within days its military had seized the Crimean Peninsula, which Russia almost immediately annexed. In the months that followed, Russia channeled arms, volunteers, intelligence operatives, and eventually active duty troops into Eastern Ukraine, where they fomented a civil war. This was the story told outside Russia. Russian media reported a Central Intelligence Agency-engineered coup in which Nazis seized control of Ukraine and committed atrocity after atrocity.
Russia extended its protection to Crimea after a spontaneous uprising by local militias, and it sent only humanitarian aid and civilian volunteers to Eastern Ukraine. This divergence of media narratives is not merely a nationalistic endorsement of Russia’s military campaign, but a crucial part of it. Russia is fighting a new kind of “hybrid warfare,” or “postmodern warfare,” in which military actions, propaganda, political activity, and online campaigns are seamlessly combined (Thomas 2014; Mitrokhin 2015). In 2017, Russia’s defense minister acknowledged the existence of an information warfare group within its military, saying that “propaganda needs to be clever, smart and efficient” (Isachenkov 2017). In this paper, we explore the new kinds of information work devised by StopFake, a volunteer organization, to fight this weaponization of fake news. Founded by young Ukrainian journalists in March 2014, StopFake drew selectively on Western practices of “fact checking,” an increasingly common and prominent activity in which journalists take a controversial claim and evaluate its truth using publicly available data and the opinions of experts. StopFake’s mission was to analyze a large volume of information and publish only what it could prove false. If a claim seemed untruthful but was impossible to disprove, or appeared to be partially correct, StopFake remained silent. While StopFake appropriated the cultural authority of fact checking, and appealed to the Western concepts of journalistic objectivity and civil society which underpin it, its activist character and decision to publish only debunkings, rather than evaluations, set it apart from Western models such as PolitiFact (Stencel 2016; Graves 2016a; Graves and Konieczna 2015).

[Journalism Studies, 2018, Vol. 19, No. 14, 2062–2087, https://doi.org/10.1080/1461670X.2017.1316681. © 2017 Informa UK Limited, trading as Taylor & Francis Group]
Several authors have examined StopFake’s activity as a case of volunteer fact checking and an information resistance project (Bonch-Osmolovskaya 2015; Cottiero et al. 2015; Pomerantsev 2015; Khaldarova and Pantti 2016). We, in contrast, frame our study of StopFake within the broader study of online news practices and of fact-checking work. The monitoring and evaluation activities conducted by StopFake.org have much in common with those practiced by other modern journalists (Boczkowski 2010). Between the spring of 2014 and the fall of 2015, our period of observation, StopFake conducted its work via email and social media. A rotating core of 12 people in Kiev, including journalistic, editorial, and technical staff, coordinated the work. A larger international network of online volunteers submitted stories for evaluation, provided translation services, and worked collaboratively to locate counter evidence. In addition to changing practices in newsrooms around the world, the internet allows for citizen participation in news reporting. Modern online infrastructure, such as inexpensive Web hosting and open source content management systems, makes it much easier for volunteers to collaborate online and to produce a professional-looking website. Allan (2006, 52) notes that with the internet, “a multitude of users could harness the power of distributed information to connect with one another in meaningful dialogue,” leading to citizen involvement in reporting on events including Hurricane Katrina and the London bombings. Reese et al. (2007, 276) found a complementary relationship between mainstream journalists and citizen media.
Indeed, they note “the blogosphere weaves together citizen and professional voices in a way that extends the public sphere beyond the boundaries policed by the traditional news media.” This was the case with StopFake, a project that combined journalists trained at the leading journalism school in Ukraine, concerned citizens located around the world with needed language or technical skills, and content both accessed through, and disproved by information found on, the internet.

Fact Checking as a Social Process

StopFake’s distinctive contribution has been to adapt the idea of “fact checking” as a tool to counter a concerted foreign propaganda campaign. Fact-checking organizations usually evaluate isolated claims made by domestic politicians on behalf of domestic audiences. StopFake has adapted the technique to challenge a state-sponsored campaign of systematic misrepresentation, providing its results to Russian and Western audiences as well as domestic consumers. Fact checking in the United States rests on an assumption that the public will trust journalistic objectivity, but challenges the idea that journalists should report rival claims without evaluating them. For decades, Western journalists argued their work revolved around core ethical values, at the center of which was the value of objectivity, to be “free from values and ideology” (Gans 2004, 182). The Society of Professional Journalists (2014) insists journalists should endeavor to be “accurate and fair” and that they should report the truth, verifying information before using it in a story. However, in practice the Western journalistic commitment to “objectivity” has often gone along with a reluctance to take sides in reporting controversial topics, instead reporting the arguments made by each side.
Fact checking evolved from a tool of amateur bloggers into an established part of professional journalism embodying “distinct elements of ‘accountability’ journalism as well as ‘explanatory’ or ‘service’ journalism” (Graves 2016b, 95). According to Graves, “objectivity as conventionally practiced resists making factual challenges to official claims” (215). American fact checkers, such as PolitiFact, typically take a claim from a political speech or opinion piece and ask academic experts to evaluate it. The result is summarized with a ranking. While “True” and “False” are options, claims are often ranked as “Mostly True” or “Mostly False” and, on occasion, as “Pants on Fire” (Graves 2016b). The precise rating often hinges on analysis of intent. A claim might be technically true but misleadingly presented, or true only according to an unconventional measure of economic growth. This form of fact checking relies on broader institutions of liberal Western democracy that are not fully developed within Ukraine: journalists critique the claims of a particular politician within the context of ostensibly apolitical expert opinion. StopFake, despite adopting the identity of “fact checking,” is doing something rather different. This reflects a difference between the kinds of claims evaluated by PolitiFact and those evaluated by StopFake. American fact checking was designed to keep politicians honest, not to counter the systematic and coordinated work of a state-backed propaganda machine. PolitiFact focuses on specific political claims, but assumes that the journalists reporting them are doing so accurately and honestly. In contrast, StopFake evaluates the work of journalists, looking for misleading stories based on fabricated evidence. Its volunteers stress that they work only on “facts” and pay no attention to opinions, a contrast with the PolitiFact approach of canvassing expert opinions. 
StopFake, rather than rating some claims as true and others false, never posted items the team could not disprove. Each evaluation posted on the site has a title beginning with the word “Fake.” While PolitiFact focuses on interpretation, more basic questions generally concerned StopFake: Was this photograph taken when and where the story claims? Is the person quoted identified correctly? Does the story mistranslate or misrepresent information in the source on which it claims to be based? During the Cold War, philosopher Karl Popper put the idea of the “open society” at the heart of the Western liberal struggle against totalitarianism, linking political openness to scientific inquiry (Popper 2013). Popper’s parallel work in the philosophy of science disputed the earlier assumption that experiments could prove scientific claims correct, but insisted that any scientific claim could potentially be disproved if a counter example was found, a process he dubbed “falsification.” StopFake engaged in a similar project: the systematic testing and falsification of claims made in news reports. The techniques it uses cannot prove a story true but might, in the view of StopFake, prove it fake. This insistence that responsible journalists work only with “facts” may nevertheless startle readers who question whether facts ever can be fully separated from opinions, or doubt that anything can be unequivocally proven. Michael Schudson has shown that journalistic objectivity is a relatively recent invention. Schudson (2001, 149) notes that “‘Objectivity’ is at once a moral ideal, a set of reporting and editing practices, and an observable pattern of news writing.” As Schudson (2003) explains, reporters give meaning to the facts by constructing a narrative. This narrative can assist readers to inform themselves, but it can, at moments, deliberately or unintentionally mislead or misinform.
The rise of explicitly partisan, entertainment-oriented news outlets such as Fox News may suggest that this attachment to objectivity has run its course. After the 2016 US presidential election, references to a “post-fact era” have become commonplace. Scholars have shown that journalists reproduce the political, economic, and social viewpoints of the community in which they live. Indeed, Herman and Chomsky (1988, 305) argue that while journalists view themselves as objectively reporting events, they, and the organizations for which they work, perform a “system-supportive propaganda function” due to the corporate ownership of news media, the reliance on government and corporate information as sources, national values, and the advertising-supported nature of media in the West. As such, information passes through successive filters, leaving only a “cleansed residue” on the pages of the morning newspaper (Herman and Chomsky 1988). McChesney (2012, 683) agrees, arguing that government and corporate sources limit the range of “legitimate debate.” From this viewpoint there might seem to be little to choose between the coverage of Ukraine provided by Russia Today, on the one hand, and by the BBC or New York Times, on the other. After all, most Soviet propaganda tactics were used at times by, or even originated in, Western countries, though they took on a different character as part of a systematic and totalitarian state apparatus (Kenez 1985, 251–252). Boyd-Barrett (2015) examines Western narratives around the Ukraine crisis. He argues that the Western media was itself engaged in propaganda work, by framing the dispute in ways that supported the political and foreign policy goals of the Western powers.
Noting systematic divergences between narratives favored by Western journalism and those propagated in the Russian media (and in some “alternative” Western publications), he asserts that their clash inevitably tends towards the destabilization of the hegemonic Western discourse, not in the sense that it entitles an analyst to declare what is “true” or “false,” but in the sense of being able to detect the play of ideology amidst apparent contradiction, paradox and hypocrisy. A scholar following Boyd-Barrett, determined to deny either side the privileged position of objectivity or moral superiority, might admit no distinction between the StopFake volunteers, working to advance one set of geopolitical interests, and their sisters in the Russian troll factories (discussed below) who serve another. We take a different perspective. Instead of looking broadly at two sides in a war, and thus characterizing national media practices in the manner of, for example, Nygren et al. (2016), our interest is in the phenomenon of fake news distribution and in the work practices of a specific group attempting to fight its spread. Recognizing that both kinds of information work are driven by geopolitical concerns does not, in itself, commit us to the conclusion that their products are equivalent. Neither do we assume that the work of the Russian troll factories is propaganda, and therefore untrue, and that the work of StopFake is virtuous counter propaganda, and therefore true. Instead, we offer a micro-level study of information work practices grounded in science studies and ethnographic methods. The trolls reportedly work quickly to fill a quota and produce output that betrays little concern for truth, or even plausibility, whereas we will show below that the work of StopFake proceeded carefully and centered on a hunt for evidence fulfilling specific and quite rigorous criteria. 
The same questions of truth and objectivity are central to science studies, a field in which scholars have devised epistemological frameworks to acknowledge the obvious fact that knowledge is socially constructed without adopting a position of extreme relativism in which all truth claims are equally valid. As Graves (2016c) has observed, “fact checkers, investigative journalists and scientists [all deal] with controversies in which not just facts but rules for determining them are in question.” Fact checking, like science, “affords a view of the way material, social, and discursive contexts structure factual inquiry” (3). Traditional positivist philosophers of science assumed that objective truth existed and could be discovered by following a scientific method, which we might compare to the rhetoric of professional journalists. As a reaction to this, the radical philosopher Paul Feyerabend insisted that there was no scientific method for the production of truth, that objectivity was an illusion, and that therefore all forms of knowledge were equivalent (Feyerabend 1975). We might compare this to the position of Boyd-Barrett. Later, more socially oriented scholars distrusted both positions. They reinterpreted scientific truth claims as the product of social processes, studying work processes ethnographically (Latour and Woolgar 1979). They historicized the invention of scientific truth (Shapin 1994) just as Schudson (2003) historicized the invention of journalistic objectivity. They rejected the assumption that belief in claims we agree with should be explained through appeals to the natural world whereas belief in claims we disagree with should be explained causally, instead arguing for a “symmetrical” approach in which all beliefs were explained causally (Bloor 1991).
In the same way, the commonalities between the information work of propaganda and conventional journalism, and between the work of trolling and online counter propaganda, would benefit from a symmetrical examination grounded in social practices. Yet most in the science studies field would not conflate the idea that science is a set of social processes with the idea that all possible claims or beliefs are interchangeable. By looking at how claims are constructed, and buttressed against challenges, science studies moved beyond simply arguing that all truth claims are socially constructed, and so serve social interests, to look more deeply at the ways in which different social processes produce truth claims with different characteristics. For example, the specific social processes by which scientific truth claims are constructed give them a different relationship to the natural world than those put forward by religious or political authorities (Latour 1988). Latour documented the enormous care taken by scientists to build networks of human and non-human resources, such as laboratory equipment, publications, and experimental results, to bolster their published claims. Scientists construct these evidential chains with possible challenges in mind, reinforcing the weak spots in their narratives to survive attacks from their peers. Journalists likewise construct published narratives, working with constraints mandating the collection, evaluation, and presentation of particular kinds of evidence. This does not make their work products unbiased or objective, but it does allow them to survive simple challenges of the kind that StopFake applies when screening for fake news.

Studying StopFake

Our research employs mixed qualitative methods to gather data about StopFake.org’s work processes and products. We were interested in studying how StopFake.org conducts its information work, conducting a detailed study and analysis of this single
organization (Cresswell 1998) and how the group made editorial decisions. The data presented here were compiled over 18 months, from the spring of 2014 until August 2015, using a variety of approaches. First, and earliest, we read the English- and Russian-language websites of the organization closely for data about the untrue stories, or “fakes,” that StopFake.org chose to debunk. This elicited a stream of stories that the group identified as fake, but did not explain how and why the group chose these stories over others. For that we needed informants inside the organization. In late March 2015, two members of StopFake.org traveled to the United States. We participated in public events with the two StopFake.org editors and carried out extensive unstructured interviews with them. StopFake.org, whose volunteers were, during our period of observation, spread across Ukraine and around the world, conducted much of its work virtually through email and social media. As such, traditional ethnographic approaches, where the researcher physically participates in a group or community, were not possible. Therefore, we employed the more recent qualitative research methodology of internet-based ethnography to analyze the information work practices of StopFake.org participants online. Markham (2003, 52) states that these research practices allow researchers to “study cultural phenomena mediated through Internet-related technologies for communication.” One of the editors provided us with access to the organization’s closed Facebook group. We closely read the text of StopFake.org’s internal material from its closed Facebook group to determine the work flow amongst the organization’s volunteers and how StopFake.org selected different stories to rebut. Data gathered from the public StopFake.org sites, the private Facebook group, the public presentations of the two editors, and the unstructured, face-to-face interviews were triangulated to increase the reliability of the findings.
We then coded all English-language fake reports issued by the group in its first 18 months to give a quantitative perspective on the kinds of evidence used to declare news fake. This mixture of methods allows us to combine material from online and “real”-world settings to further our understanding of the work practices of StopFake.org participants.

Countering Peer-to-peer Propaganda

StopFake is a young organization, founded in March 2014 by recent graduates of the National University of Kyiv-Mohyla Academy’s journalism school. In interviews with us and in their public presentations, StopFake’s founding volunteers presented their reaction to Russian coverage of the occupation as the collision of idealistic, Western-influenced journalistic professionalism with cynical Russian propaganda. For example, in February 2014 thousands of soldiers, wearing green uniforms unmistakably similar to those of the Russian military, surrounded key sites around Crimea such as airports and government buildings. Russian officials claimed that they were spontaneously organized groups of local citizens. Russian television channels, widely distributed within Ukraine, likewise insisted that the “green men” were not Russians (Shevchenko 2014). Only after Russia annexed Crimea did its President, Vladimir Putin, admit that the green men had been Russian special forces (RT.com 2014). StopFake’s founders describe being shocked by these blatant departures from the norms of journalistic practice they had been taught. Their narrative stresses their naïvety: the StopFake editors at first complained to the Russian journalists and media organizations involved, sending evidence in the hope of securing retractions. Prior to the conflict, Russian media were widely disseminated in Ukraine, in part as a result of investments made in Ukraine by Russian media groups (Szostek 2014).
The Russian media ignored these requests, and the organizing team came together via a closed Facebook group. They registered the StopFake.org domain on March 2, 2014 and put up an online form for readers to submit dubious news stories for evaluation. Origin narratives tend to be somewhat stylized versions of reality, but we observed that StopFake’s most active volunteers were indeed recent journalism school graduates, whose impression of journalistic norms came more from textbooks and inspiring professors than from prolonged immersion in news organizations. Their commitment to upholding journalistic objectivity, through the defense of facts in the face of lies, is a commitment to an imagined version of Western media practice. Indeed, we were startled that our informants did not seem to be aware that some American media have more than a little in common with Russia Today. StopFake’s organizational structure and information work practices were created rapidly, and in response to a very specific set of media practices. Its counter-propaganda mission is, in some ways, quite novel and in others entirely familiar. A new kind of propaganda offensive gave rise to a new kind of counter propaganda, both reliant on social media. So to understand StopFake we must first explore what was, and was not, new about the Russian information offensive launched in 2014. A century ago states were already funding propaganda campaigns as an integral part of their war-fighting missions. Kenez, in his definitive work on Soviet propaganda, The Birth of the Propaganda State, concluded that propaganda was seamlessly integrated within the larger Soviet system (Kenez 1985). Totalitarian governments traditionally exercised direct control over national media. Government officials created press releases, gave briefings, approved news stories, and issued posters, films, and books supporting the state’s position (Kenez 1985; Pipes 1995). 
Soviet journalists served an ideological state mission, while at the same time developing their own journalistic practices and following their own agendas (Wolfe 2005). Soviet media consumers likewise developed their own methods of deciphering its content (Mickiewicz 2008). Russia has made a similar investment in mass media propaganda to steer public sentiment in one direction or another. Petrov, Lipman, and Hale (2014) have stressed the extent to which Russia’s government remains reliant on high domestic approval to retain its legitimacy. For 17 months, including our entire period of observation, Ukraine dominated the news headlines on Russian television and government-allied newspapers. Saturation coverage lifted only in October 2015, when attention shifted to Syria to prepare public opinion for Russia’s intervention there (BBC 2015). The general content of Russian propaganda still appears to follow a top-down method, as talking points set by officials close to Putin are propagated almost instantly throughout the Russian mediasphere (AP 2013). Putin’s allies exert enormous influence over the Russian media and control all Russian television networks. Russia’s government still tolerates some independent voices in niche media such as the Ekho Moskvy radio station, the alternative newspaper Novaya Gazeta, and the online video producer Telekanal Dozhd. Overall, however, Reporters Without Borders ranked Russia only 148th out of 180 countries assessed for its 2016 World Press Freedom Index. Putin does not direct a state machine with anything like the bureaucratic ubiquity of the old Communist Party. Indeed, Roudakova (2017) argues that Soviet journalism took the idea of truth telling seriously, despite interpreting this in a distinctive way and applying it selectively. In her account, the real crisis in journalistic authority exploited by Putin’s regime came with the dismantling of this system in the 1990s.
The production and dissemination of propaganda to support the state’s talking points has, however, been distributed. Russia’s governing elite employs large numbers of internet trolls, who set up social media accounts in which they place ordinary-seeming comments and personal news, punctuated by streams in which they express the talking points of the day in their own words (Pomerantsev 2013; Chen 2015).1 Others then view and share these posts and unknowingly contribute their own social capital to the spread of propaganda. As Mejias and Vokuev (2017) note, “Citizens themselves actively participate in their own disenfranchisement by using social media to generate, consume or distribute false information.” We call this “peer-to-peer propaganda,” and argue that this reliance on trolls and bloggers changes the way propaganda is experienced, and the options available to counter it. Ordinary people experience the propaganda posts as something shared by their own trusted friends, perhaps with comments or angry reactions, shaping their own opinions and assumptions. The challenge of combating enemy propaganda, often called “counter propaganda,” is as old as propaganda itself (Hall 1976; Risso 2007). Counter propaganda was traditionally the work of government employees or closely supervised contractors, pursued through radio broadcasts into enemy territory, the printing of materials disputing enemy claims, and the airdrop of leaflets. Yet today this work is being done by volunteers more effectively than by governments, in part because the propaganda being countered is spread as much by social media and by seemingly independent media outlets as by obviously state-controlled media. Propaganda spread online, by trusted friends or by fake accounts, is most obviously countered by a rival social media operation, and so StopFake relies largely on social media for the dissemination of its material.
The column on the right of each page of its website features large tile icons to follow its output on various platforms, including Facebook, Google Plus, Pinterest, Twitter, VKontakte, and YouTube. The icons also display the number of followers on each platform, which as of November 2016 totaled more than 179,000. A bar at the bottom of each story allows one-click sharing to social media. StopFake also attempts to give ordinary Ukrainians the skills to consume media more critically. From the beginning of 2015, information and media literacy workshops and lectures, offered in collaboration with IREX Ukraine’s Ukrainian Media Partnership Program, became an essential part of StopFake’s work (MediaLiteracy 2015). This tackles a problem that extends far beyond Ukraine. A widely reported study discovered that even Stanford University students, elite members of the generation often called “digital natives” because of their presumed skill in the online environment, had little ability to detect fake stories and would assess online reports purely from internal content rather than seeking external verification (Stanford History Education Group 2016).

Trolls Without Borders

StopFake is a multilingual site. Its stories have consistently been published in two languages: Russian and English. This reflects the multiple constituencies targeted by the Russian propaganda offensive. Russian-language propaganda reached both Ukrainians and Russians, as Russian is the most widely spoken language in Ukraine. Prior to the war, Russian television channels were among the most popular in Ukraine. Russian social media platforms and other Web outlets are similarly popular in Ukraine, and many Ukrainians have friends and family on the Russian side of the border.
When Russian agents began their takeover of Eastern Ukraine and Crimea, one of their first actions was to seize control of television transmitters, replacing Ukrainian channels with state-controlled Russian channels (Nemtsov 2015). When StopFake.org was originally launched, most visits came from inside Ukraine and about a quarter from Russia. Table 1, an August 7, 2015 snapshot from Google Analytics, shows that by mid-2015 almost half of all visitors came from Russia and that the site was one of the 10,000 most frequently visited by Russian internet users. The prevalence of Russian visitors suggests that StopFake is having some success in reaching those most exposed to Russian propaganda. The Russian-language content on StopFake has always been the most popular, but the editors chose English because of its huge global reach. While the Russian government’s propaganda campaign has followed the classic effort to shape public opinion, culture, and perceived reality within the confines of a nation state (Barghoorn 1964; Ellul 1973; Kenez 1985; Pipes 1995; Herf 2006), it has also, due to the internet and to the deregulation of Western media ownership, been aggressively projected by Russia into the Western media sphere (Fredheim 2015). The state-owned Russia Today channel, carried widely in Western countries (Cohen 2014) as a result of subsidies provided by Moscow to cable and satellite operators (Zavadski 2015), mimics the form but not the journalistic practices of conventional news channels such as CNN. Russia’s government has adopted newer media technologies, including the creation of Sputniknews.com in late 2014, which reports in 13 languages and claims to “point the way to a multipolar world that respects every country’s national interests, culture, history and traditions” (Sputnik International 2015).
In the West, the object of the Russian campaign has been as much to create the appearance of uncertainty as to convince its targets of the complete truth of the Russian narrative (Pomerantsev 2014). Snyder (2014) observed that Russian propaganda about the Ukrainian crisis has employed two effective frames, first that the Ukrainian revolutionaries were fascists and second that the Ukrainian crisis was a geopolitical struggle between Russia and the United States. For example, Putin himself has repeatedly referred to Ukraine’s army as a “foreign legion,” to support the idea that it serves the interests of Western powers rather than those of Ukrainians (Gaufman 2015; RT.com 2015). Here too, social media has played an important part. Research has shown that online comments play a crucial role in determining readers’ responses to online stories (Kareklas, Muehling, and Weber 2015). English-speaking trolls crowd out reasoned discussion in the comments sections of articles on Ukraine posted by Western media sites such as The Guardian (Seddon 2014). Shifting public opinion would deter Western governments from intervening in the conflict.

TABLE 1
StopFake visitors by country (selected) for August 2015

Country         Visitors (%)   Rank in country
Russia          49.2           9,555
Ukraine         22.4           3,877
Belarus         3.4            9,700
United States   3.4            318,007

StopFake was an immediate international success, attracting thousands of daily visitors. Its work was quickly recognized in Ukraine and in the West, including coverage in the mainstream press as well as academic journals (Chimbelu 2014; Tomkiw 2014; Haynes 2015; Pomerantsev 2015; Van der Schueren 2015). Traffic to the English-language pages was more variable than to the Russian edition, spiking when a story went viral or when a foreign news service profiled StopFake. Other languages, including French, Spanish, and even Esperanto, have come and gone with the enthusiasm of volunteer translators.
After the initial surge of interest around Russia’s occupation of Crimea, traffic has ebbed and flowed along with international coverage of the Ukraine crisis. Spikes in traffic visible in Figure 1 occurred as Russian-backed forces seized control of much of Eastern Ukraine in the late spring of 2014, when a government offensive recaptured most of this territory that summer, and when Russian troops intervened to impose a stalemate. A series of ceasefire agreements reduced the intensity of fighting for much of 2015, after which the conflict slipped from international headlines and the number of visitors to StopFake.org stabilized at a lower level, reflecting the “fatigue” that sets in during coverage of any long-running news story (Hoskins and O’Loughlin 2010).

StopFake Editorial Processes

The online form for submission of “fake” stories initially garnered approximately 200 submissions a day, some from traditional media websites, others from social media. Web and social media exposure brought more volunteers to the Facebook page to help with the project. Links to sources claiming to refute information in the putative fake story accompanied some submissions. In its early days, a loosely structured StopFake network of approximately 40 volunteers worked on identifying potential “fakes.” Each would pick one or two claims to investigate further, sharing their evidence to see if others agreed that the original story had been unambiguously disproved. As its work practices developed, editors increasingly identified stories by systematic media monitoring rather than relying on submissions from the website. Russian media sources routinely monitored by StopFake included НТВ (NTV), Вести (Vesti), РИА Новости (RIA News), Русская Весна (Russian Spring), Новороссия (Novorossiya), Антифашист (Antifascist), Украина.Ру (Ukraina.ru), and Звезда (Zvezda).
FIGURE 1
Daily numbers of sessions from early 2014 to mid-2015, as charted by Google Analytics

The team also monitored selected Ukrainian media outlets, including Channel 1+1, Inter TV Channel, and Channel 5 (which is owned by Ukraine’s current president and was critical of the former government). We found that the work of StopFake editors had many parallels with ordinary journalistic practices in the online era as described by Boczkowski (2010). The modern journalist is constantly scanning the media environment. Boczkowski explored the information work of two Argentinian newsrooms, where journalists continually monitored the product of their competitors, including their websites, blogs, newspapers, and television news. News outlets wanted to ensure that they had all the important stories covered so that readers would not have a reason to go elsewhere. Monitoring by journalists thus led to practices of imitation amongst the news outlets, standardizing news content across them. Phillips (2010) similarly found that British newspapers imitate the copy of their competitors, but also reuse the copy of other news organizations without attribution. Phillips notes:

Today news can be immediately “scraped” off the site of a rival and re-organized a little. The intensity of competition on the Internet, coupled with the lack of technical or temporal barriers to making use of information lifted from elsewhere, means that it is difficult for any news organization to retain exclusivity for more than a few minutes. (Phillips 2010, 375)

StopFake’s journalists carried out the same process of scanning, but instead of trusting and imitating each new story they ignored stories that appeared to be true or that they could not definitively identify as fake. There is an inherent asymmetry between this activity and the process of rapid imitation described by Boczkowski.
Evaluating takes more work than imitating, while disseminating is quicker and surer than disrupting. Initially all the work for the StopFake project was done after volunteers’ work hours and on the weekends. As in many volunteer organizations, some participants dropped out over time while others increased their commitment. One month into the project, one journalist quit her job and lived on her savings for nearly a year while fully devoting herself to the project. The remaining volunteers developed a more rigid division of labor, with the most active serving as editors. By mid-2014, a core group of six people, known as editors, had the rights to post content to the website and to edit existing pages. StopFake’s only budget came from online donations averaging around $200 a month. The network of volunteers included another group producing weekly video digests, two translators for the English version of the site, one English-speaking editor, and two server administrators, one in Ukraine and the other in the United States. Sometimes the process of reviewing a story and publishing a rebuttal went very quickly. StopFake’s internal archive shows that on December 7, 2014 one of the editors discovered on Russkaya Vesna what claimed to be a “secret memo” from the Ukrainian Security Services, with instructions for conducting covert operations in Donbas to undermine civilian support for the separatists. She shared it with the group. Another volunteer examined it and noticed right away that someone unfamiliar with the Ukrainian language had penned the memo, as even the letterhead of the Security Services was misspelled. The same day, StopFake volunteers uploaded the story to the site. When StopFake volunteers did not find evidence that met its criteria, they usually would not post the story. StopFake editors uploaded new stories to the website at most once a day.
During our period of observation, only the chief editor did this, after consultation with other editors to make sure that the story was ready and met the group’s standards. A review of the elapsed time between publication of a fake news story and publication of StopFake’s response shows that it rose significantly, from an average of 1.6 days over the first three months of the site’s operation to a peak of 5.4 days in the three months from December 2014 to February 2015. This reflects both a drop in the number and energy level of volunteers and the more stringent editorial process that evolved over the group’s first year. By the June to August 2015 period this had dropped back to an average of 3.2 days (Table 2). Most stories involved several volunteers working together to locate and evaluate a fake story. On August 10, 2015, Ukrainian social media outlets published the testimony of a Ukrainian mother, with a photo, narrating how Russian skinheads had brutally beaten her son in Almaty, Kazakhstan. A StopFake volunteer noticed social media comments suggesting that the photo was taken in Moscow rather than Kazakhstan. This turned out not to be the case, but another volunteer was able to identify the picture as one previously published in Brazil in October 2012 with the heading “Skinhead Hitler fans are arrested after fight in S[ao] P[aulo].”2 The refutation appeared on the StopFake.org website on August 14, 2015.3 Some fakes took longer to progress through the editorial process, giving the original story longer to spread unchallenged. For example, on July 31, 2015, a story spread in Russian news outlets and social media that Romanians in the Western Ukraine region of Bukovyna were demanding independence. This implied, inaccurately, that separatist tensions were rising across Ukraine. A Ukrainian newspaper within that region reprinted the story.
The story appeared on the site as a debunked fake on August 8.4 During our period of observation, StopFake did not have the resources to investigate large numbers of stories in depth. During a typical week in mid-2015, the StopFake team might examine around 250 potentially dubious stories but post only five articles uncovering fakes. However, these articles would usually rebut dozens of dubious stories, as Russian television, state-controlled newspapers, blogs, and social media repeated a single claim with minor variations (Table 3). The group organized its work processes via a closed Facebook group, email, and file exchange spaces. Participants used Skype and email to discuss daily operations. StopFake had no office, conference room, or regular meetings. While the most active members were based in Kiev and knew each other, communication took place primarily online. As one of the StopFake editors said to us: the “full StopFake team has never been in the same room at the same time.”

TABLE 2
Fake reports published by three-month period, with average processing time from appearance of fake story

Period   Dates                            Reports published   Average processing time (days)
1        March to May 2014                122                 1.6
2        June to August 2014              148                 2.5
3        September to November 2014       65                  5.1
4        December 2014 to February 2015   69                  5.4
5        March to May 2015                68                  5.4
6        June to August 2015              67                  3.2
Total                                     539

StopFake’s choice of Facebook as a social media platform rather than VKontakte, which as the most popular Russian-language social media platform had about 10 times more users in the countries of the former Soviet Union, reflected its distrust of the security offered by Russian services. VKontakte’s founder, Pavel Durov, had encouraged use of the site as a platform for organizing social protests in Russia (Toler 2015). By 2014, however, state-aligned interests had acquired control of the company and forced Durov out (Scott 2014).
This was part of a broader push by Russia to establish control over Web and social media networks, including a law requiring foreign services to serve Russian users only from servers located within Russia, and hence within the reach of its security services.

TABLE 3
Internal StopFake work sheet for July 20, 2015 (potential fakes listed beneath each source)

Source                    Stories reviewed   Potential fakes
LifeNews                  12                 1
    http://lifenews.ru/news/157657
Russia Today (Russian)    42                 2
    http://russian.rt.com/article/104536
    http://russian.rt.com/article/104425
    http://russian.rt.com/article/104346
    http://russian.rt.com/article/104283
Russia Today              8                  0
Вести                     23                 3
    http://www.vesti.ru/doc.html?id=2643614&tid=105474
    http://www.vesti.ru/doc.html?id=2643198&tid=105474
    http://www.vesti.ru/doc.html?id=2643437&tid=105474
Первый канал              15                 0
Звезда                    22                 2
    http://tvzvezda.ru/news/vstrane_i_mire/content/201507200441-gbic.htm
    http://tvzvezda.ru/news/vstrane_i_mire/content/201507201812-ian4.htm
РИА новости               36                 0
Спутник                   15                 1
    http://sputniknews.com/military/20150720/1024816922.html
НТВ                       12                 1
    http://www.ntv.ru/novosti/1446176
Украина.ру                12                 3
    http://ukraina.ru/news/20150721/1013716958.html
    http://ukraina.ru/news/20150721/1013709894.html
    http://ukraina.ru/news/20150720/1013709542.html
ТАСС                      42                 0
Ньюзфронт                 15                 0
Новоросинформ             22                 3
    http://www.novorosinform.org/news/id/33046
    http://www.novorosinform.org/news/id/33026
    http://www.novorosinform.org/news/id/32981
Правда.ру                 5                  0
Рус весна                 20                 2
    http://rusnext.ru/news/1437390519
    http://rusnext.ru/news/1437472977
    http://rusnext.ru/news/1437470051
    http://rusnext.ru/news/1437393468
Ren.tv                    21                 0
Total                     322                18

Translation of StopFake articles, primarily from Russian into English, took place only after the editorial process was finished. This involved translating any quoted source material as well as the StopFake analysis. Turnover among volunteer translators was high, as the work is demanding and ongoing.
As only finished articles were translated, this work was easy to decouple from the other editorial processes, and volunteers living in Canada, the United Kingdom, Italy, and Russia carried out translations.

Methods of Establishing a “Fake”

StopFake reports follow the same format as those of American fact checkers, presenting a claim and then summarizing the investigative process and evidence gathered while evaluating it. Each published rebuttal thus presents at least one kind of evidence. We used these published reports to categorize the debunking evidence used by StopFake into nine categories. Figure 2 lists these and charts their appearance in the 539 refutations published by the group during its first 18 months, from March 2014 to August 2015.

FIGURE 2
Proportion of 539 StopFake posts (English-language edition, May 2014 to August 2015) employing each method of documenting a news story as fake

This suggests that the two methods closest to those favored by American fact-checking groups were relatively infrequent. As Graves (2016c) showed in an ethnographic study, American fact checkers rely heavily on interviews with experts. Science studies scholars are well aware that fact cannot be distinguished from fiction without placing trust in certain authorities or social practices (Latour and Woolgar 1979). The category “expert evaluation of controversial claim,” in which the group cited expert opinions, covered 11 percent of postings. Graves (2016b) referred to trust in official numbers, as distinct from claims by politicians, as an example of the preference of American fact checkers for “institutional facts.” Such statistics appeared rarely (around 2 percent of the time) in the first nine months of StopFake reports and vanished completely thereafter. StopFake rarely relied on official government statistical publications as evidence that certain claims could not be true.
This reflected differences in the kinds of claim being evaluated, as fake news typically referred to individuals or specific incidents whereas political claims usually concerned more abstract entities such as the economy. However, statements from government sources, for example asking mayors to confirm or deny stories about events in their cities, were somewhat more common, appearing as evidence in 14 percent of postings. We found very few instances in which StopFake relied directly on media reports, whether from Western or Ukrainian sources, as evidence that a contradictory claim must be false. StopFake volunteers, as young, independent-minded journalists, tended to be skeptical of news reported on established Ukrainian channels, owned by various oligarchs with their own political agendas. We observed that they appeared more likely to conditionally trust reports from the upstart non-profit Hromadske internet television news channel, staffed by people of a similar age, background, and mindset. The four most frequently used methods all involved systematic evaluation of the candidate piece of fake news and its constituent images, facts, and quotations for signs that they had been manipulated or misappropriated. During Soviet times the KGB would invest considerable time in producing professionally faked stories to plant in Western media, complete with plausible supporting documentation. A prime example of this was the 1980s effort to spread the story that the US military had created AIDS (Mikkelson 2013). Modern Russian misinformation campaigns have a distinctive amateurishness, reflecting the new reliance on peer-to-peer propaganda. The focus is on quantity rather than quality, as befits a contemporary media landscape dominated by listicles, teasing headlines, and other clickbait. As a result, StopFake found some fake stories surprisingly easy to refute.
Many pieces of fake news about Ukraine hinged on pictures sourced from social media, allegedly showing various atrocities carried out by Ukrainian forces. StopFake typically debunked these using elementary techniques of digital forensics. Google allows searches by image, to find other versions of a particular image file. Various tools can be used to explore the metadata embedded in pictures, including the date and time at which they were taken and, from some cameras, the GPS coordinates. This metadata frequently revealed that a picture could not possibly show what the fake news story claimed, having been taken elsewhere or before the relevant time. StopFake hosted a tutorial page on these techniques, to encourage its readers to be more cautious media consumers.5 Thirty-five percent of the StopFake postings relied on locating the original source and context of misidentified images. For example, numerous StopFake stories identified pictures and video from the civil war in Syria, the Bosnian conflict, or Mexican drug violence reused by Russian sources as evidence of outrageous Ukrainian aggression in Donetsk. Photographs of dead children have been particularly prone to misappropriation. Social media users circulated a picture from the filming of a Russian horror film as evidence of cannibalism by the Ukrainian army. StopFake frequently found that pictures claiming to show unrest or chaos in various towns outside the conflict zone had been taken in other cities. One example: a picture circulated by Russian media as showing a Ukrainian “martyr” who had suicide-bombed a government tank had been spread from a VKontakte account but actually originated on the Facebook page of a Russian woman who remained alive and well (Capron 2014). Locating the original source of an image sometimes revealed manipulation with Photoshop or other image-editing software. This was documented in 10 percent of postings.
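The metadata checks described above rest on a simple piece of arithmetic. EXIF records GPS coordinates as degree/minute/second rationals plus a hemisphere reference; converting them to decimal degrees lets an investigator compare a photo’s embedded location against the location a story claims. The following sketch illustrates the general technique only, not StopFake’s actual tooling; the field names follow the EXIF convention, and the sample values are invented for illustration.

```python
# Convert EXIF-style GPS coordinates to decimal degrees so they can be
# checked against a map. Sample values below are hypothetical.
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert a degree/minute/second triple plus hemisphere to signed decimal degrees."""
    value = float(degrees) + float(minutes) / 60 + float(seconds) / 3600
    # Southern and Western hemispheres are negative in decimal notation.
    return -value if ref in ("S", "W") else value

# Hypothetical EXIF fields, as an extraction tool might report them.
gps = {
    "GPSLatitude": (Fraction(50), Fraction(27), Fraction(13, 10)),    # 50 deg 27' 1.3"
    "GPSLatitudeRef": "N",
    "GPSLongitude": (Fraction(30), Fraction(31), Fraction(245, 10)),  # 30 deg 31' 24.5"
    "GPSLongitudeRef": "E",
}

lat = dms_to_decimal(*gps["GPSLatitude"], gps["GPSLatitudeRef"])
lon = dms_to_decimal(*gps["GPSLongitude"], gps["GPSLongitudeRef"])
print(round(lat, 4), round(lon, 4))  # prints: 50.4504 30.5235
```

In practice the raw fields come from an EXIF reader or an image viewer’s properties panel; the point is that one arithmetic step turns them into coordinates that can be compared against the location a suspect story claims.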
One StopFake story showed that the swastika visible on a Ukrainian personnel carrier, in a picture widely distributed on Russian social media, was not present in the original version of the image on the Reuters website.6 Another showed that an image of a small girl, holding a handwritten sign reading “we want war,” had been doctored to remove the word “don’t.” A Russian television news report centered on a proposed Ukrainian banknote incorporating a portrait of Hitler was rebutted via image analysis, showing that the image used in the report was a manipulated version of a 2008 design featuring a Ukrainian writer. Other stories required more in-depth investigation, using more traditional journalistic techniques. One of these was to locate the original interviews or documents from which the suspected fake report used extracts. Propaganda stories based on misrepresentation generally do not link back to the original sources being distorted, so the absence of such links led StopFake to flag a story for investigation. This involved contacting the sources quoted in the article or identifying the original documents or videos on which it was based. The broad category of “original source does not hold claimed information” applied to 39 percent of StopFake reports. Many of these fake news stories specifically involved misleadingly edited or contextualized quotes, which were highlighted in 26 percent of the StopFake reports. For example, Russia Today broadcast what it claimed was an interview with the Chief Rabbi of a Kiev synagogue, calling on his followers to emigrate because of rising anti-Semitic violence. StopFake found that the interviewee was the Chief Rabbi of Simferopol, Crimea, describing conditions under Russian occupation.7 This was also an example of the specific practice of wrongly identifying individuals featured in news stories, documented as evidence in 15 percent of StopFake reports.
Certain kinds of acknowledged media sources would also arouse suspicion. For example, volunteers identified several English-language sites on which material supporting Russian propaganda campaigns frequently appeared. Russian-language media then linked to these reports, misrepresenting them as credible evidence of Western journalistic consensus. One of these was the European Union Times, whose plausibly official name belied its fondness for conspiracy theories of all kinds. Looking at the Domain Name System records, reported ownership, links, and history of suspect source domains helped StopFake to evaluate their credibility. Some of StopFake’s counter-propaganda work relied on exploiting internal inconsistencies in evidence. For example, Russian state media claimed that American Stinger anti-aircraft missiles had been found in a facility at Luhansk airport after it was overrun by Russian-backed forces. StopFake’s report documented several misspellings in the text on the weapons, which matched the virtual model of the Stinger used in a popular video game but not the real-life version. In this case the report summarized and disseminated evidence gathered by other communities, such as a Reddit group.8 We found that 9 percent of StopFake stories relied in this way on debunkings already performed by other groups. Proving that an event did not take place is inherently challenging, so stories sourced to eyewitness testimony were hard to evaluate with the resources available to StopFake. One exception came in July 2014 when Russian media, including Russia Today, disseminated one of the most notorious pieces of fake news of the entire campaign. Ukrainian forces had recently recaptured Sloviansk, the military headquarters of the “Donetsk People’s Republic” established by Igor Girkin, a Russian colonel.
In a lengthy interview, Galina Pyshniak tearfully described the crucifixion of a small boy in the town’s central square by the Ukrainian army, after which his mother was dragged through the streets behind a tank until she died. StopFake’s initial report focused on the lack of verification for the story (there were no photographs, despite the huge crowd allegedly gathered at gunpoint to watch), on its incommensurability with independent media reports showing a warm welcome for Ukrainian troops, and on its similarity to archetypal narratives, including a recent plotline on the television show Game of Thrones.9 Western journalists likewise covered this report as a fake, making it notorious as a distillation of the ludicrous excesses of Russian state news reporting. However, StopFake later challenged widely circulated Ukrainian claims that Pyshniak was an actress who had been identified playing different roles in Russian television reports on other cities. Framing its activities as neutral fact checking, rather than nationalistic counter propaganda, meant that StopFake was committed to correcting erroneous information in Ukrainian news sources as well as Russian ones.10 As one StopFake volunteer explained:

We refuted quite a few cases of erroneous reporting in the Ukrainian news outlets. We find the proof that the claim is wrong and contact the source … Fakes stand out, because usually they are propagated very quickly in multiple outlets and producers of the fake ignore our calls to recall the claim.

Such instances were rare, but our informants pointed to them with pride as proof of their willingness to put the service of truth over their immediate desire to bolster Ukraine’s legitimacy in its struggle.
This also increased their international credibility: Graves (2016a) noted that StopFake.org’s founders were questioned with regard to their bias at the Global Fact Checking Summit in London in July 2015, and asked whether they ever debunk falsifications from the Ukrainian side. We found that 9 percent of StopFake’s published reports challenged Ukrainian reports, though most of these involved Russian-generated fake news that spread via Ukrainian media.

Changes at StopFake

StopFake’s editorial standards and most frequently used kinds of evidence shifted over time. As Figure 3 shows, the proportion of fake stories debunked by finding the original source of misidentified images decreased over time, from 43 percent in the first three-month period analyzed to 30 percent in the final period analyzed. In contrast, the proportion of StopFake reports relying on finding the original source of quotations to show that they were misleadingly edited or contextualized rose from 16 to 60 percent over the same period. This was more labor intensive, and our StopFake informants suggested that it reflected increasing sophistication in the fake news they encountered. Early StopFake stories frequently relied on assurances from Ukrainian government agencies as evidence, but StopFake’s insistence on “facts” soon led it to discount these sources. We found statements from Ukrainian government officials referenced as evidence in 33 percent of StopFake reports during its first three-month period, dropping dramatically to 10 percent in the next three-month period as the group imposed tighter editorial standards. During our period of observation, StopFake frequently had to adapt its tactics in response to shifts in fake news production techniques. Some of the stories investigated
were hard to prove because they reported on fictitious people (e.g., the claim that a nonexistent freelance journalist had been killed in Ukraine)11 or because they relied on the testimony of actors using the names of real experts. StopFake’s success in spreading its stories via social media was challenged by a new practice applied against it as a counter-counter-propaganda measure. Facebook provides a “Report Post” button on shared material, so that users can flag items that breach site guidelines by causing offense or being spam. By reporting StopFake stories as offensive, the group’s enemies succeeded in having Facebook suspend the accounts of some people sharing them (Volchek 2015).

FIGURE 3
Change over time in the six most frequently used methods of classifying stories as fake. Horizontal axis shows time in months, from May 2014 to August 2015

Toward the end of our period of study we observed some significant institutional changes in the project, which have continued since then. Reliance on volunteer editors began to change in June 2015 when the project received its first grant support, for $38,000, from the National Endowment for Democracy (NED), established by the US Congress in 1983. In 2015, NED funded 67 grants in Ukraine, totaling around $3.4 million, to promote investigative journalism, media monitoring, government accountability, anti-corruption initiatives, and training and skill development for leaders and activists. This grant was to “maintain and expand the fact checking website Stopfake.org, transforming it into an information hub for journalists, bloggers and the general public … improve the site’s security, increase its outreach on social media, and produce video content for traditional and online TV audiences” (NED 2017). Prior to this, StopFake had few resources and hence almost no institutional existence.
Its volunteers had relied on assistance from the Kyiv-Mohyla School of Journalism and its Dean, including the video equipment and studio space used to create weekly news digest-style videos summarizing the most prominent fakes of the week. During the second half of 2015, two journalists were hired as staff members for StopFake, reducing its reliance on volunteer editors. Beginning in early October 2015, the newly established Ukraine Today, intended as a Ukrainian equivalent to English-language services such as France 24 or Deutsche Welle, carried StopFake video digests (UkraineToday 2015). The transition from volunteer-only to staff-managed operations is a vital one in the development of a successful non-profit organization, and it remains to be seen whether these institutional changes will lay the groundwork for long-term success.

Conclusions and Implications

The distinctive blend of media practices devised by StopFake turned media literacy and responsible journalism into tools of resistance against fake news. To embrace those principles is not to march in lockstep with Ukraine’s new government, which has been at best inconsistent in transcending the post-Soviet pattern of self-interested rule by corrupt elites, but it is most certainly to set oneself in opposition to contemporary Russian state-affiliated media practices. “Fake News,” a phrase that originally struck us as the awkward coinage of people speaking English as a second language, became suddenly familiar to American audiences following the unexpected victory of Donald Trump in the presidential election of November 2016. This followed what US intelligence agencies have officially concluded was a campaign waged by Russian intelligence on the orders of Putin to tip the election in his favor (Office of the Director of National Intelligence 2017).
One of the report’s three key judgements was that Moscow’s influence campaign followed a Russian messaging strategy that blends covert intelligence operations—such as cyber activity—with overt efforts by Russian Government agencies, state-funded media, third-party intermediaries, and paid social media users or “trolls.” (Office of the Director of National Intelligence 2017) The report focused particularly on the strategic role of inaccurate reporting in Russia Today. However, the presidential campaign also demonstrated that fake news is not the exclusive product of state-sponsored trolls. One report, in the New York Times, documented the process by which a major online news story, with hundreds of thousands of social media shares, ballooned from a tweet in which an obscure American with only 40 Twitter followers of his own incorrectly alleged that a fleet of busses visible in a picture had been used to carry anti-Trump protesters to Austin, Texas. Bloggers made no effort to conduct basic checks, such as calling the bus company, before running stories about paid protestors. Efforts by the original poster to correct his mistake, including a new tweet imposing the word “FALSE” over an image of his original message, failed to reach anything like the same audience (Maheshwari 2016). Other reports suggested that the partisan credulity of some American conservatives made pro-Trump propaganda a profitable global industry: a popular fringe political site, departed.co, turned out to have been set up 2080 MARIA HAIGH ET AL. by Beqa Latsabidze, a post-Soviet entrepreneur in Tbilisi, Georgia. He had initially set up websites and social media pages to pander to the supporters of various candidates. 
It was, he claimed, a surge in advertising revenue as Trump supporters shared his material via social media, rather than any ideological mission, that led him and several of his countrymen to inject vast quantities of pro-Trump fake news into the media feeds of American voters (Higgins, McIntire, and Dance 2016). The activities of StopFake show the power, and the constraints, of journalistic activism against a well-organized fake news campaign. The same internet tools and social media networks that made it easy for Russian activists and trolls to spread peer-to-peer propaganda and disrupt discussions on Western websites also empowered the volunteers of StopFake to build a collaborative community online. The success of StopFake in disseminating counter narratives relied on social media, such as Facebook and Twitter, to spread its work and bring it to the attention of Western journalists who might themselves lack the time, language skills, or specialist knowledge needed to definitively discount fake news before filing their stories. In turn, this generated coverage in conventional media to magnify its impact. Our goal here has been to document the work practices of StopFake rather than to assess their effectiveness. Further research would be needed to judge how successful it has been in countering peer-to-peer propaganda and how applicable its model might be in other contexts. It is not clear that this effort, or any other yet mounted, has been truly effective in sweeping back fake news. Despite its success in attracting a Russian readership, StopFake clearly could not match the reach of government-controlled and government-allied media outlets. Disrupting the spread of fake news is inherently more resource-intensive than creating it, and clinical rebuttals are less outrageous, and hence less likely to spread virally online, than shocking claims engineered without concern for facts.
StopFake’s experiences provide an instructive case for American journalists facing their own crisis of relevance in the face of fake news. While mainstream Ukrainian media has a weak tradition of independence and limited reach, the United States has an exceptionally rich media ecosystem of fact checkers and professional journalists. Fact checkers played an active role in disputing the many unsubstantiated claims made by the Trump campaign and, following his victory, the Trump Administration. Journalists trained to avoid taking sides in a dispute were forced to question the reflexive association of neutrality with objectivity. Trump’s blatant lack of concern for facts pushed some traditional media outlets such as the New York Times to shift fact checking from a separate activity into the body of the story, and in some cases into headlines such as “Meeting with Top Lawmakers, Trump Repeats an Election Lie” (Barry 2017). Yet these media responses did not seem effective during the campaign in changing the public’s level of belief in various bogus claims on topics such as crime rates, illegal immigration, and voter fraud. This has been widely attributed to the rise of partisan media (Mitchell et al. 2014). Traditional fact checking rests on the assumption that the public trusts journalists to evaluate claims impartially. Partisan polarization means that many Americans put little faith in government statistics, journalists, or experts to determine what is true and what is fake. A core narrative of conservative media such as Fox News and Brietbart.com has been that readers should not trust “mainstream” journalism. Instead of changing their opinions in response to facts, voters could expose themselves only to facts that fit their opinions. 
The rampant spread of fake news and fake news outlets took this phenomenon to new heights in 2016, letting Americans who so desired submerge themselves fully in a media landscape with little connection to traditional journalistic practices (Beck 2017). As we have stressed throughout this paper, the American model of fact checking is not well equipped to deal with fake news. Instead, it targets exaggeration by politicians, and assumes both the trustworthiness of the media that report those claims and the power of expert opinion and government statistics to rebut them. Thus, the avalanche of fake news reporting the Clinton campaign’s ties to Satanic child sex abuse was ignored by mainstream media until a man walked into a popular Washington, DC pizzeria and opened fire to free the children he believed were being held in its non-existent basement. “The intel on this wasn’t 100 percent,” he explained from jail (Goldman 2016).

Immediately after the election, conservative groups began to appropriate the term “fake news” and apply it to mainstream media (Oremus 2016). Donald Trump has repeatedly dismissed the centrist news channel CNN as “fake news” and refused to take questions from its reporters. Within days of its new prominence in American political discussion, the phrase “fake news” was already at risk of becoming just another partisan insult. Liberals call Breitbart.com fake news, while Trump calls the BBC fake news. Even the Russian Foreign Ministry has embraced the term, setting up a website in which stories from sources such as Bloomberg and the New York Times are depicted with a big red “FAKE” stamp on them, imitating the visuals of StopFake but not its methodical presentation of evidence (Kottasova 2017).
If nothing else, our exploration of the work practices of StopFake demonstrates that “fake news” can be reclaimed as a useful and epistemologically robust category when its identification is grounded in a careful editorial process built on media literacy techniques. All knowledge is socially constructed, but not all social processes produce the same kinds of truth claims. Fake news, as operationalized by StopFake, falls well outside the normal range of variation caused by journalistic bias and subjectivity. Journalists and scholars need to treat both sides in political or military conflicts fairly to do their jobs effectively, but neither can nor should aspire to neutrality in the battle of fake news against real journalistic practice.

ACKNOWLEDGEMENTS

We would like to thank our informants within StopFake, the anonymous reviewers, Lucas Graves for his comments on an earlier version of this article, Christine Evans for her insights into Soviet media, and Dean Tomas Lipinski for support given to this research by the University of Wisconsin-Milwaukee School of Information Studies.

DISCLOSURE STATEMENT

No potential conflict of interest was reported by the authors.

NOTES

1. Russian officials deny the existence of these trolls and investigators have been unable to trace the ownership of the front companies that employ them (Chen 2015). Distinguishing state from elite private interests is difficult in modern Russia, a condition succinctly captured in the titles of recent books describing it as a “kleptocracy” (Dawisha 2014) or a “mafia state” (Harding 2011), in which only enterprises allied with the ruling elite are allowed to stay in business while government officials earn huge sums as executives of nominally private businesses. Thus, the question of whether state accounts ultimately fund the trolls is less important than the observed reality that their messaging is closely coordinated with that of state-controlled broadcast and print media.

2.
Agency Record, October 13, 2012 (http://www.tribunahoje.com/noticia/42601/brasil/2012/10/13/skinheads-fas-de-hitler-sao-detidos-apos-briga-em-sp.html).
3. See http://www.stopfake.org/en/fake-photo-used-to-depict-almaty-scuffle/.
4. See http://www.stopfake.org/en/fake-bukovinian-romanians-demand-autonomy/.
5. See http://www.stopfake.org/en/how-to-identity-a-fake/.
6. See http://www.stopfake.org/en/fake-ukrainian-fighting-vehicle-entering-donetsk-with-a-swastika/.
7. See http://www.stopfake.org/en/fake-jewish-people-are-leaving-kiev-because-of-the-anti-semitism-of-the-new-government/.
8. See http://www.stopfake.org/en/lies-crucifixion-on-channel-one/ and http://www.stopfake.org/en/fake-american-missiles-found-in-luhansk/.
9. See http://www.stopfake.org/en/lies-crucifixion-on-channel-one/ and http://www.stopfake.org/en/the-crucifixion-of-a-3-year-old-the-u-s-helped-kiev-shoot-down-flight-17-and-other-tales-the-kremlin-media-tell/.
10. See http://www.stopfake.org/en/fake-the-infamous-heroine-of-the-slaviansk-boy-s-crucifixion-report-found-among-the-victims-of-the-explosion-in-donetsk/.
11. See http://www.stopfake.org/en/fake-ukrainian-media-reports-on-starvation-in-russia/.

REFERENCES

Allan, Stuart. 2006. Online News: Journalism and the Internet. New York: Open University Press.
AP. 2013. “Kremlin Helps Media Moguls Expand.” The Moscow Times, October 21. http://www.themoscowtimes.com/business/article/kremlin-helps-media-moguls-expand/488181.html.
Barghoorn, Frederick. 1964. Soviet Foreign Propaganda. Princeton: Princeton University Press.
Barry, Dan. 2017. “In a Swirl of ‘Untruths’ and ‘Falsehoods,’ Calling a Lie a Lie.” New York Times, January 25.
BBC. 2015. “Syria Crisis: Russian Airstrikes against Assad Enemies.” BBC.com, September 30. http://www.bbc.com/news/world-middle-east-34399164.
Beck, Julie. 2017. “This Article Won’t Change Your Mind.” The Atlantic, March 13. https://www.theatlantic.com/science/archive/2017/03/this-article-wont-change-your-mind/519093/.
Bloor, David. 1991. Knowledge and Social Imagery. Chicago: University of Chicago Press.
Boczkowski, Pablo. 2010. News at Work: Imitation in an Age of Information Abundance. Chicago: University of Chicago Press.
Bonch-Osmolovskaya, Tatiana. 2015. “Combating the Russian State Propaganda Machine: Strategies of Information Resistance.” Journal of Soviet and Post-Soviet Politics and Society 1 (1): 130–158.
Boyd-Barrett, Oliver. 2015. “Ukraine, Mainstream Media and Conflict Propaganda.” Journalism Studies: 1–19. doi:10.1080/1461670X.2015.1099461.
Capron, Alexandre. 2014. “The Ukrainian Female Suicide Bomber Who Never Was.” France 24, December 8. http://observers.france24.com/content/20141208-ukrainian-suicide-bomber-girl-fake?page=64.
Chen, Adrian. 2015. “The Agency.” New York Times, June 2. http://www.nytimes.com/2015/06/07/magazine/the-agency.html.
Chimbelu, Chiponda. 2014. “Fake News Can Ruin Lives, Says Stopfake.org Founder.” Deutsche Welle, June 5. http://www.dw.de/fake-news-can-ruin-lives-says-stopfakeorg-founder/a-17684358.
Cohen, Nick. 2014. “Russia Today: Why Western Cynics Lap Up Putin’s TV Poison.” The Guardian, November 8. http://www.theguardian.com/commentisfree/2014/nov/08/russia-today-western-cynics-lap-up-putins-tv-poison.
Cottiero, Christina, Katherine Kucharski, Eugenia Olimpieva, and Robert W. Orttung. 2015. “War of Words: The Impact of Russian State Television on the Russian Internet.” Nationalities Papers 43 (4): 533–555.
Creswell, John W. 1998. Qualitative Inquiry and Research Design: Choosing among Five Traditions. Thousand Oaks, CA: Sage.
Dawisha, Karen. 2014. Putin’s Kleptocracy: Who Owns Russia? New York: Simon & Schuster.
Ellul, Jacques. 1973. Propaganda: The Formation of Men’s Attitudes. Translated by Konrad Kellen and Jean Lerner. New York: Random House.
Feyerabend, Paul. 1975. Against Method: Outline of an Anarchistic Theory of Knowledge. London: NLB; Atlantic Highlands, NJ: Humanities Press.
Fredheim, Rolf. 2015. “Filtering Foreign Media Content: How Russian News Agencies Repurpose Western News Reporting.” Journal of Soviet and Post-Soviet Politics and Society 1 (1): 36–79.
Gans, Herbert J. 2004. Deciding What’s News: A Study of CBS Evening News, NBC Nightly News, Newsweek, and Time. 25th Anniversary Edition. Evanston: Northwestern University Press.
Gaufman, Elizaveta. 2015. “Memory, Media, and Securitization: Russian Media Framing of the Ukrainian Crisis.” Journal of Soviet and Post-Soviet Politics and Society 1 (1): 108–129.
Goldman, Adam. 2016. “The Comet Ping Pong Gunman Answers Our Reporter’s Questions.” New York Times. http://www.nytimes.com/2016/12/07/us/edgar-welch-comet-pizza-fake-news.html.
Graves, Lucas. 2016a. “Boundaries Not Drawn.” Journalism Studies: 1–19. doi:10.1080/1461670X.2016.1196602.
Graves, Lucas. 2016b. Deciding What’s True: The Rise of Political Fact-Checking in American Journalism. New York: Columbia University Press.
Graves, Lucas. 2016c. “Anatomy of a Fact Check: Objective Practice and the Contested Epistemology of Fact Checking.” Communication, Culture & Critique. doi:10.1111/cccr.12163.
Graves, Lucas, and Magda Konieczna. 2015. “Sharing the News: Journalistic Collaboration as Field Repair.” International Journal of Communication 9: 1966–1984.
Hall, Alex. 1976. “The War of Words: Anti-Socialist Offensives and Counter-Propaganda in Wilhelmine Germany 1890–1914.” Journal of Contemporary History 11 (2/3): 11–42.
Harding, Luke. 2011. Mafia State: Spies, Surveillance, and Russia’s Secret Wars. London: Guardian Books.
Haynes, David D. 2015. “Fearless Fact-Checkers Keep Tabs on Vladimir Putin’s Propaganda.” Milwaukee Journal-Sentinel, April 2. http://www.jsonline.com/news/opinion/fearless-journalists-check-the-facts-in-vladimir-putins-propaganda-b99474046z1-298511031.html.
Herf, Jeffrey. 2006. The Jewish Enemy: Nazi Propaganda During World War II and the Holocaust. Cambridge: Belknap Press of Harvard University Press.
Herman, Edward, and Noam Chomsky. 1988. Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon Books.
Higgins, Andrew, Mike McIntire, and Gabriel J. X. Dance. 2016. “Inside a Fake News Sausage Factory: ‘This Is All About Income.’” New York Times, November 25.
Hoskins, Andrew, and Ben O’Loughlin. 2010. War and Media. Malden, MA: Polity.
Isachenkov, Vladimir. 2017. “Russia Announces New Branch of Military to Focus on Information Warfare Amid Hacking Allegations.” The Independent, February 22. http://www.independent.co.uk/news/world/europe/russia-military-information-warfare-hacking-allegations-a7594456.html.
Kareklas, Ioannis, Darrel D. Muehling, and T. J. Weber. 2015. “Reexamining Health Messages in the Digital Age: A Fresh Look at Source Credibility Effects.” Journal of Advertising 44 (2): 88–104.
Kenez, Peter. 1985. The Birth of the Propaganda State: Soviet Methods of Mass Mobilization, 1917–1929. Cambridge: Cambridge University Press.
Khaldarova, Irina, and Mervi Pantti. 2016. “Fake News.” Journalism Practice 10 (7): 891–901.
Kottasova, Ivana. 2017. “Russia Accuses Western Media of Spreading ‘Fake News.’” CNN.com, February 22. http://money.cnn.com/2017/02/22/media/russia-government-fake-news/.
Latour, Bruno. 1988. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press.
Latour, Bruno, and Steve Woolgar. 1979. Laboratory Life: The Construction of Scientific Facts. Princeton: Princeton University Press.
Maheshwari, Sapna. 2016. “How Fake News Goes Viral: A Case Study.” New York Times. http://www.nytimes.com/2016/11/20/business/media/how-fake-news-spreads.html.
Markham, Annette N. 2003. “Critical Junctures and Ethical Choices in Internet Ethnography.” In Applied Ethics in Internet Research, edited by M. Thorseth, 51–66. Trondheim: NTNU University Press.
McChesney, Robert W. 2012. “Farewell to Journalism? Time for a Rethinking.” Journalism Studies 13 (5–6): 682–694.
MediaLiteracy. 2015. MediaLiteracy for Citizens: Training Project Announcement. http://www.medialiteracy.org.ua/index.php/17-novyny/450-proekt-prohrama-mediahramotnosti-dlia-hromadian-oholoshuie-konkurs-na-uchast-v-treninhu-dlia-pidhotovky-treneriv-v-14-oblastiakh-ukrainy.html.
Mejias, Ulises A., and Nikolai E. Vokuev. 2017. “Disinformation and the Media: The Case of Russia and Ukraine.” Media, Culture & Society. doi:10.1177/0163443716686672.
Mickiewicz, Ellen. 2008. Television, Power, and the Public in Russia. New York: Cambridge University Press.
Mikkelson, Barbara. 2013. “The Origin of AIDS.” Snopes.com.
Mitchell, Amy, et al. 2014. “Political Polarization and Media Habits.” Pew Research Center. http://www.journalism.org/2014/10/21/political-polarization-media-habits/.
Mitrokhin, Nikolay. 2015. “Infiltration, Instruction, Invasion: Russia’s War in the Donbass.” Journal of Soviet and Post-Soviet Politics and Society 1 (1): 219–249.
National Endowment for Democracy. 2017. Ukraine 2015. http://www.ned.org/region/central-and-eastern-europe/ukraine-2015/.
Nemtsov, Boris. 2015. Putin. War: An Independent Expert Report. Moscow: Free Russia Foundation.
Nygren, Gunnar, Michal Glowacki, Jöran Hök, Ilya Kiria, Dariya Orlova, and Daria Taradai. 2016. “Journalism in the Crossfire: Media Coverage of the War in Ukraine 2014.” Journalism Studies. doi:10.1080/1461670X.2016.1251332.
Office of the Director of National Intelligence. 2017. Assessing Russian Activities and Intentions in Recent U.S. Elections. https://www.dni.gov/files/documents/ICA_2017_01.pdf.
Oremus, Will. 2016. “Stop Calling Everything ‘Fake News.’” Slate.com. http://www.slate.com/articles/technology/technology/2016/12/stop_calling_everything_fake_news.html.
Petrov, Nikolay, Marina Lipman, and Henry E. Hale. 2014. “Three Dilemmas of Hybrid Regime Governance: Russia from Putin to Putin.” Post-Soviet Affairs 30 (1): 1–26.
Phillips, Angela. 2010. “Transparency and the New Ethics of Journalism.” Journalism Practice 4 (3): 373–382.
Pipes, Richard. 1995. Russia Under the Bolshevik Regime. New York: Random House.
Pomerantsev, Peter. 2013. Russia: A Postmodern Dictatorship? Global Transitions Lecture Series. http://www.li.com/docs/default-source/publications/pomeransev1_russia_imr_web_final.pdf?sfvrsn=4.
Pomerantsev, Peter. 2014. “Russia and the Menace of Unreality: How Vladimir Putin Is Revolutionizing Information Warfare.” The Atlantic. http://www.theatlantic.com/international/archive/2014/09/russia-putin-revolutionizing-information-warfare/379880/.
Pomerantsev, Peter. 2015. “Inside the Kremlin’s Hall of Mirrors.” The Guardian. http://www.theguardian.com/news/2015/apr/09/kremlin-hall-of-mirrors-military-information-psychology.
Popper, Karl. 2013. The Open Society and Its Enemies. Princeton: Princeton University Press.
Reese, Stephen D., Lou Rutigliano, Kideuk Hyun, and Jaekwan Jeong. 2007. “Mapping the Blogosphere: Professional and Citizen-Based Media in the Global News Arena.” Journalism 8 (3): 254–280.
Risso, Linda. 2007. “‘Enlightening Public Opinion’: A Study of NATO’s Information Policies between 1949 and 1959 Based on Recently Declassified Documents.” Cold War History 7 (1): 45–74.
Roudakova, Natalia. 2017. Losing Pravda: Ethics and the Press in Post-Truth Russia. New York: Cambridge University Press.
RT.com. 2014. “Putin Acknowledges Russian Military Servicemen Were in Crimea.” RT.com, April 17. https://www.rt.com/news/crimea-defense-russian-soldiers-108/.
RT.com. 2015. “Putin: Ukraine Army Is NATO Legion Aimed at Restraining Russia.” RT.com, January 26. http://rt.com/news/226319-putin-nato-russia-ukraine/.
Schudson, Michael. 2001. “The Objectivity Norm in American Journalism.” Journalism 2 (2): 149–170.
Schudson, Michael. 2003. The Sociology of News. New York: W. W. Norton & Company.
Scott, Mark. 2014. “Mail.ru Takes Full Ownership of VKontakte, Russia’s Largest Social Network.” New York Times, September 16. http://dealbook.nytimes.com/2014/09/16/mail-ru-takes-full-ownership-of-vkontakte-russias-largest-social-network/?_r=1.
Seddon, Max. 2014. “Documents Show How Russia’s Troll Army Hit America.” BuzzFeed News, June 2. http://www.buzzfeed.com/maxseddon/documents-show-how-russias-troll-army-hit-america#.pueWADg7m.
Shapin, Steven. 1994. A Social History of Truth: Civility and Science in Seventeenth-Century England. Chicago: University of Chicago Press.
Shevchenko, Vitaly. 2014. “‘Little Green Men’ or ‘Russian Invaders’?” BBC.com, March 11. http://www.bbc.com/news/world-europe-26532154.
Snyder, Timothy. 2014. “Ukrainian Crisis Is Not about Ukraine, It’s about Europe.” StopFake.org, November 17. http://www.stopfake.org/en/historian-timothy-snyder-ukrainian-crisis-is-not-about-ukraine-it-s-about-europe/.
Society of Professional Journalists. 2014. Society of Professional Journalists Code of Ethics. http://www.spj.org/ethicscode.asp.
Sputnik International. 2015. “About Us.” Sputniknews.com. http://sputniknews.com/docs/about/.
Stanford History Education Group. 2016. Evaluating Information: The Cornerstone of Civic Online Reasoning (Executive Summary). https://sheg.stanford.edu/upload/V3LessonPlans/Executive%20Summary%2011.21.16.pdf.
Stencel, Mark. 2016. Global Fact-Checking up 50% in Past Year. Duke Reporters’ Lab.
Szostek, Joanna. 2014. “Russia and the News Media in Ukraine: A Case of ‘Soft Power’?” East European Politics and Societies and Cultures 28 (3): 463–486.
Thomas, Timothy. 2014. “Russia’s Information Warfare Strategy: Can the Nation Cope in Future Conflicts?” The Journal of Slavic Military Studies 27 (March): 101–130.
Toler, Aric. 2015. “What You Need to Know about Russian Social Networks to Conduct Open-Source Research.” Globalvoices.org, October 21. https://globalvoices.org/2015/10/21/what-you-need-to-know-about-russian-social-networks-to-conduct-open-source-research/.
Tomkiw, Lydia. 2014. “A Ukrainian Fact-Checking Site Is Trying to Spot Fake Photos in Social Media—And Building Audience.” NiemanLab, June 2. http://www.niemanlab.org/2014/06/a-ukrainian-factchecking-site-is-trying-to-spot-fake-photos-in-social-media-and-building-audience/.
UkraineToday (Producer). 2015. “Ukraine Today and Project ‘Stopfake’ Join Efforts to Expose False Media Reports.” Ukraine Today, October 24. http://uatoday.tv/society/ukraine-today-and-project-stopfake-join-efforts-to-expose-false-media-reports-514123.html.
Van der Schueren, Yannick. 2015. “Les médias dans l’arsenal du Kremlin” [The Media in the Kremlin’s Arsenal]. Tribune de Genève, February 4. http://www.tdg.ch/monde/europe/medias-arsenal-kremlin/story/27564217.
Volchek, Dmitry. 2015. “Activists Ask Facebook for Protection against Pro-Kremlin Attacks.” Radio Free Europe/Radio Liberty. http://www.rferl.org/content/activists-facebook-for-protection-against-pro-kremlin-attacks/27075427.html.
Wolfe, Thomas C. 2005. Governing Soviet Journalism: The Press and the Socialist Person After Stalin. Bloomington: Indiana University Press.
Zavadski, Katie. 2015. “Putin’s Propaganda TV Lies about Its Popularity.” The Daily Beast, September 17. http://www.thedailybeast.com/articles/2015/09/17/putin-s-propaganda-tv-lies-about-ratings.html.

Maria Haigh (author to whom correspondence should be addressed), School of Information Studies, University of Wisconsin-Milwaukee, USA, and School of Information and Media, Siegen University, Germany. E-mail: mhaigh@uwm.edu
Thomas Haigh, History Department, University of Wisconsin-Milwaukee, USA, and School of Information and Media, Siegen University, Germany. E-mail: thomas.haigh@gmail.com
Nadine I. Kozak, School of Information Studies, University of Wisconsin-Milwaukee, USA.
E-mail: kozakn@uwm.edu