CHAPTER 1

Explaining the Challenge: From Persuasion to Relativisation

Miloš Gregor and Petra Mlejnková

1.1 Introduction

A chapter describing the history of phenomena such as propaganda and disinformation could begin with the obligatory statement 'even the ancient Greeks' or 'the problem is as old as mankind itself'. Moreover, it would be correct. The truth is that information has always been crucial in politics and warfare, and those who possessed it and could manipulate it were able to confuse the enemy, the adversary, or the public. As O'Shaughnessy states, there was no golden age of truth (O'Shaughnessy 2020, 55). Information and its manipulation, therefore, have always been present in our lives.

Propaganda and disinformation have changed along with every new technology used for information and intelligence. With the development of the media and information environment, the possibilities of propaganda and disinformation have also changed (see Chapter 2). Rarely, or slowly, however, do the goals of propaganda change; they are by and large an effort to influence, manipulate, and persuade a target audience. The tools that propaganda deploys have undergone much more dynamic development: from word of mouth, through leaflets and posters, to modern communication technologies. The latest significant shift has been the advent of the Internet and of social media in particular.

While it may have seemed that propaganda was in hibernation after two world wars and the ensuing Cold War, the events of recent years have refuted this. At the beginning of the twenty-first century, propaganda was mainly associated with nondemocratic regimes geographically far from the Western (democratic) world, such as North Korea or China. Democratic regimes felt far from attempts at manipulation, far from threats of exposure to external political influence through disinformation campaigns, and far from attempts at meddling in the internal affairs of sovereign regimes and their societies. Exceptions and controversial cases can be spotted, and various conflicts can serve as good examples; however, these strategies and techniques were seen as far removed from the values represented by Western liberal democracies. Yet, no later than 2014, exactly ten years after Facebook came into being, the Western world began to struggle with massive propaganda and disinformation campaigns targeting societies across the world. Since that time, we have been able to trace with precision the connection between propaganda and disinformation campaigns and the conflict in Ukraine. Nevertheless, it is not only state or state-sponsored actors actively running such campaigns; extremist and terrorist organisations such as Daesh also build, in a very professional way, propaganda designed to threaten, or to radicalise and recruit, individuals.
Since then, the whole issue of propaganda and disinformation has faced a fate similar to that of many other scholarly terms and concepts which have become more widely used by politicians, the media, and the general public, with the result that individual terms are misunderstood and used in the wrong way or the wrong context. A clear example is the modern and catchy phrase fake news, which is now used by some politicians as a label for media asking unpleasant questions or drawing attention to politicians' past scandals.

The purpose of this chapter, therefore, is to describe the concepts of propaganda and disinformation, their development, and how they relate to each other. Other related concepts, such as fake news, misinformation, disinformation campaigns, psychological and information operations, and influence operations, will be introduced and put into the context of the aim of this edited volume. In the last part of the chapter, changes in society, the media, and the information environment will be introduced: what the Internet and social media especially have brought to propaganda and disinformation, and how both the understanding and the functioning of propaganda and disinformation have changed in the so-called post-truth era. Last but not least, the dynamic development connected to the COVID-19 pandemic will be mentioned. It is no surprise that this emotionally tense period has brought about a new level of disinformation and public distrust.

1.2 Propaganda

One of the most problematic aspects of propaganda is the anchoring of its definition. A century ago, Lasswell defined propaganda as the management of collective attitudes through the manipulation of significant symbols (Lasswell 1927, 627). According to Ellul, propaganda is a process aiming to provoke action, to make the individual cling irrationally to a process of action. It does not lead to a choice, but to a loosening of the reflexes, to an arousal of an active and mythical belief (Ellul 1973, 25). A minimalistic definition says that propaganda is a deliberate attempt to persuade people to think and behave in a desired way (Taylor 2003, 6). It encompasses conscious, methodological, and planned decisions to employ techniques of persuasion designed to achieve specific goals intended to benefit those organising the process. The methods employed vary according to the communication channels available: speeches, songs, art, radio, television, or the Internet.

Propaganda uses communication to convey a message, an idea, or an ideology which is designed primarily to serve the self-interests of the person or people doing the communicating. An essential component of propaganda is information in the form of a lie or at least not the whole truth. This information is designed to serve the primary interests of the propagandist, invoking a common pattern of one-way, manipulative communication that solicits the action, or inaction, of the masses (Baines et al. 2020, xxv). Concealment of inconvenient facts and censorship are inseparable counterparts to propaganda (Taylor 2003, 14).

The understanding of propaganda differs across many historical, cultural, ideological, and political contexts.
The first use of the notion of propaganda dates back to 1622 and the establishment of the Sacred Congregation for the Propagation of the Faith (originally Congregatio de Propaganda Fide in Latin), through which Roman Catholic cardinals managed their missionary activities (Guilday 1921). In the twentieth century, political regimes began to use the term more actively. Dictatorial regimes have never fought shy of the word 'propaganda' when labelling themselves. During World War II, and even before, Nazi Germany had its Ministry of Popular Enlightenment and Propaganda, and the Soviet Union had its Propaganda Committee of the Communist Party. Marxism-Leninism referred to propaganda as the rational use of historical and scientific arguments to indoctrinate the educated and enlightened. In both countries, propaganda was perceived positively, as an inevitable element leading to political victory (Smith 2019). Similar examples can be found in China (Central Propaganda Department) or Vietnam (Central Propaganda Department of the Communist Party of Vietnam). In contrast, democratic countries used the word 'propaganda' to label the communication activities of their opponents. Their own departments avoided the propagandistic label; the United Kingdom established the Ministry of Information, and the United States had the Office of War Information.

The aim of propaganda is to shape people's worldview, creating desired group, class, and society-wide role models as well as consciousness. It represents the institutionalised dissemination of systematically arranged ideas, theories, opinions, doctrines, or whole ideologies. Propaganda may or may not ask for belief; it usually does not employ a rational appeal. As Baines, O'Shaughnessy, and Snow remind us, 'It does not seek credibility based on the provision of accurate information … the genre is almost exclusively defined by its emotive content and rejection of non-emotive forms of persuasion' (Baines et al. 2020, xxv). Propaganda is distinguished from other forms of communication by its intention and by its emphasis on manipulation of the recipient. Its deliberate selectivity and manipulation of information also distinguish propaganda from the educational process (Smith 2019). However, the boundaries between education and propaganda can be perceived differently, both from the perspective of information and ideas and from the point of view of the recipient, as was shown in the Soviet Union's understanding of the term.

All propaganda is inherently built on a trinity of concepts: mythology, symbolism, and rhetoric (O'Shaughnessy 2004). Effective use of these foundational elements differentiates successful persuasion from unsuccessful efforts to influence the masses. Mythology represents the core of any piece of propaganda. Propagandists usually exploit history and tradition as well as national characteristics as a source. A myth is a story, a narrative rather than an ideology, an idea, or a principle. These myths usually tell a story which clearly identifies the heroes and villains, with good and bad behaviour patterns illustrated via publicly known cases. They are not told in sophisticated language; on the contrary, they use easy-to-recognise and easy-to-identify symbols, or at least their essence or meaning can be transmitted through a mere sign (O'Shaughnessy 2016, 140).
Myth can often be accompanied by fiction, which is to the propagandist something other than a mere lie: it represents a more profound form of reality, an alternative narrative which embodies the sought-after ideal state of affairs (O'Shaughnessy 2020, 66). Symbols are essential not solely to mythology but to effective communication and persuasion in general, including propaganda. This springs from the fact that successful mass communication has an inherently emotive nature, and symbols are crucial connections to emotions. Therefore, the presentation of symbols can stand in place of argument. Anything can become politically symbolic if the right context is found (O'Shaughnessy 2016, 216). The last part of the triad is rhetoric. In a classic example, George Orwell demonstrated the strength of language in his masterpiece, 1984. In his novel, the author illustrates, among the other tools of the totalitarian absorption of power and control over an individual's life, that control of the people's mind lies in the control of language (Orwell 1949). Understanding the motives and mechanisms of the group mind is the key to controlling and managing the crowds, and the tool of rhetoric is the correct use of language (Bernays 1947). Rhetoric is built on explicit or implicit framing which provides the recipient with simplistic clues and a clear path to understanding the matter 'correctly' (O'Shaughnessy 2016, 256).

Whereas propaganda in the twentieth century could afford an explicit lie because there were only limited ways to verify information, the situation changed with the advent of the Internet. In general, a central task of the propagandist is to gain trust, create credibility, and persuade the audience to trust the source. Today, explicit lies can often be easily verified; our access to many information sources has radically reduced the effectiveness of lies and their impact on society, at least in cases where we are willing to find the correct information (which is discussed later on in this chapter). When the audience learns that part of a message was wrong, the source's credibility suffers. And vice versa: if a fact-checked part of a message is correct, the effect is the opposite, and the audience is more likely to believe the whole message and to trust the source again in the future. The propagandists' strategy today is therefore usually not telling lies but rather selecting the truth and mixing it with manipulative content. The final mixture still contains lies, but who cares about the lies all around when the message is based on a real story?

A different way of processing the truth is to create an atmosphere of uncertainty by feeding feelings that 'everybody lies' and 'the truth is not relevant anymore' because 'nothing is as it seems'. This means propagandists do not convince the audience of one particular 'truth'. Instead, they use selected manipulative techniques and even lies to raise doubts about the credibility of a target audience's information sources or to induce apathy by overwhelming the audience with conflicting information. This is a strategy applied by authoritarian regimes. With the development of modern technologies, however, it has become more difficult for these regimes to control the flow of information. Instead, they produce an incredible volume of information, making the information environment overwhelming to navigate. This produces 'data smog', in which useful information becomes hard to find (Shenk 1997).
The strategy of creating 'data smog' or 'information noise', flooding the information environment with plenty of alternative and conflicting stories and information (some truthful, some not; some verifiable, some not; relevant or irrelevant), creates the illusion that nothing is as it seems and that it is difficult to search for the truth. In that case, nobody really cares about being caught telling lies, because the propagandist's primary aim is not to build their own credibility or a coherent ideology as much as it is to persuade the audience that its enemies lie. Information is, after all, so weakened by misleading ballast that it no longer has any value. An apathetic population is then the easiest for authoritarian governments to control.

The character and possibilities of propaganda are directly influenced by the repertoire of means and tools which can be used for dissemination. During the reign of Pope Gregory XV, when the Sacred Congregation for the Propagation of the Faith was founded, the main instrument of communication was word of mouth. Literature was the prerogative of the narrowest layers of society. By the second half of the nineteenth century, however, the mass printing of newspapers had begun and would become one of the tools in the arsenal of propagandists. A similar shift was seen with the advent of cinema, radio, and television among the general public. It is not an accident that the golden era of propaganda is associated with the world wars of the first half of the twentieth century, when most of the abovementioned media underwent a massive boom. A similar boom accompanied the onset of the Internet and of social media especially. These are not solely exploited for propaganda purposes, now termed digital propaganda, but also for the repression and oppression of citizens in the form of blackmail, harassment, and coercion of those opposing the regime (Pearce 2015).

The possibilities of the current information environment have given people almost unlimited access to information, which makes censorship more difficult. It has also given rise to the new phenomena of online journalism (Bor 2014) and citizen journalism (Atton 2009; Goode 2009; Kaufhold et al. 2010; Lewis et al. 2010). These types of journalism represent a big challenge for us because they increase the amount of information, albeit sometimes of disputable quality and accuracy, in the information environment. For the recipients of news, this democratisation of journalism, as well as the abundance of information and its sources, is ambiguous. These developments also empower digital propaganda, which uses the massive peer-to-peer replication of ideas and the audience's active participation in the spread of propaganda. This decreases the need for centralised structures to maintain or accelerate the propagation of particular ideas. Propagandists may enjoy the advantage of the decentralised structure of social media by orchestrating campaigns with broad impact while the source remains difficult to identify (Farkas and Neumayer 2018; Sparkes-Vian 2018). The use of trolls and bots for astroturfing, the simulation of public support, is one of the consequences of the development of social media technologies. This specific aspect is discussed further in Chapter 2.

1.2.1 Manipulation Is the Backbone of Propaganda

As was already implied, the concept of manipulation is crucial for propaganda.
All propaganda is manipulative, and it would be nonsense to speak of non-manipulative propaganda (O'Shaughnessy 2004, 18). Manipulation can acquire different meanings depending on the sector and context in which it is applied. What its various meanings share are the broad and somewhat blurred semantic field of the term, the negative intention of the speaker, and the covert character of the influence (Akopova 2013). Manipulation can take many different forms and can be applied through the use of manipulative techniques. Propaganda manipulation is often concerned with information, information systems, and the networks people use. Propaganda also manipulates individuals by working on their consciousness. Data can be manipulated by changing their meaning or content. This kind of manipulation is the least noticeable and, therefore, the most difficult to detect.

There are dozens of manipulative techniques of propaganda and persuasion (see Pratkanis and Aronson 2001; Bernays 2004; Shabo 2008). To mention at least some of those most often described by scholars and used by propagandists, we can name the appeal to fear, blaming, demonisation, fabrication, labelling or name-calling, relativisation, and the use of manipulative pictures or videos.

The appeal to fear benefits from the fact that emotions are the backbone of propaganda; fear belongs among the most common emotions exploited by propagandists (Baines et al. 2020). Several underlying fears frequently present in propaganda can be identified: the fear of rejection, powerlessness, and, most significantly, the fear of death (Shabo 2008). The appeal to fear exploits audiences' worry about the unknown or their bad experiences with the target group or its principles. These fears are among the most potent motivations behind people's behaviour and attitudes. Blaming, as a manipulative technique, pinpoints an enemy as responsible for an event or situation. Propagandists often oversimplify complex problems by pointing out a single cause or a single enemy who can be blamed (even if not responsible at all). For everything from unemployment to natural disasters, blaming the enemy can help the propagandist achieve his or her agenda (Shabo 2008). Demonisation is used to dehumanise the opponent. It usually employs tools similar to labelling, though in a more straightforward way. The aim is to picture the opponent as an enemy who does not merely hold a different point of view but is not even human. Such a technique is frequently used in armed conflicts in order to remove the inhibitions of soldiers who are meant to kill the adversary's soldiers (men killing men). Depicting the enemy in a dehumanised way, equating humans with animals, makes the situation psychologically easier. Fabrications are false information presented as true statements (Syed 2012). They usually take the form of misleading or downright false information presented as verified information. Labelling (or name-calling) is the use of negative words to disparage an enemy or an opposing view. Labelling can take many different forms depending on the circumstances, but all of them, rather than making a legitimate argument, attack the opposition on a personal level. They also often appeal to the audience's preconceptions and prejudices (Domatob 1985; Shabo 2008). Relativisation serves to weaken either the opponent's merits or the damage caused by a preferred actor.
It is usually used to pacify emotions when (from the propagandist's point of view) something bad is happening. It either contains explicit criticism of the opponent or trivialises the problem. Manipulative videos and pictures represent one of the most apparent manipulative techniques here. In the context of this edited volume, we consider a video or picture manipulative if it shifts the audience's perception of the subject, or if it presents a collage or otherwise modified media.

A common denominator of most manipulative techniques is emotion. Emotions represent a crucial part of human existence, and they are an important factor in the different processes through which every individual goes. In 2010, the neuroscientist Antonio Damasio used his research to confirm a major psychological theory: in decision-making, emotions play a very important role and are perhaps even more decisive than logic. The part of the brain responsible for the formation of emotions is 'to blame'; without it, we would not be able to make even simple decisions (Damasio 2010). The importance of emotions can be further demonstrated by prejudice: our associations awaken different emotions, and our strong social attitudes are emotionally supported. Frequently, manipulative techniques target fear and feelings of uncertainty. The reaction of some political actors to the 2015 European migration crisis was an excellent example of manipulation that fed people's (natural) fears, for example, by creating connections between immigrants and terrorists or criminals. When subjectively feeling unsafe and unstable, an individual is more susceptible to manipulation, as the need for personal safety belongs among basic personal needs.

Propagandists do not produce deceptive messages only through linguistic manipulation with an emotional appeal or accent; they also manipulate the information as a whole or the context in which the information is presented. We can distinguish among four primary methods: biasing published information, presenting information ambiguously, manipulating the amount of information published, and presenting information irrelevant to the discourse (McCornack 1992, 4). With the developments described above, with the shift from persuasion to relativisation, we can observe a shift from the first two forms of manipulation to the latter two: manipulation of the amount of information and the presentation of irrelevant information.

1.3 Disinformation

In the last decade, the term disinformation has likely been mentioned even more often than propaganda. Disinformation can be perceived as a part of the larger conceptual realm of propaganda. Propaganda deploys lies, but not always and not necessarily. Therefore, we can say not all propaganda is disinformation, but all disinformation is propaganda (O'Shaughnessy 2020, 55). The intense debate about the dangers of spreading disinformation and propaganda was revived in 2014 in the context of the Russian Army's occupation of the eastern part of Ukraine and the annexation of Crimea. However, like propaganda, disinformation is not a new phenomenon. For decades, the concept of disinformation had been related exclusively to intelligence activities around the world. A classic example of a disinformation campaign in this regard is Operation INFEKTION, a disinformation campaign run by the KGB, the Soviet intelligence service, in the 1980s. The campaign was designed to undermine the credibility of the United States and foster anti-Americanism.
The campaign planted the idea that the United States had created the HIV virus in the laboratories of the Department of Defense as a biological weapon. The term disinformation, however, only began to penetrate the media lexicon and reach the general public in the 1980s. In February 1992, David C. Berliner presented a paper dedicated to educational reforms in the 'era of disinformation' (Berliner 1992).

The aim of disinformation is not necessarily to persuade, to make people change their minds. According to its purpose, we can distinguish among four categories of disinformation (O'Shaughnessy 2020, 58–59):

1. Acquiescence, not belief: The purpose of disinformation is to create, sustain, and amplify divisions within a rival political party, a government, or a coalition. In this sense, disinformation is a strategy of political control.

2. Sow division: Disinformation serves as a national strategic tool with the aim of sabotaging international consensus, a weapon against a hostile nation or coalition. It therefore becomes a method of leveraging advantage in international relations.

3. Sow confusion: The aim is to perplex; everything and nothing is believable anymore, which is the precondition for political paralysis. Lying is a strategy. The object is not to create belief but to spread confusion.

4. Raise doubt: The spread of doubt is a very effective genre of disinformation since credible phenomena can seldom be proven absolutely. There is always the possibility of doubt.

Disinformation includes a broad scale of false, fake, inaccurate, or misleading information or messages which are generated, presented, and disseminated with the aim of confusing the recipient of the information or causing public damage. Disinformation is based on three necessary characteristics (Fallis 2015), and we can speak of disinformation only if all of them are fulfilled. First of all, disinformation is information: it represents some part of the world as being a certain way. Although some authors consider only what is true to be information (Dretske 1983), we do not have to enter a philosophical discussion on the meaning of information, since we can accept the view that information provides us with content that is either true or false (see Chapter 6). The fact is, we perceive information first and consider or evaluate it subsequently, and only in some cases. Second, disinformation is misleading information, which means it creates false beliefs. Third, disinformation is not accidental in its misleading nature; there is an intentional attempt to provide an audience with information which creates false beliefs. The intention to deliberately disseminate false information is crucial to this characteristic because it differentiates disinformation from other forms of inaccurate messages, such as misinformation. The intention is also reflected in the term disinformation campaign, which refers to the systematic usage of misleading information and the series of acts leading to desired goals. A disinformation campaign is the focused, controlled, and coordinated dissemination of disinformation in order to influence the opponent's decision-making process or to achieve political, economic, or other advantages (Kragh and Åsberg 2017). It means the systematic and deliberate manipulation of an audience, regardless of whether the audience is represented by the general public or a specific segment of the population, such as politicians.
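Because all three of Fallis's characteristics must hold at once, the definition amounts to a simple conjunctive test. The sketch below is a purely illustrative toy formalisation of that test (the field and function names are our own, not Fallis's), showing why an outdated but honest report counts as misinformation rather than disinformation:

```python
from dataclasses import dataclass

@dataclass
class Message:
    # Properties corresponding to Fallis's (2015) three conditions;
    # these field names are illustrative assumptions, not his notation.
    is_information: bool     # represents some part of the world as being a certain way
    is_misleading: bool      # apt to create false beliefs in the audience
    intent_to_mislead: bool  # the misleading character is intentional, not accidental

def is_disinformation(msg: Message) -> bool:
    # We can speak of disinformation only if ALL three features are fulfilled.
    return msg.is_information and msg.is_misleading and msg.intent_to_mislead

# An outdated league standing misleads, but nobody intended it to:
# misinformation, not disinformation.
stale_news = Message(is_information=True, is_misleading=True, intent_to_mislead=False)
# A deliberately planted false story fulfils all three conditions.
planted_story = Message(is_information=True, is_misleading=True, intent_to_mislead=True)

assert not is_disinformation(stale_news)
assert is_disinformation(planted_story)
```

The conjunction makes the boundary cases in the next subsection easier to follow: dropping any one condition admits accidental falsehoods, jokes, or truthful statements into the category and empties the term of meaning.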
1.3.1 What Disinformation Is and What It Is Not

In discussions among politicians or in the general public, we can encounter overuse of the term disinformation. In addition to cases which fulfil the characteristics described above (intentionally misleading information), other cases are sometimes referred to as disinformation as well. Not only do politicians use the term disinformation as a label serving to discredit presented information, an argument, or even the bearer of information, but marking a statement or argument as disinformation has also become a heated point of discussion in and of itself.

There is much information which does not carry true content and which can be misleading or even false. These cases, however, do not fulfil the intention to mislead; they do not have this function. An example could be information which used to be true but is not anymore (e.g. which team is leading the National Football League). We can, however, distinguish between two main types of disinformation which do have this function. The first makes the intention to mislead the goal itself; it tries to change the audience's mind or behaviour. The second uses the misleading message not as a goal but to benefit the author in a different way, for example, through financial gain. In such cases, changing the mind or behaviour of the audience is not the goal of the disinformation but a means to the benefit resulting from this change. In health care, for instance, disinformation can be used to convince the audience of the harmfulness of drugs in order to make people use the products the author wishes them to.

Categories falling within the scope of disinformation include, among others, lies, audio-visual messages, and messages produced and spread altruistically. Disinformation represents information which is not just out of context, presented as partly true, or incomplete, but also information presented against objectively verifiable truth. Audio-visual disinformation can take the form of pictures or videos with post-production modifications or presented in a different context (place or time) than that in which they were created. Deepfakes are another example (see Chapter 2). Even altruistic misleading information can be considered disinformation; it makes no difference whether the presentation of misleading information was driven by good intentions. Conversely, not every untruth can be considered disinformation; otherwise, the meaning of the term would be emptied. False information can also be presented without the intention to manipulate; such cases are accidental falsehoods. Jokes and satire represent a special category; they are usually based on mendacious or incomplete information, but their intention is to entertain the audience rather than to mislead it. However, this does not mean that the abovementioned cases cannot mislead people. Even if they were not created with the purpose of misleading an audience, they can later become disinformation when they are disseminated (Table 1.1).
Table 1.1 Categories of disinformation

Disinformation                   Not disinformation
Malicious lies                   Truthful statement
Audio-visual disinformation      Accidental falsehoods
True disinformation              Jokes
Side-effect disinformation       Sarcastic comments
Adaptive disinformation          Accidental truths
Altruistic disinformation        Implausible lies
Detrimental disinformation       Satire

Source: Authors, based on Fallis (2015, 415)

1.3.2 Disinformation Versus Fake News

Today, the term fake news has become more common than disinformation, having become publicly known and widely used during the 2016 US presidential election at the latest. Its popularisation was due to the then presidential candidate Donald Trump, who labelled media critical of him as fake news. As a relatively new concept, it has not been given enough epistemological attention, and its understanding varies across the general public and academia (Gelfert 2018).

Some authors consider fake news to be all news which is not fact-based but, despite that, is reported as true news (Allcott and Gentzkow 2017). Others perceive fake news as news which denies the principles of quality and objective journalism (Baym 2005). However, there is a difference between media spreading fake news and so-called political media, which adjust news coverage in such a way as to establish the political agenda of a related political party or entity (Vargo et al. 2017). Silverman claims that fake news is news which is not based on truth and is made for financial gain, typically through ad revenue. Fake news is, therefore, close to tabloid journalism; unlike for the tabloids, however, the profit motive beyond the selling of the medium itself is crucial in this definition because, according to Silverman, content lacking a financial motivation should be considered propaganda instead (Silverman 2016). Fake news is also often used as a catch-all term for all disinformation, but it is in fact a narrower term falling within that classification (Gelfert 2018). Fake news is thus a kind of disinformation and can be defined as the deliberate creation and dissemination of false stories and news which are claimed to be serious news by their author.

The concept of fake news in a way responds to the fact that we tend to associate disinformation with agents and governments, while much of the misleading information presented in the information environment and the media comes from other actors. Today, however, anyone with a political or economic intention can deliberately create disinformation.

1.4 Disinformation and Propaganda as a Tool to Promote Political Outcomes

Nobody, perhaps with a few exceptions, uses disinformation and propaganda just for fun. There is always some intention standing behind them; this purpose is their inherent characteristic. State, state-sponsored, or non-state actors decide to walk the pathway of manipulation usually in order to promote their political, economic, military, or cultural interests and goals, or because they deem it necessary to protect themselves against adversaries. It is mostly undemocratic regimes and their supporters or extremist and terrorist organisations that are mentioned in this context. Disinformation and propaganda give them the opportunity to use manipulative techniques and lies in order to target the hearts and minds of predefined target groups, which is, after all, believed to be the most important factor in any conflict.
Disinformation and propaganda are, by their character, usually exercised in a planned and coordinated manner. Intentionally misleading information rarely stands alone for long; it mostly fits into a series of actions taken to achieve someone's desired advantages and goals defined within time and place. A given piece may not appear misleading at the time, but it is usually revealed later to be part of a more complex sequence of actions supporting strategic aims. We can call this an operation: a performed action, including its planning and execution (Merriam-Webster Dictionary n.d.). More specifically, disinformation and propaganda represent major components of psychological operations.

The ability to manage and change the perceptions of a targeted audience may be considered the fourth instrument of power available to a political actor, next to diplomatic, economic, and military powers (Brazzoli 2007, 223). The use of psychological operations increases the effectiveness of other instruments, and it increases one's chances in conflicts, which today often take place in an asymmetric environment. Psychological operations (psyops) can be defined as pre-planned activities using communication methods and other resources aimed at selected target audiences to influence and shape their emotions, attitudes, behaviours, perceptions, and interpretations of reality. Thus, by using special methods, it is possible to induce desirable responses in the target population, which, in the broader context, contributes to the fulfilment of specific objectives. Every psychological operation is based on a particular psychological theme: the main, carefully prepared narrative or ideas. The higher the target audience's receptivity, in other words, its sensitivity to specific psyops tools, the higher the probability of the whole psychological operation's success (NATO Standardization Office 2014, 2018; Vejvodová 2019; Wojnowski 2019).

The importance of psyops is based on the belief that the psychological nature of a conflict is as important as the physical (Stilwell 1996). People's attitudes and behaviours affect the course and outcome of a conflict and the nature of the environment in which a conflict takes place. For a well-conducted psychological operation, it is crucial to know the target audience, its will, and its motivation. Psyops work with these elements and aim to influence them, weakening the adversary's will, strengthening the target group's commitment, and gaining the support and cooperation of undecided groups (Vejvodová 2019). Psyops can promote resistance within a population against a regime, or they can put forward the image of a legitimate government. They have the power to demoralise or enliven a population. They can reduce or increase desired emotions among a target population. They can even support apathy, on the one hand, or radicalise, on the other. Brazzoli (2007) distinguished between hard and soft aspects of psyops: he relates the hard aspects to the creation of negative perceptions, for example, of a state, government, or society in order to sow the seeds of alienation. The soft aspects relate to positive images motivating the audience to follow a lead. Both aspects have the goal of subverting the mind and influencing unconscious behaviour as decided by those who direct the operation.
It is believed that conflicts end when one side has lost the will to continue the conflict and comes to the decision that there is more to be gained, or less to be lost, by letting the adversary prevail. Therefore, the adversary's will is, in addition to military and economic tools or external support, a decisive factor in a conflict. In military conflicts, psychological operations achieve objectives where military force is not available or is inappropriate, or they can be combined with military force in order to minimise expenditures and maximise effects. Psyops persuade via nonviolent means: psychological weapons of persuasion and deception. In this context, the Chinese strategist Sun Tzu is very often quoted. Sun Tzu advocated the psychological undermining of the adversary and mentions in his The Art of War that one of the fundamental factors affecting war is that which 'causes the people to be in harmony with their leaders so that they will accompany them in life and unto death without fear of mortal peril' (Post n.d.). Dortbudak quotes another piece of his work when stressing that victory is determined rather by which side has influence over the other side's decisions and actions: 'To fight and conquer in all our battles is not supreme excellence; supreme excellence consists in breaking the enemy's resistance without fighting' (Dortbudak 2008).

Psyops can be divided into three types: strategic, operational, and tactical. Strategic psyops advance broad or long-term objectives. They might even be global, or at least focused on large audiences or key decision-makers. Operational psyops are conducted on a smaller scale. Their purpose can range from gaining support for an action to preparing the defined field or environment for an action. Tactical psyops are even more limited and are used to secure immediate and near-term goals. But psyops should not be mistaken as connected only with military environments and military actions. Psyops are also conducted in environments short of military conflict or declared war; they belong among the tools of political, economic, and ideological influence. They are conducted continuously to influence perceptions and attitudes in order to effectively target an audience to the benefit of the influence source.

In political theory, psyops are studied mostly in the context of nondemocratic regimes, where they are based on propaganda, and where the ultimate goal is to control and manipulate the population, adversaries, or even neutral actors in and outside the regime. The same applies to extremist or terrorist organisations. Modern versions of these organisations put great emphasis on their psyops methods so as to increase fear within their target audience (Ganor 2002). Spreading fear beyond the direct target is an essential aspect of terrorism, and the development of the Internet and social media has served as a force multiplier for terrorists. The emergence of virtual space combined with increasing technological literacy among terrorist organisations has created new opportunities for the use of psyops (Bates and Mooney 2014; Cilluffo and Gergely 1997; Emery et al. 2004). 9/11 is a classic example of terrorist psychological influence. Nevertheless, democratic regimes and actors conduct psyops as well. In comparison to undemocratic ones, however, they stop short of propaganda and disinformation, which are believed to be excluded from the democratic toolbox.
Today, psychological operations are perceived as a specific part of information operations. With information operations, we are already moving more towards the area of strategic and military thinking. Nevertheless, the term information operation has also been integrated into civil and political communication language, mainly due to the debates over Russian and Chinese activities abroad. Information operations can be defined as activities undertaken to counter hostile information and information systems while protecting one's own. They are the coordinated and integrated employment of information-related capabilities to influence, disrupt, corrupt, or usurp decision-making (United States—Joint Chiefs of Staff 2014; Miljkovic and Pešic 2019). They represent offensive and defensive measures focused on influencing an adversary's decisions by manipulating information and information systems, and they also include measures protecting one's own decision-making processes, information, and information systems. Information operations must have specifically predefined goals and targets; careful planning is therefore part of the process.

Information operations are conducted within an information environment in which they affect all three of its dimensions: physical, informational, and cognitive. In the physical dimension, we think about information infrastructure: information collection, transmission, processing, and delivery systems and devices can be affected, as well as command and control facilities, ICT, and supporting infrastructure. This dimension also covers human beings. What is important is that the physical dimension is not connected exclusively to military or nation-based systems and processes. Even when we are considering the military arena, civilians and civil infrastructure are included in information operations as well. In the information dimension, we think of information itself, its content and flow. When command and control facilities are affected in the physical dimension, the same facilities may also be targeted in the information dimension, but from the perspective of the type of information collected, its quality, content, and meaning. In the information dimension, information operations target the ways information is collected, processed, stored, disseminated, and protected. Last but not least, in the cognitive dimension, information operations influence the minds of those who transmit, receive, respond to, or act upon information. The cognitive dimension means individuals and groups: their individual and cultural beliefs, norms, vulnerabilities, motivations, emotions, experiences, education, mental health, identities, and ideologies (United States—Joint Chiefs of Staff 2014; Miljkovic and Pešic 2019; Vejvodová 2019). Understanding these factors is crucial for developing the best operations in order to influence decision-makers and produce the desired effect.

From the above description, it is clear that thinking about information operations as a collection of single pieces of information would be too reductive. They are complex processes integrating the effects of information activities (collection, creation, transmission, and protection) leading to influence over an adversary and the attainment of goals. Information operations integrate a wide spectrum of activities, such as psychological operations, operations security, information security, deception, electronic warfare, kinetic actions, key leader engagement, and computer network operations.
Together, these activities target the will of adversaries, their understanding of the situation, and their capabilities. Information activities aimed at influencing the adversary focus mainly on decision-makers who have the ability to influence the situation. Activities in this case include questioning the legitimacy of political leaders, undermining the morale of the population or, in military terms, the morale of the armed forces, polarising society, and so on. Information activities intended to affect the understanding of the situation seek to influence the information available to the enemy for their decision-making processes. This includes disseminating disinformation, using full-scale mock-ups of military equipment to fool the enemy's radar systems, deliberately leaking distorted information, destroying or manipulating information inside the opponent's information systems, and so forth. The third kind of information activity is enacted upon the enemy's abilities and is meant to disrupt their capacity to understand information and to promote their will. These activities include internet connection disruptions, physical destruction of infrastructure, and cyberattacks.

In this approach, information operations can be subdivided into information-technical and information-psychological activities. Information-technical activities represent, for example, attacks on critical infrastructure, attacks on elements of information infrastructure, or cyberattacks. By information-psychological activities, we mean the management of an adversary's perception, propaganda, disinformation campaigns, psychological operations, and deception: manipulation with or by information in a cognitive manner. That returns us to the reason why we classify psychological operations as a tool of information operations.

Let us move for a moment to another relevant term often mentioned by political representatives and decision-makers when discussing disinformation and propaganda, typically in relation to elections, the promotion of particular interests in society, the external polarisation of society through existing inner conflict lines, or the question of national security: influence operations. Influence operations are coordinated, integrated, and synchronised activities which use diplomatic, informational, intelligence, psychological, communication, technical, military, cultural (identity-based), or economic tools in order to influence the attitudes, behaviour, and decisions of a predefined target audience so that the audience supports the goals of the actor performing the influence operations. In principle, influence operations aim to promote support, to undercut it, or a combination of both, in order to create a space which can be filled with a desired solution. They can be developed in peacetime, in times of crisis, in conflicts, and in post-conflict periods (Brangetto and Veenendaal 2016; Larson et al. 2009). They are carried out in both physical and digital space. Notoriously used as an example of influence operations is the rivalry between the Russian Federation and the United States. Originating in the Cold War, it continues today, partly through the use of previously tested patterns and tools and partly under new conditions generated by the onset of the online information environment and technological progress, which keeps producing novel techniques and methods of exploitation and influence.
Tromblay (2018) describes aspects of influence collection as gathering information on and exploiting vulnerabilities outside formal diplomacy: the exploitation of lobbyists, of university and academic idealism, of cultural, ethnic, and religious affinities (e.g. China's concept of 'overseas Chinese', Chinese people living abroad seen as a block beholden to China), and of the media.

Influence operations is an umbrella term for all operations in the information domain, including activities relying on so-called soft power tools. However, influence operations are not exclusively soft power tools; owing to the infiltration into and disruption of information systems, secret and disruptive activities are also part of influence operations. Soft power tools and influence activities overlap, but they are not synonymous. Soft power includes overt activities meant to sell an agenda. Covert influence includes activities meant to disrupt processes (Tromblay 2018). In the military field, influence operations can be used during military operations and serve to weaken the will of the adversary, intimidate or confuse decision-makers, influence (disrupt, modify, or shut down) their information systems, weaken public support for the adversary, or attract audiences to support one's own activities. Victory can be achieved without firing a single bullet.

Like information operations, influence operations affect all three dimensions of the information environment. They affect information systems, the content of information, and how information affects the target audience, creating space for the implementation of disinformation and propaganda as well. Influence operations use activities of both an information-technical and an information-psychological character. This may sound interchangeable with the definition of information operations; however, that would be a misleading perception. First, influence operations integrate a much broader scope of tools than information operations; information operations are, therefore, a subset of influence operations, as already mentioned. Second, information operations are limited to military operations, whereas influence operations are conducted as coordinated activities in both civil and military affairs, and, in the case of civil affairs, they are performed regardless of peace or war, usually in relation to the projection of (geo)political power.

Although the hierarchy and relations among the defined terms may seem complicated, Figs. 1.1 and 1.2 can provide us with a clear picture; to understand them, however, it is necessary to distinguish between civil and military affairs, and between times of peace and conflict, inasmuch as each term may play a different role.

Fig. 1.1 Hierarchy of terms in civil affairs (Source: Authors)

Fig. 1.2 Hierarchy of terms in military affairs (Source: Authors)

1.5 What Is New in the Twenty-First Century, and Where Are My Facts?

The year 2016 and the presidential election in the United States are associated not only with the advent of fake news but also with the point at which we began to talk increasingly about the decline of people's concern with facts, verified sources, and truthful information. More and more, people believe widely circulating conspiracy theories, lies, and manipulation. This phenomenon is known as the post-truth era; not coincidentally, post-truth became the Oxford Dictionary's 2016 word of the year.
Post-truth can be described as a manifestation of a 'qualitatively new dishonesty on the part of politicians who appear to make up facts to suit their narratives' (Mair 2017, 3). Post-truth is leading to 'the diminishing importance of anchoring political utterances in relation to verifiable facts' (Hopkin and Rosamond 2017, 1–2). When voters accept such a mindset, politicians can lie to them without blushing, and voters will even appreciate it. This can explain the current political situation observable in the United States and elsewhere. Presidential elections in America are the most widely followed, and this most visible case of fake news propagation has spread, along with other post-truth symptoms, around the world. However, a similar trend could be found in other countries even prior to 2016. Nor is this an issue connected only to politicians; it refers to obvious lies being routine across a section of society. The traditional standard of truthfulness has lost its importance: the distinction between factual truth and falsehood has become irrelevant, and only public preferences for one set of facts matter (Kalpokas 2020, 71). Similarly, at the academic level, the role of truth and its questioning had been discussed even earlier (see Bufacchi 2020).

Today's situation is different from the cliché that all politicians lie and make promises they do not keep. Of course, politics, and especially electoral campaign periods, have always been characterised by the efforts of candidates to present reality framed so as to advantage them. Electoral campaigns are conducted to persuade people; they cannot be confused with education, no matter how informative they sometimes seem to be. Political communication and marketing cannot fulfil the educational role we know from the presentation of objective and impartial information. As a rule, however, political proclamations in the past bent the truth within the boundaries set by the rules of the political game in democratic contests. If politicians were caught lying, they were usually forced by party members, political competitors, the media, or the voters to explain their standpoint and provide apologies (Sismondo 2017). If they were unable to do so, they often had to give up their candidacy or elected office; Richard Nixon could tell the story.

Such a mindset is no longer typical of everyone. There is a significant part of society, not just politicians but citizens and voters as well, which does not rely on facts for truth. Post-truth is a qualitatively new game for politics. Facts are not simply twisted or omitted to disguise reality; instead, new realities are discursively created to serve a political message (Kalpokas 2020, 71). The prefix post in post-truth does not allude to a chronological reference, that something occurs after truth. Instead, it indicates that truth is no longer essential and has been superseded by a new reality (Bufacchi 2020). Hence, we can speak about a post-truth or post-factual society. We are facing a situation in which a significant part of the population has abandoned the conventional criteria of evidence, internal consistency, and fact-seeking. In a post-truth world, the principle of honesty as the default position and moral responsibility for one's statements is something some politicians no longer hold (Higgins 2016).
Many among the electorate seem not to register the troubles stemming from the fact that politicians are lying to them. This is probably because they believe their favoured candidate's intentions are good and that he or she would not deliberately mislead them (Higgins 2016). Today's politics can therefore be described as a competition over which 'truth' is to be considered the most salient and most important. The question of which claims can be considered true and which false seems to have been sidetracked, no matter how important the consequences of these choices are (see Sismondo 2015). This is a prominent attribute of today's politics, and it is the essence of the post-truth era: it empowers people to choose their own reality, a reality where evidence-based and objective facts are less important than people's already existing beliefs and prejudices. The phenomenon has sometimes been labelled with the buzzword 'alternative facts'. However, post-truth claims or so-called alternative facts do not seek to establish a coherent model of reality. Instead, they serve to distract the public from unwanted information or potentially unpopular policy actions, which, as a result of the distraction, can go uncommented on or unnoticed. Moreover, they are able to destroy trust in facts and reality to the point where facts no longer matter or are not even acknowledged to exist. Instead, people seek affirmations even when they know they are being misled; people wish to believe them (Colleoni et al. 2014; Lewandowsky et al. 2017). This can be amplified by politicians who incorporate misleading information and deception into their arsenal as an adequate way of gaining power. For this portion of the political establishment and society, lying is not only accepted but also rewarded. As Lewandowsky et al. (2017) add, falsifying reality is no longer about changing people's beliefs; it is about gaining and sustaining power. Thus, the post-truth era is characterised by the shift from persuasion to relativisation, sometimes leading to apathy. It is no surprise that disinformation and propaganda have found more than favourable conditions for their development and dissemination these days, irrespective of how easy it is to find facts and evidence-based information; some audiences are simply not interested in them.

In the political sphere, there are frequent discussions about historical periods and interpretations of historical events. Democratic representatives and experts in the post-communist region struggle with those who present an 'alternative interpretation of history', mainly in the context of the installation of communist governments after the end of World War II with the support of the Soviet Union. For example, in Czechoslovakia, the installation of the communist regime was preceded by the abuse of political power and the intimidation of political opponents by Communist representatives. Current revisionists (e.g. Communist Party representatives supported by some Russian media channels), however, frame this takeover as democratic. The same could be said even of the Holocaust, a very well-documented atrocity. We still witness discussions in which doubts about individual aspects are expressed or the Holocaust as a whole is denied. This perception of information, facts, and evidence does not concern only the political views and attitudes of voters; it grows into other spheres of our lives.
The growing number of people who believe in conspiracy theories, hoaxes, and disinformation in the fields of health care or nutrition only proves that. We no longer wonder how many people still believe the measles vaccine causes autism; we have become used to it. A few years ago, we would not have thought that, at the beginning of the twenty-first century, the number of people believing the Earth is flat would be increasing. We may be shocked by this, but it is a fact. Yes, a real fact, not an alternative one. What do these cases have in common? Distrust of scientists, experts, and professionals and of their knowledge and skills. Relativism and the tendency to misuse, misinterpret, or question facts and research are nothing new. What is new is the degree and intensity with which it is happening. Discussions of facts focus on what is right and what is wrong; but for some, this is simply not relevant (Ihlen et al. 2019).

An indispensable condition for the relationship among the general public, politicians, and experts, like almost all relationships in a democracy, is trust. The general public trusts politicians in their political decisions, while politicians trust experts whose knowledge provides them with the basis for political decision-making. Thus, directly or indirectly, the general public must trust experts. When that trust collapses, when the public or politicians no longer trust experts, democracy itself can enter a death spiral which presents an immediate danger of decay either into mob rule or towards an elitist technocracy. Both are authoritarian outcomes (Nichols 2017, 216). While mob rule is not directly observable in today’s democracies, there are cases of politicians dominating the political scene who, in their own words, represent the common people and define themselves in opposition to experts (not just to elites, as in the case of populism). In fact, these cases are on the rise in the twenty-first century, and post-truth is one of the main causes. What is also surprising is the high level of public tolerance for the inaccurate, misleading, or false statements of fact deniers. Since you are reading this chapter, you may not be surprised at this point that it is not unusual for these deniers of reality to appear on television screens or in other media side by side with scientists. There are serious discussions about who is right and who is wrong and, moreover, about whether fact deniers merely have a different perspective or opinion. Indeed, the blurring of the difference between objectively recognisable facts and the presentation of one’s own opinion is another typical feature of the post-truth era. ‘Don’t bother me with facts’ is no longer a punchline; it has become a political stance (Higgins 2016).

1.5.1 The Journey We Have Taken to the Post-Truth Era

It is obvious that the post-truth era has not appeared out of nowhere. When we consider the factors which caused its onset, we can distinguish three main categories: psychological factors affecting our daily work with information, societal changes and transformation, and technological progress affecting both the media and our lives as such. We cannot say which of these is predominant. They are interconnected, and all of them are crucial for understanding why we are dealing with the post-truth era, and why it is happening at the beginning of the twenty-first century.
It is not our intention to analyse them in detail—each of them would require a book of its own. The aim of the following paragraphs is to provide readers with at least a basic introduction to them so that they can gain a better understanding of why we are dealing with post-truth now. This will also provide readers with the raison d’être of the following chapters of this edited volume.

Every human being must deal with biases when receiving and processing information and news (Nickerson 1998). No one is impervious to them; only if we are aware of them can we minimise them by knowingly changing the habits and behaviours associated with receiving news and information. Our current views and attitudes are both factors which play a significant role in how we receive new information, and they have an impact on communication in general. We consciously and unconsciously try to avoid communication which contradicts our beliefs and, on the contrary, seek messages which confirm our already established attitudes. The new information one is able to accept tends to further entrench the fault lines of existing perspectives (O’Shaughnessy 2020, 60). When exposed to information which does not fit our attitudes, we often ignore it, reformulate it, or interpret it so as to support our existing views. We also usually forget information we disagree with more quickly than information which agrees with us. These processes are called selective exposure, selective perception, and selective memory. Selective exposure is the tendency to expose ourselves to information or news which is in line with our current attitudes while avoiding what we disagree with. Selective perception, in turn, embodies the phenomenon of projecting what we want to see and hear onto what we actually see or hear. Selective memory makes it easier for us to recall information which supports our views (see Freedman and Sears 1965; Hart et al. 2009; Stroud 2010; Zillmann and Bryant 2011).

Another principle affecting our attitude to information is cognitive dissonance. Festinger (1957) argues that if we hold two or more pieces of information which are incompatible, we feel a state of discomfort—dissonance. The information we have and believe in should be in harmony to make us feel comfortable; therefore, in the case of any discord, we must act. Nevertheless, this state of mental discomfort does not occur only when the pieces of information we have do not match. There is also the discomfort which arises after a decision is made, when there is discord between attitude and behaviour, and the discomfort of disappointment. The discomfort is caused by two or more relevant pieces of information, and the more important the information, the greater the discomfort experienced. The state of discomfort itself is perceived as unpleasant and can be reduced, for example, by reducing the significance of non-conforming information or by re-evaluating it. We can also add information which confirms our initial opinion to eliminate the discomfort (see Harmon-Jones 2002).

Confirmation bias is probably the best known and most widely accepted notion of inferential error to emerge from the literature on human reasoning (Evans 1989, 41). However, there is in fact a set of confirmation biases rather than one unified confirmation bias (see Nickerson 1998).
There are often substantial task-to-task differences in these observed phenomena, their consequences, and the underlying cognitive processes we are able to identify. Generally, though, their direction is a tendency for people to believe too much in their favoured hypothesis. The term confirmation bias refers to looking for the presence of what we expect as opposed to looking for what we do not expect, to an inclination to retain or a disinclination to abandon a currently favoured hypothesis (see Jones and Sugden 2001; Knobloch-Westerwick and Kleinman 2012).

Societal changes represent the second category affecting our perception of information and news. It is natural for human beings to evolve and change. Likewise, the values and conventions which prevail in society evolve and change. This is nothing new, nor a characteristic found only in recent decades or the last century. However, the specific changes play into the hands of post-truth. First of all, in Western democracies there has been an observable decline in civic engagement, trust, and goodwill since the 1960s. Values which encouraged people to discuss, exchange opinions, and seek consensus are increasingly less relevant. This trend leads us to stay in contact with people of similar attitudes, social status, political views, and hobbies. With our enclosure into social bubbles, there has also been a shift in the values and life goals of the majority of people. The will to help the environment and interest in a philosophy of life have declined significantly since the 1970s; on the other hand, the importance of being well-off financially has risen over the same period (Twenge et al. 2012). Although we may have seen increased environmental concern in recent years, the question is how this trend will be affected by the COVID-19 pandemic and the subsequent economic development of individual countries. We cannot predict this, since this text is being written in the middle of the pandemic.

These changing life goals have unexpectedly been accompanied by growing inequality. As Lewandowsky et al. (2017) note, it is a paradox that at the same time as money has become more prominent, real income growth for young Americans has largely stagnated since the 1960s. Nor is this a problem for youth in the United States alone; it is something the younger generation must deal with in most democracies. Becoming independent and acquiring their own housing is more difficult for millennials than it was for their parents’ generation. This generation gap leading to inequality is associated with affective political polarisation, which means that members of political parties tend to view members of their own party more positively, and those of opposing parties more negatively, than ever before (see Abramowitz and Webster 2016). Party affiliation or, more precisely, ideological preference is also the link to another societal effect: politically asymmetric credulity. Recent studies have shown that there are cognitive and psychological differences between liberals and conservatives in the way they access information and evaluate its relevance or trustworthiness: conservative voters tend to believe disinformation and false information more often than liberal voters do (see Jost 2017). The final societal effect is the declining trust in science already mentioned above.
Naturally, when considering specific individual cases of trust in disinformation and alternative facts, not all of these societal effects or changes must be present. However, looking at current political developments at the national and international level, it is not easy to free oneself of the impression that these are often exemplary cases of the effects mentioned here.

Last but not least, there are the technological changes. Probably the most visible technological development affecting our perception of information is the change in the global media landscape which has led to its fragmentation. The advent of the Internet, and especially of Web 2.0, produced sources of information tailored to each user’s needs and wishes. In light of the fractionalisation of the media landscape into echo chambers (e.g. Jasny et al. 2015), many people believe their opinions, however exotic or unsupported by evidence, to be widely shared, thereby rendering them resistant to change or correction (Lewandowsky et al. 2017). Thus, alternative facts have increasingly moved in from the outskirts of public discourse. Conspiracy theories and disinformation gain more attention from mainstream audiences than ever before (Webb and Jirotka 2017). If traditional and mainstream media do not provide us with information confirming our opinions and biases, we can look for media which do provide it—even if it is bullshit. Furthermore, this heterogeneity contributes to the incivility of political discourse: the media try to supply ever more shocking news, or to frame the news in ever catchier ways, to get our attention.

Social media is a phenomenon in and of itself. It polarises society through the algorithms which embody the essence of its functioning. Functionality based on the logic of consumer behaviour, where we are offered products and services evaluated as attractive to us on the basis of our previous behaviour, can be appreciated when talking about goods; it is, however, highly problematic in the field of news and political content. Users are presented with content which reflects their view of the world (O’Shaughnessy 2020); therefore, they are less and less confronted with other views. The willingness to speak to people with a different opinion is declining. After all, why should we even bother? In order to filter out inconvenient information (or data smog), people naturally tend to belong to an ‘information group’—people who are ‘marked by allegiance to particular types and sources of information, to particular modes of problem perception and solution’ (Marshall et al. 2015, 23). Belonging to an information group saves us time. Consciously or not, we are becoming closed in our filter bubbles or echo chambers—communities where we like to have our opinion confirmed, regardless of whether it is based on real facts, without being exposed to counterarguments (Ihlen et al. 2019). Since we are among like-minded people in these echo chambers, the chance that someone will question false information is declining. People tend to believe their opinions are widely shared regardless of whether or not they actually are (Lewandowsky et al. 2017). When people believe this, they are more resistant to changing their minds, less likely to compromise, and more likely to insist on their own views (Leviston et al. 2013).
Of course, people can belong to more than one information group, hearing more than one echo chamber with varying degrees of commitment or awareness (Marshall et al. 2015, 24). Another effect connected to social media is the roughness of discussions. We do not see those with whom we are conversing on social media; it is, therefore, easier to slip into rougher language. Trolls and bots have a similarly vulgarising effect.

Until now, the focus has mainly been on users, citizens, or voters. However, the technological giants of the online world have also had an undeniable impact on the current situation, as demonstrated by the strengthening effects of algorithms. In addition to social media platforms like Facebook, YouTube, and Twitter, Google and its targeted ads are among the most influential. It is enough for a user to read an article on a disreputable website a few times, and search engines and ads will already recommend similar resources to them. As with shopping behaviour, when we receive information the Internet players offer us ‘similar products’. Paradoxically, therefore, a small number of technological ‘superpowers’ have come to dominate the global spread of information and affect not only the way we consume information but also what information we consume and from what sources (Webb and Jirotka 2017).

1.5.2 Infodemic—Post-Truth on Steroids

The year 2020 and the COVID-19 pandemic have brought—among other things—a further acceleration of the spread of misleading information. Misleading health information in particular, such as ‘guaranteed’ advice and tips on how to treat the disease, has become a new issue (Lilleker et al. 2021). The term ‘infodemic’ has become widespread, highlighting the danger misinformation poses to the management of virus outbreaks, since it can even accelerate the spread of the novel coronavirus by strengthening reluctance towards the social response. The pandemic has shown how crucial and important a role social media plays in the new information environment (Cinelli et al. 2020, 2). Users increasingly find trusted sources of information within their peer networks and share their statements further. As that information is spread further, its perceived legitimacy often increases. This method of sharing and validating information contrasts with the methods implemented by the mass media, which have specialised knowledge and specific responsibilities related to information verification and sharing. During the COVID-19 pandemic, individuals have, not surprisingly, been turning to this new digital reality for guidance (Limaye et al. 2020, 277). We can see this trend all around the world, with countries and even supranational organisations, such as the European Union or the World Health Organisation (WHO), having faced it (see Lilleker et al. 2021). It would not be correct to state that social media has served exclusively to disseminate misleading information. There have been numerous cases of social media helping to empower and support hospital staff, as well as information campaigns on epidemiological recommendations or governmental measures. The latter two especially represent a battlefield where social media has met mass media and fought for the audience’s attention, for the privilege of who gets to inform the general public.
Even though users share information on social media, ‘old school’ techniques of (political) communication have proved that they do not yet belong on the scrapheap. Mass media has played a crucial role mainly in two respects: (1) the mediation of press conferences and speeches by government members to the masses, and (2) the selection of experts who provide comments based on their expertise. A phenomenon which is not new, but which has been boosted by the pandemic, is the commentary of self-proclaimed experts, or of authorities with expertise in fields unrelated to the pandemic. Their statements have been widely circulated on social media and have often undermined the official statements made by the government. ‘Scientific misinformation has been actively propagated as a means to destabilise trust in governments and as a political weapon’ (Limaye et al. 2020, 277), and mass media, at least to some degree, has played the role of moderator, returning the debate to acceptable norms and eligible speakers. Although social media strengthened its power during the pandemic, mass media has shown itself to be crucial where verified and reliable information is needed. Nevertheless, even mass media has not gone without blemish and has been criticised, for example, for providing space to ‘erroneous’ experts.

*****

People usually hear what they want to hear because they get their news exclusively from sources whose bias they agree with. A source provides the framing for the information; it tells the consumer whether the information is likely to be agreeable and to conform to expectations (Marshall et al. 2015, 24). How trustworthy the source is according to objective criteria is not decisive—we recognise a source as trustworthy when it provides likeable ‘facts’, no matter how ‘alternative’ they are. Although some authors expect that echo chambers, boosted by the computer algorithms which select news on Web 2.0 (and social media especially), will affect solid news and fake news alike (see Sismondo 2017), recent data show that disinformation and lies spread faster and to a broader audience than the truth—a product of the different predominant emotions the two types of news evoke (Vosoughi et al. 2018). However, no matter which side people choose, whether to trust facts or alternative facts, both ways play into approaches which treat voters as people to be manipulated rather than convinced (Sismondo 2017).

Lies and bullshit have always been among us, but technologies such as social media, the growth of so-called alternative media, and even bots have elevated them to a common cause of which a part of our society is unashamed, elevating it even to a political attitude. Verifying information and debunking lies and disinformation is then perceived as a restriction on freedom of speech and the enforcement of political attitudes to the detriment of others. All these trends have encouraged politicians to strategically target their supporters with radical and, in some cases, even extremist messages. At a time when people are overwhelmed with information, the political struggle is often won not by the politician who is able to compromise and use substantive reasoning, but by the one whose tweets and Facebook statuses are more visible and striking. And while you are reading this explanation of what the post-truth era is, more and more people are arguing on Facebook over blatant lies presented and defended as their opinion.
1.6 Conclusion

On the previous pages, we have tried to explain what propaganda and disinformation are, how they differ from other forms of false or misleading information, and how they fit into the broader concept of influence operations. The way we receive information and news has changed radically over the past decade, a change linked to the emergence of the post-truth era or, more recently, the so-called infodemic.

Although there is no universally accepted definition of propaganda, the individual definitions share standard features: (1) systematically and intentionally disseminated (2) ideas and information (3) which are inaccurate, distorted, false, or incomplete, (4) intended to influence or manipulate the public (5) to the detriment of an opponent. Disinformation represents a conceptually more straightforward phenomenon (intentionally misleading information); however, it is often confused with other forms of deception. The form and strategy of propaganda are influenced by the means propaganda can use—propaganda depends on the tools and channels of information dissemination on offer. Today, it takes the form of the Internet and social media, where anyone can write what they want, anyone can be a journalist or, more generally, a creator of popular content. This means, among other things, an overloading of the information environment and of the individual as well. Lasswell’s, Ellul’s, and Taylor’s definitions of propaganda, as introduced in this chapter, remain valid, but the nature of the information environment and the means we can use within it have changed, affecting the strategies that lead to the fulfilment of those still valid goals for which both propaganda and disinformation are used. Manipulative techniques remain the same, although the frequency with which particular techniques are used varies. What does not change is the essential role of emotions. Today’s propaganda and disinformation do not necessarily aim to persuade; another strategy is to overload a targeted audience with information and thus induce relativisation, which suits the post-truth era.

In the post-truth era, three primary shifts can be identified: (1) a lie is elevated to political opinion, and if the facts do not fit our vision of the world, we choose other, alternative facts which do; (2) technologies accelerate our enclosure into echo chambers and opinion ghettos; and, paradoxically, (3) politicians who benefit from post-truth rely on people’s tendency to believe that others are telling (at least mostly) the truth, and they exploit it. Moreover, the post-truth era is often characterised by an intertwining of the political ambitions of politicians with propaganda narratives and disinformation. The question is which came first, the chicken or the egg? Are information overload and relativisation strategies a characteristic of the post-truth era, or were they merely an advantageous condition which facilitated its onset? Did the post-truth era enable these strategies to be applied, or did their use lead to its development? The purpose of this chapter was not to answer such questions but to highlight that these new strategies respond to the fact that society has access to nearly unlimited information, which is affected by strategies of applied propaganda and disinformation.
It is challenging to control or even stop the flow of information unless we create an entirely new and controllable information environment. Thus, the new method of manipulation is to create the illusion that things are not as they seem, that there are always alternative facts for everyone. As the information environment has changed, so too have the strategies. We have moved from persuasion to manipulation through the sheer amount of information and the presentation of information irrelevant to the discourse and, therefore, to its relativisation.

Bibliography

Abramowitz, A. I., & Webster, S. (2016). The Rise of Negative Partisanship and the Nationalization of U.S. Elections in the 21st Century. Electoral Studies, 41(March), 12–22. https://doi.org/10.1016/j.electstud.2015.11.001.
Akopova, A. S. (2013). Linguistic Manipulation: Definition and Types. International Journal of Cognitive Research in Science, Engineering, and Education, 1(2), 78–82.
Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–236.
Atton, C. (2009). Alternative and Citizen Journalism. In K. Wahl-Jorgensen & T. Hanitzsch (Eds.), The Handbook of Journalism Studies (pp. 265–278). New York and London: Routledge.
Bates, R. A., & Mooney, M. (2014). Psychological Operations and Terrorism: The Digital Domain. The Journal of Public and Professional Sociology, 6(1). https://digitalcommons.kennesaw.edu/cgi/viewcontent.cgi?article=1070&context=jpps. Accessed 22 Mar 2020.
Baym, G. (2005). The Daily Show: Discursive Integration and the Reinvention of Political Journalism. Political Communication, 22(3), 259–276.
Berliner, D. C. (1992, February). Educational Reform in an Era of Disinformation. Annual Meeting of the American Association of Colleges for Teacher Education. San Antonio, TX, USA. https://files.eric.ed.gov/fulltext/ED348710.pdf. Accessed 22 Mar 2020.
Bernays, E. L. (1947). The Engineering of Consent. The Annals of the American Academy of Political and Social Science, 250(1), 113–120. https://doi.org/10.1177/000271624725000116.
Bernays, E. L. (2004). Propaganda. New York: IG Publishing.
Bor, S. E. (2014). Teaching Social Media Journalism: Challenges and Opportunities for Future Curriculum Design. Journalism & Mass Communication Educator, 69(3), 243–255. https://doi.org/10.1177/1077695814531767.
Brangetto, P., & Veenendaal, A. M. (2016). Influence Cyber Operations: The Use of Cyberattacks in Support of Influence Operations. Tallinn: NATO CCD COE Publications.
Brazzoli, M. S. (2007). Future Prospects of Information Warfare and Particularly Psychological Operations. In L. Le Roux (Ed.), South African Army Vision 2020: Security Challenges Shaping the Future African Army (pp. 217–234). Pretoria: Institute for Security Studies.
Bufacchi, V. (2020). Truth, Lies and Tweets: A Consensus Theory of Post-Truth. Philosophy and Social Criticism, 1–15. https://doi.org/10.1177/0191453719896382.
Cilluffo, F. J., & Gergely, C. H. (1997). Information Warfare and Strategic Terrorism. Terrorism and Political Violence, 9(1), 84–94.
Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., et al. (2020). The COVID-19 Social Media Infodemic. Scientific Reports, 10.
Colleoni, E., Rozza, A., & Arvidsson, A. (2014). Echo Chambers or Public Sphere? Predicting Political Orientation and Measuring Political Homophily in Twitter Using Big Data. Journal of Communication, 64(2), 317–332. https://doi.org/10.1111/jcom.12084.
Damasio, A. (2010). Self Comes to Mind: Constructing the Conscious Brain. New York: Pantheon.
Domatob, J. (1985). Propaganda Techniques in Black Africa. International Communication Gazette, 36(3), 193–212.
Dortbudak, M. F. (2008). The Intelligence Requirement of Psychological Operations in Counterterrorism. Monterey: Naval Postgraduate School. https://calhoun.nps.edu/bitstream/handle/10945/3856/08Dec_Dortbudak.pdf?sequence=1&isAllowed=y. Accessed 22 Mar 2020.
Dretske, F. I. (1983). Précis of Knowledge and the Flow of Information. Behavioral and Brain Sciences, 6(1), 55–90.
Ellul, J. (1973). Propaganda: The Formation of Men’s Attitudes. New York: Vintage Books.
Emery, N. E., Earl, R. S., & Buettner, R. (2004). Terrorist Use of Information Operations. Journal of Information Warfare, 3(2), 14–26.
Evans, J. S. B. T. (1989). Bias in Human Reasoning. Hillsdale, NJ: Erlbaum.
Fallis, D. (2015). What Is Disinformation? Library Trends, 63(3), 401–426.
Farkas, J., & Neumayer, C. (2018). Disguised Propaganda from Digital to Social Media. In J. Hunsinger et al. (Eds.), Second International Handbook of Internet Research (pp. 1–17). Springer.
Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.
Freedman, J. L., & Sears, D. O. (1965). Selective Exposure. Advances in Experimental Social Psychology, 2, 57–97. https://doi.org/10.1016/S0065-2601(08)60103-3.
Ganor, B. (2002). Terror as a Strategy of Psychological Warfare. International Institute for Counter-Terrorism. https://www.ict.org.il/Article/827/Terror-as-a-Strategy-of-Psychological-Warfare#gsc.tab=0. Accessed 22 Mar 2020.
Gelfert, A. (2018). Fake News: A Definition. Informal Logic, 38(1), 84–117.
Goode, L. (2009). Social News, Citizen Journalism and Democracy. New Media & Society, 11(8), 1287–1305. https://doi.org/10.1177/1461444809341393.
Guilday, P. (1921). The Sacred Congregation de Propaganda Fide (1622–1922). The Catholic Historical Review, 6(4), 478–494.
Harmon-Jones, E. (2002). A Cognitive Dissonance Theory Perspective on Persuasion. In J. P. Dillard & M. W. Pfau (Eds.), The Persuasion Handbook: Developments in Theory and Practice (pp. 99–116). London: Sage. http://dx.doi.org/10.4135/9781412976046.n6.
Hart, W., Albarracin, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling Validated Versus Being Correct: A Meta-Analysis of Selective Exposure to Information. Psychological Bulletin, 135(4), 555–588. https://doi.org/10.1037/a0015701.
Higgins, K. (2016). Post-Truth: A Guide for the Perplexed. Nature, 540, 9. https://doi.org/10.1038/540009a.
Hopkin, J., & Rosamond, B. (2017). Post-Truth Politics, Bullshit and Bad Ideas: ‘Deficit Fetishism’ in the UK. New Political Economy, 23(6), 641–655.
Ihlen, Ø., Gregory, A., Luoma-aho, V., & Buhmann, A. (2019). Post-Truth and Public Relations: Special Section Introduction. Public Relations Review, 45(4), 1–4. https://doi.org/10.1016/j.pubrev.2019.101844.
Jasny, L., Waggle, J., & Fisher, D. R. (2015). An Empirical Examination of Echo Chambers in US Climate Policy Networks. Nature Climate Change, 5, 782–786. https://doi.org/10.1038/nclimate2666.
Jones, M., & Sugden, R. (2001). Positive Confirmation Bias in the Acquisition of Information. Theory and Decision, 50, 59–99. https://doi.org/10.1023/A:1005296023424.
Jost, J. T. (2017). Ideological Asymmetries and the Essence of Political Psychology. Political Psychology, 38(2), 167–208. https://doi.org/10.1111/pops.12407.
Kalpokas, I. (2020). Post-Truth and the Changing Information Environment. In P. Baines, N. O’Shaughnessy, & N. Snow (Eds.), The Sage Handbook of Propaganda (pp. 71–84). London, Thousand Oaks, New Delhi and Singapore: Sage.
Kaufhold, K., Valenzuela, S., & Gil de Zuniga, H. (2010). Citizen Journalism and Democracy: How User-Generated News Use Relates to Political Knowledge and Participation. Journalism and Mass Communication Quarterly, 87(3–4), 515–529. https://doi.org/10.1177/107769901008700305.
Knobloch-Westerwick, S., & Kleinman, S. B. (2012). Preelection Selective Exposure: Confirmation Bias Versus Informational Utility. Communication Research, 39(2), 170–193.
Kragh, M., & Åsberg, S. (2017). Russia’s Strategy for Influence Through Public Diplomacy and Active Measures: The Swedish Case. Journal of Strategic Studies, 40(6), 773–816.
Larson, E. V., et al. (2009). Foundations of Effective Influence Operations: A Framework for Enhancing Army Capabilities. Santa Monica: RAND Corporation.
Lasswell, H. D. (1927). The Theory of Political Propaganda. American Political Science Review, 21(3), 627–631. https://doi.org/10.2307/1945515.
Leviston, Z., Walker, I., & Morwinski, S. (2013). Your Opinion on Climate Change Might Not Be as Common as You Think. Nature Climate Change, 3, 334–337. https://doi.org/10.1038/nclimate1743.
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
Lewis, S. C., Kaufhold, K., & Lasorsa, D. L. (2010). Thinking About Citizen Journalism: The Philosophical and Practical Challenges of User-Generated Content for Community Newspapers. Journalism Practice, 4(2), 163–179. https://doi.org/10.1080/14616700903156919.
Lilleker, D., Coman, I., Gregor, M., & Novelli, E. (2021). Political Communication and COVID-19: Governance and Rhetoric in Times of Crisis. London and New York: Routledge.
Limaye, R. J., Sauer, M., Ali, J., Bernstein, J., Wahl, B., Barnhill, A., et al. (2020). Building Trust While Influencing Online COVID-19 Content in the Social Media World. The Lancet Digital Health, 2(6), e277–e278.
Mair, J. (2017). Post-Truth Anthropology. Anthropology Today, 33(3), 3–4.
Marshall, J. P., Goodman, J., Zowghi, D., & da Rimini, F. (2015). Disorder and the Disinformation Society. London and New York: Routledge.
McCornack, S. A. (1992). Information Manipulation Theory. Communication Monographs, 59(1), 1–16. https://doi.org/10.1080/03637759209376245.
Merriam-Webster Dictionary. (n.d.). Operation. https://www.merriam-webster.com/dictionary/operation. Accessed 22 Mar 2020.
Miljković, M., & Pešić, A. (2019). Informational and Psychological Aspects of Security Threats in Contemporary Environment. TEME, 43(4), 1079–1094.
NATO Standardization Office. (2014). Allied Joint Doctrine for Psychological Operations. Allied Joint Publication 3.10.1. Brussels: North Atlantic Treaty Organization, NATO Standardization Office.
NATO Standardization Office. (2018). NATO Glossary of Terms and Definitions. AAP-06. Brussels: North Atlantic Treaty Organization, NATO Standardization Office.
Nichols, T. (2017). The Death of Expertise. New York: Oxford University Press.
Nickerson, R. S. (1998). Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology, 2(2), 175–220.
O’Shaughnessy, N. (2004). Politics and Propaganda: Weapons of Mass Seduction. Manchester: Manchester University Press.
O’Shaughnessy, N. (2016). Selling Hitler: Propaganda & The Nazi Brand. London: Hurst & Company.
O’Shaughnessy, N. (2020). From Disinformation to Fake News: Forward into the Past. In P. Baines, N. O’Shaughnessy, & N. Snow (Eds.), The Sage Handbook of Propaganda (pp. 55–70). London, Thousand Oaks, New Delhi and Singapore: Sage.
Orwell, G. (1949). 1984. London: Secker and Warburg.
Pearce, K. E. (2015). Democratizing Kompromat: The Affordances of Social Media for State-Sponsored Harassment. Information, Communication & Society, 18(10), 1158–1174. https://doi.org/10.1080/1369118X.2015.1021705.
Post, J. M. (n.d.). Psychological Operations and Counterterrorism. https://www.pol-psych.com/downloads/JFQ%20Psyops%20and%20counterterrorism.pdf. Accessed 22 Mar 2020.
Pratkanis, A., & Aronson, E. (2001). Age of Propaganda: The Everyday Use and Abuse of Persuasion. New York: Holt Paperbacks.
Shabo, M. E. (2008). Techniques of Propaganda & Persuasion. Clayton: Prestwick House.
Shenk, D. (1997). Data Smog: Surviving the Information Glut. New York: Harper Collins.
Silverman, C. (2016). This Analysis Shows How Viral Fake Election News Stories Outperformed Real News on Facebook. Buzzfeed. https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook?utm_term=tgEkrDpr#.dvkvJ3DJ. Accessed 22 Mar 2020.
Sismondo, S. (2017). Post-truth? Social Studies of Science, 47(1), 3–6. https://doi.org/10.1177/0306312717692076.
Smith, B. L. (2019). Propaganda. Encyclopaedia Britannica. https://www.britannica.com/topic/propaganda. Accessed 20 Mar 2020.
Sparkes-Vian, C. (2018). Digital Propaganda: The Tyranny of Ignorance. Critical Sociology, 45(3), 393–409.
Stilwell, R. D. (1996). Political-Psychological Dimensions of Counterinsurgency. In F. L. Goldstein & B. F. Findley (Eds.), Psychological Operations: Principles and Case Studies (pp. 319–332). Alabama: Air University Press.
Stroud, N. J. (2010). Polarization and Partisan Selective Exposure. Journal of Communication, 60(3), 556–576. https://doi.org/10.1111/j.1460-2466.2010.01497.x.
Syed, M. (2012). On War by Deception-Mind Control to Propaganda: From Theory to Practice. ISSRA Papers, 195–214.
Taylor, P. M. (2003). Munitions of the Mind: A History of Propaganda from the Ancient World to the Present Era. Manchester and New York: Manchester University Press.
Tromblay, D. E. (2018). Political Influence Operations: How Foreign Actors Seek to Shape U.S. Policy Making. Lanham: Rowman & Littlefield.
Twenge, J. M., Campbell, W. K., & Freeman, E. C. (2012). Generational Differences in Young Adults’ Life Goals, Concern for Others, and Civic Orientation, 1966–2009. Journal of Personality and Social Psychology, 102(5), 1045–1062. https://doi.org/10.1037/a0027408.
United States Joint Chiefs of Staff. (2014). Joint Publication 3-13: Information Operations.
Vargo, C., Guo, L., & Amazeen, M. (2017). The Agenda-Setting Power of Fake News: A Big Data Analysis of the Online Media Landscape from 2014 to 2016. New Media & Society, 20(5), 2028–2049.
Vejvodová, P. (2019). Information and Psychological Operations as a Challenge to Security and Defence. Vojenské rozhledy, 28(3), 83–96.
Vosoughi, S., Roy, D., & Aral, S. (2018). The Spread of True and False News Online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559.
Webb, H., & Jirotka, M. (2017). Nuance, Societal Dynamics, and Responsibility in Addressing Misinformation in the Post-Truth Era: Commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition, 6(4), 414–417. https://doi.org/10.1016/j.jarmac.2017.10.001.
Wojnowski, M. (2019). Presidential Elections as a State Destabilization Tool in the Theory and Practice of the Russian Info-Psychological Operations in the 20th and 21st Century. Przegląd Bezpieczeństwa Wewnętrznego, 11(21), 311–333.
Zillmann, D., & Bryant, J. (2011). Selective Exposure to Communication. London: Routledge.