The Global Story of Election Interference

Backgrounder

Introduction

Since citizens began communicating with one another online via email, messenger services and social networks, the character of democratic societies has changed significantly. New opportunities have arisen for greater diversity of opinion, participation and the exchange of information. And yet, risks and tendencies toward subversion are becoming increasingly apparent. In particular, a progressive social polarization can be observed, in which virtual confrontation is displacing the open exchange of views. This tendency has opened an ominous gateway for authoritarian regimes such as Russia, China and Iran, which have mobilized enormous resources in recent years to interfere in democratic processes in numerous countries around the world.

Foreign election interference has thus become a major threat to the universal right of all people to participate in the democratic process. It is a global phenomenon that has been observed in recent years in countries from Mexico to North Macedonia, from Ukraine to Kenya, from Taiwan to Georgia. And it has put the question of reshaping and regulating the digital space at the center of the political agenda of many democratic governments and parliaments.

Since the most prominent example of a foreign interference campaign – the 2016 U.S. presidential election – "election interference" has become a fixture in the vocabulary of politicians, analysts, and journalists worldwide. In particular, the meddling of Russian President Vladimir Putin in elections in the U.S. and other democracies around the world – though he has consistently denied any involvement – has been clearly documented by numerous investigations and studies. Yet Russia is far from the only state actor to use the tools of election interference. In recent years, the governments of China, Iran, Turkey, and other regional powers have also used illegitimate interference as part of their operational foreign policy to advance their interests in other states.

Often, the main goal is not exclusively to influence election results; rather, the aim is to erode citizens' faith and trust in the legitimacy and integrity of their democratic elections, institutions and representatives. To understand the full context of this new element in the ongoing systemic struggle between the free and democratic world and its opponents, this article addresses a number of fundamental questions.

What is election interference?

In coverage of democratic elections in recent years, the terms "election integrity" and "election interference" regularly pop up. But what is the story behind them? For the purposes of this publication, we define "election integrity" as the inviolability of the basic principles of a democratic election, namely that it must be "free, secret, and equal." Any attempt to manipulate the electoral process or the election results violates these principles and thus constitutes malicious "election interference." In a broader sense, the term refers not only to an election itself, but also to the integrity of the democratically legitimized institutions associated with it, such as the elected parliament, its deputies, and the government legitimized by the electorate.

Foreign election interference is by no means a new phenomenon; the use of disinformation to change citizens' minds or influence the outcome of elections is a strategy at least as old as the printing press. As early as 1924, the publication of a forged document (the "Zinoviev Letter") in one of Great Britain's most influential newspapers four days before election day significantly impacted the outcome of the United Kingdom's general election and caused an unexpected swing in opinion among British voters.

In recent years, however, foreign election interference has gone digital, and with Russia's interference in the 2016 US presidential election and reports of foreign interference in the Brexit referendum that same year, attention to this phenomenon has increased exponentially. The list of prominent examples grows every year. Reports of Chinese disinformation campaigns to discredit the China-critical presidential candidate in the 2020 Taiwan election, suspicions of Russian attempts to influence the 2019 election in the United Kingdom, evidence of Chinese fake accounts targeting democracy supporters in Hong Kong in 2019, and the massive amplification of false narratives about the Green party and its frontrunner Annalena Baerbock in the German elections in September 2021 are only a few of the recent examples in which digital influence operations of this type have reached the front pages of international media. Since early 2020, we have also observed a new and worrying layer to this threat: the so-called "Infodemic," i.e. the ongoing wave of disinformation and conspiracy theories surrounding the Covid-19 pandemic, fueled not only by foreign actors but, to a massive extent, by domestic actors and networks. Disinformation – the spread of false information to deliberately deceive the public – seems to have become the new "normal," and a recent study suggests that "the majority of people will see more false information than true information in 2022."[1]

What is in the election interference toolbox?

Although public debate often gives the impression that there is a universal pattern to foreign election interference operations, this impression is false: each intervention must be carefully tailored to the specific political context and vulnerabilities of the target country. Foreign actors therefore require a combination of methods adapted to their particular objectives. Many of these methods overlap and complement each other, and the use of one often indicates the likely use of another. For example, social media influence campaigns typically involve both disinformation and sentiment amplification, often in combination with political advertising. A phishing campaign against a political campaign or politician may indicate plans for a hack-and-leak operation.[2]

A look into the toolbox of election influence reveals a variety of measures that can be divided into three main categories:

  1. Manipulation of information – primarily through online disinformation, but also through fabricated or manipulated political ads on social media or the automated spread of disinformation through troll or bot networks;
  2. Cyber incidents, such as online attacks on parties, politicians, election authorities, and election infrastructure, or hack-and-leak operations in which emails or documents are stolen and selectively published to manipulate public opinion shortly before elections;
  3. Funding and support of proxy parties and political groups ("political grooming").

Common targets of election interference are:

  • Voters: Online information is manipulated to influence voter behavior; this process of manipulation weakens trust in the democratic process and in the accuracy of the information people receive. Moreover, disinformation creates confusion and can divert attention from campaign issues.
  • Politicians & political parties: Foreign influence and cyber operations can target politicians and political parties to coerce, manipulate, and publicly discredit individuals. Hackers can use private information about a candidate or his or her party to gain control over that person. Digitally stored party and voter lists can be obtained and the information sold or used to manipulate voter turnout, such as sending voters to the wrong polling place. Party data, including emails, can be hacked and leaked to the public to damage the party or candidate's reputation. Over time, these threats can have a negative impact on the party's reputation. Moreover, they can have the chilling effect whereby the potential benefits of running for public office are outweighed by the risks of running, in particular regarding a candidate's privacy and private life.
  • Elections & the election process: The actual elections can also be targeted by cyberattacks to either steal voter data, suppress voter turnout, or manipulate the election authorities' website. Moreover, electronic-based voting systems, unlike paper-based voting systems, are susceptible to cyberattacks on ballot results.

What is the role of social media in election interference?

More than 3.8 billion people worldwide are active on social media. The most popular categories are social networks, instant messengers, weblogs and forums. Compared to traditional mass media, one of the distinctive features of social media is the large amount of user-generated content, which has led to a strong "democratization" of political communication. Social media has, for example, empowered people in non-democratic countries around the world to raise their voices and escape state censorship, such as during the “Arab Spring” in 2011.

Every user can theoretically become a source of information at any time, which has led de facto to a breakdown of the old "gatekeeper" model, in which journalists and publishers selected and evaluated all news content based on professional journalistic standards. Since social media platforms, in contrast, were deliberately constructed as open, virtual meeting spaces without content moderation or fact-checking, they offer a wide range of opportunities for abuse. Accordingly, the new dominance of social media in political communication has led to an uncontrolled and increasing polarization, coarsening and aggressiveness of public discourse, fueled to a large extent by the massive dissemination of disinformation, including illegal hate speech.

Algorithms play a decisive role on most social platforms. These "rules of action," which automatically determine what content users see (or do not see), have a considerable influence on user behavior, yet remain largely opaque to users. In principle, the use of algorithms by social platforms is neutral; only their concrete design determines whether they become problematic. Particularly in the context of political decision-making in the run-up to elections, however, many of them in their current form carry considerable risks of promoting problematic content, for two reasons in particular:

  • Algorithms learn independently through interaction. Users are predominantly shown what they have often reacted to before. This makes it much more difficult for them to engage with new topics, and opposing opinions fade from view. The result is "filter bubbles" and "echo chambers," i.e., virtual rooms of like-minded people who constantly confirm each other's views. This is an ominous breeding ground for the spread of political disinformation and conspiracy narratives. And as the storming of the U.S. Capitol on January 6, 2021, demonstrated in a terrifying way, verbal violence in online chat groups and channels may well translate into real-world harm.
  • Algorithms permanently signal: "Stick around!" Social networks such as Facebook or YouTube are predominantly ad-financed. It is therefore in the companies' interest to keep users on the platform for as long as possible, because that is how they earn money. To this end, algorithms favor the spread of content that generates many reactions such as likes, shares, or comments. An emotionally charged or polarizing comment, e.g. on a government's Covid-19 policy, will be displayed more frequently and spread more widely than a factually formulated news item (see the sketch after this list).
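To make these two mechanisms concrete, the following is a minimal, purely illustrative Python sketch of engagement-based feed ranking. All names, weights and data structures (Post, engagement_score, personalization_boost) are invented for illustration only; real platform ranking systems are far more complex and are not publicly documented.

    from dataclasses import dataclass

    # Hypothetical post structure, invented for illustration.
    @dataclass
    class Post:
        topic: str
        likes: int
        shares: int
        comments: int

    def engagement_score(post: Post) -> float:
        # Reactions that keep users on the platform are weighted most heavily;
        # the weights are arbitrary placeholders, not any platform's real values.
        return 1.0 * post.likes + 3.0 * post.shares + 2.0 * post.comments

    def personalization_boost(post: Post, past_topics: list[str]) -> float:
        # Content resembling what the user engaged with before is boosted,
        # which over time narrows the range of topics shown ("filter bubble").
        return 2.0 if post.topic in past_topics else 1.0

    def rank_feed(posts: list[Post], past_topics: list[str]) -> list[Post]:
        # Posts are ordered by engagement multiplied by the personalization boost.
        return sorted(
            posts,
            key=lambda p: engagement_score(p) * personalization_boost(p, past_topics),
            reverse=True,
        )

    if __name__ == "__main__":
        history = ["covid policy"]  # topics the user has reacted to before
        feed = rank_feed(
            [
                Post("local news", likes=120, shares=5, comments=10),
                Post("covid policy", likes=80, shares=60, comments=150),  # polarizing post
            ],
            history,
        )
        print([p.topic for p in feed])  # the emotionally charged post ranks first

Even in this toy version, a single reaction-heavy post crowds out everything else, and the history-based boost narrows what a user sees over time – precisely the combination of effects described in the two points above.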

But for free and unbiased elections, democratic societies depend on citizens being able to make informed choices based on accurate information. Consequently, the increasing number of disinformation campaigns on social media by foreign states and domestic actors poses a critical challenge to democracies worldwide. When voters are confronted with misrepresentations and misinformation, their sense of security is disrupted and confusion is created. This can either directly influence the outcome of a democratic election or affect it indirectly by deepening social divisions and weakening trust in state institutions.

Who fills the Election Interference Library?

Among the actors that conduct the most sophisticated election interference campaigns, Russia continues to stand out, in particular with its influence operations during numerous democratic elections in Europe and the US. Next to Russia, China is also a major force in widespread disinformation campaigns and coordinated cyberattacks, especially in places close to its "sphere of influence" such as Taiwan and Hong Kong, but also in faraway targets such as Australia. Regional powers such as Iran or Turkey engage in similar tactics, often to advance political interests in their immediate neighborhood. New actors, especially in the context of disinformation campaigns, are right-wing and right-wing populist groups such as the Trump/Bannon network, which is trying to create a global ecosystem to support right-wing populist parties and candidates in countries such as Brazil and Hungary.

The Russian Playbook

Many experts and analysts have identified geopolitical objectives as the main reason the Russian government engages in large-scale, coordinated election interference campaigns, in particular in regions close to Russia such as Estonia, Georgia and Ukraine, but also in Europe and the US. The strategic objective of these influence campaigns is often the same: In its own region, the Kremlin uses targeted disinformation operations to manipulate voter opinion and harm parties and politicians critical of Russia or, conversely, to support pro-Russian parties. In the case of democratic states, Russia interferes in order to sow a fundamental and corrosive distrust of democratic institutions and their representatives in specific segments of the population. Furthermore, it purposefully polarizes society by targeting specific groups on contentious issues, further fueling their hatred of other segments of society, as seen in the 2016 and 2020 US presidential elections. During the 2016 elections, for example, social media accounts affiliated with Russia's GRU military intelligence service disseminated not only pro-Trump content but also content supporting the goals of the Black Lives Matter movement. Each set of accounts attempted to portray the other group as a threat, further polarizing specific parts of U.S. society.

In relation to countries close to Russia's sphere of influence, disinformation campaigns are also used to portray Russia in a better light and foster favorable perceptions of the country and its foreign policy. In these cases, such campaigns are often linked to specific incidents, such as the illegal annexation of Crimea or the shooting down of Malaysia Airlines Flight 17 over Ukraine in 2014.

The means by which Russia and other actors conduct such operations follow a similar format. The Russian military engages in large-scale operations to spread disinformation via social media, in particular through bot networks and fake accounts. Official state media outlets such as Russia Today and Sputnik also play an important role in disseminating false narratives and disinformation, reaching higher online engagement rates than many national media outlets.[3] "Hack-and-leak" operations are an additional tool for manipulating public opinion: Politicians, parties and journalists are hacked, for example through phishing emails, in order to gain access to sensitive data. This data – or sometimes fabricated data – is subsequently released to the public. Whether through outlets such as Wikileaks or Russia Today, or on social media via fake accounts, disinformation is injected into the traditional media ecosystem in order to further exacerbate voters' distrust of government.

In general, the Russian playbook must be viewed as a long-term strategy. Many of the methods it contains are used to varying degrees long before elections and intensify as campaigns approach. The Kremlin's ongoing disinformation war against Europe, for example, shows no signs of abating, while reports of cyberattacks are becoming more frequent.[4] This long-term manipulation strategy traces its origins to the Soviet concept of "active measures": a slow process of ideological subversion and psychological warfare over several years aimed at altering the target's perception of reality and causing it to take actions that benefit the adversary.[5] Moscow's current and ongoing efforts are a sophisticated adaptation of this strategy to the digital age.

The Chinese Playbook

Beijing has not yet fully adopted the aggressive Russian interference playbook; at least for now, it shies away from deliberately interfering in Western democratic elections on a large scale. However, domestically, in its regional sphere of influence and on issues critical to its reputation around the world, China uses massive disinformation campaigns to influence public opinion in its favor. Taiwan and Hong Kong are most affected by such campaigns. Both are in dispute with the Chinese government and defend their political autonomy, which is clearly at odds with Chinese interests. However, most of this disinformation is targeted at China's own population, which does not enjoy freedom of speech or free access to independent information in the first place. For instance, the massive waves of peaceful protest in Hong Kong were portrayed in China as violent protests by terrorists.[6] On topics that independent media have difficulty fully verifying, China employs sophisticated disinformation campaigns to influence public opinion, particularly regarding the origin of Covid-19 and the detention of Uighurs in Xinjiang province. Claims that the Covid-19 outbreak was caused by a leak from a laboratory in Wuhan were not only denied, but actively countered by spreading disinformation claiming, for example, that the United States was the true source of the virus.[7] Similarly, Chinese disinformation narratives argued that Western democracies were ill-equipped to combat the virus, while China had successfully contained the outbreak. Though clearly false, this narrative nonetheless represents an important component in the broader political clash between China and the free world over which form of government provides a better life for its citizens. Beijing uses large numbers of fake social media accounts to push its messages and has increasingly relied on the kind of trolls and bots that Russia deploys. Chinese diplomats spread distorted and outright fake news, and the major Chinese state media spread the government's stories.

The Iranian Playbook

Tehran appears to follow the Russian playbook, albeit less professionally, as its tracks are more easily discovered by international authorities.[8] The U.S. government recently reported heightened activity by Iranian troll farms linked to the Quds Force of the Islamic Revolutionary Guard Corps (IRGC). Coordinated disinformation activities are conducted in connection with specific events, such as the U.S. withdrawal from Afghanistan, the U.S. election, or the ongoing conflict between Israel and Hamas.[9] State media are also used as vehicles for the dissemination of false narratives. In 2021, the U.S. Department of Justice seized 33 websites affiliated with the Iranian government that were hosted on U.S. domains in violation of international sanctions.[10] Similar to Russia, and in particular regarding US domestic issues, such activities aim at social division and polarization of the public, alongside portraying Iran in a better light vis-a-vis the global community. While it appeared that Russia actively supported Trump's election in 2016, Iran conducted operations that sought to prevent his reelection.[11] This illustrates how differing geopolitical interests determine who becomes the target of a disinformation campaign.

The New Populists’ Playbook

Right-wing networks are also increasingly using disinformation to advance their political agendas and sow distrust of government. This is in contrast to Russian, Chinese or Iranian efforts, whose disinformation campaigns are coordinated by the state. The fact that these individuals are citizens of the Western democracies in which they live creates difficulties when it comes to stopping their activities, as freedom of speech rightly remains an important principle and foundation of democracy. Examples from the recent German federal election show that disinformation spread, for example, via the Russian state media outlet Russia Today or other networks associated with the Russian government is shared on social media predominantly by right-extremist networks and the right-populist party "Alternative for Germany."[12] These narratives find an audience by appealing to the sentiments of their political followers, namely a lack of trust in the government, fear of state overreach, and even a disregard for democracy itself. Of particular concern is that right-extremist networks in both Germany and the United States actively spread disinformation about the fundamental legitimacy of elections.[13] In the case of the U.S., such narratives were fueled by former President Trump and the Republican Party, leading to the devastating events of January 6. We can already see these tactics spreading to other countries, such as Brazil, where national elections will be held in 2022. Recent reports have shown that Trump allies are actively helping President Bolsonaro spread disinformation about the legitimacy of the upcoming elections in Brazil.[14] Using tactics similar to President Trump's, they are defaming the opposition as criminals, undermining trust in the government, and creating alternatives to mainstream social media platforms. Most importantly, both Bolsonaro and Trump are exploiting the intense polarization of society in their respective countries to advance their political interests. Divided societies are a perfect breeding ground for such disinformation to take root in people's minds, and it seems that both politicians are laying the groundwork for a challenge to the next democratic elections.

What does election interference look like in action?

Attempts at foreign election interference did not begin with the 2016 U.S. election; they were already evident 15 years ago, and they have changed significantly since then. While it was once easier to trace the origins of interference, today's attempts at foreign meddling are much more effectively concealed. Nevertheless, a systematic comparison of specific influence operations from past years can reveal recurring patterns and emerging trends, helping other states prepare safeguards for their upcoming elections.

United States 2016 and 2020

In 2016, Russia heavily interfered in the U.S. presidential elections through a coordinated series of cyberattacks and a massive disinformation campaign on social media platforms targeting the American public. According to the U.S. Intelligence Community and the Mueller Report, it is beyond doubt that the election interference was ordered by the leadership of the Russian government with the presumed objective of bolstering the chances of Donald Trump’s victory.[15] The cyberattacks, conducted by the Russian military intelligence service GRU, specifically focused on hacking into the servers of several entities associated with the Democratic Party, for example the email accounts of John Podesta, chair of Hillary Clinton’s presidential campaign, the Democratic Congressional Campaign Committee (DCCC) and the Democratic National Committee (DNC). The stolen data was subsequently released over social media and specific media outlets, such as Wikileaks, thereby undermining the trust of the electorate in the legitimacy of the election, and damaging the voters' perception of Hillary Clinton. Ultimately, Donald Trump also owes his election victory to these externally coordinated influence and disruption operations. Many observers – and perhaps even the masterminds of the foreign influence activities – could hardly have imagined at the beginning of Trump’s term that he himself would in turn become the master gardener tending these malicious seeds over the following four years.

Given the above, it should come as no surprise that the 2020 presidential election, unlike the 2016 election, was marked by a sharp increase in domestic or homegrown disinformation disseminated across the U.S. media ecosystem. According to the Washington Post's Fact Checker column, President Trump made 29,508 misleading or false claims between his inauguration and November 5, 2020. This significantly changed the character of disinformation in the U.S. from 2017 onward. Moreover, in an insidious twist, the newly elected president and his immediate entourage invoked the term "misinformation" as a method of attack from the beginning of his term, using it to disqualify press outlets, newspapers and TV stations he disliked as "fake news." In so doing, Trump adopted one of the main tactics of the Election Interference Playbook: publicly questioning the credibility of traditional media.

This onslaught of disinformation was particularly challenging because it came from a domestic source. When foreign actors deliberately influence public debate in a country through the massive dissemination of disinformation, the government and media can more easily counter the spreading narratives by publicly attributing the campaign to these actors or by working with social media platforms to block accounts associated with the perpetrators. Increased awareness and understanding of the actors who typically attempt to interfere in an election can also increase the public's resilience to such disinformation, as the electorate is aware of the possibility of such threats. However, when disinformation is spread by domestic actors or, as in the 2020 U.S. presidential election, by the president himself and parts of the executive branch, it is much more difficult to mitigate its spread. Moreover, from the beginning of his term to its end, President Trump was supported by high-reach media outlets such as Fox News and One America News Network, which had great influence on the messages disseminated to his electorate. Finally, since the majority of elected Republicans also spread the disinformation that the election was stolen, a perfect storm was created in which voters who supported President Trump had little choice but to believe the "big lie" narrative. In a sense, the 2020 election was thus the culmination of a years-long Russian disinformation campaign aimed primarily at sowing massive distrust among the American public toward its own government and democracy. Trump's election in 2016 may have been a welcome side effect, although whether Trump's presidency actually benefited Russia geopolitically remains unclear. But the overall goal of the 2016 election interference campaign, namely to divide U.S. society to the point of internal destruction, was indeed achieved, as evidenced by the storming of the U.S. Capitol on January 6, 2021. This historic event unquestionably confirmed the thesis of the authors, as well as many other analysts, that the overarching objective of foreign election interference is not exclusively to disrupt the election itself, but to divide society and undermine democracy.

It is worth highlighting that Donald Trump's false narrative of the stolen election, as well as many other false messages, has continued to spread to this day, although the way social media platforms counter disinformation has changed drastically since 2016. In 2016, Russia benefited from the lack of countermeasures and tactics that social media platforms could have employed to prevent massive abuse of their services. By 2020, however, all major networks were aware of the modus operandi of spreading false news via social media, and appropriate countermeasures were taken. These included deleting and restricting accounts, labeling content for its truthfulness (especially in relation to Covid-19), and providing platform users with accurate information about the elections. Nevertheless, disinformation continued to spread and led to the devastating events in the U.S. capital on January 6, 2021. In her testimony before the U.S. Congress, Frances Haugen, a former product manager on Facebook's Civic Integrity team turned whistleblower, explained that the measures Facebook had taken to combat disinformation were relaxed again shortly after the November elections, which evidently contributed to President Trump's ability to spread his false narratives on a massive scale in the following weeks. In short, combating disinformation is a 24/7, 365-day endeavor that must be actively pursued outside of the election cycle as well.

United Kingdom 2016

Foreign interference in the 2016 British Brexit referendum by Russian state actors remains unproven. However, many analysts and experts, citing relevant evidence, are firmly convinced that Russia attempted to influence public opinion in the UK in order to swing the Brexit vote in favor of the "Leave" campaign. At its heart was a massive disinformation operation, which at the same time made it harder to measure the effect of foreign influence. In 2020, the UK Parliament published a non-partisan report assessing possible interference in the Brexit referendum, but it left open whether Russia was actually involved in such an operation. The reason was that the British government had never launched an official investigation into the allegations; since such an investigation could have indirectly called the legitimacy of the referendum result into question, it was never carried out.

France 2017

Just days before the 2017 French presidential election, several gigabytes of data were stolen from Emmanuel Macron's campaign headquarters, including nearly 20,000 emails. Together with a widespread disinformation campaign, this hack formed the basic framework of the Russian influence operation. In the case of the French elections, however, it is important to note that the operation does not appear to have been successful, especially since its target, presidential candidate Emmanuel Macron, went on to win the election. On a broader level, the French case therefore serves as a blueprint for fending off malicious interference attempts: a less polarized, "healthy" and objective media environment in which sources and information are vetted, as well as anticipation of such events and clear contingency plans for responding to hack-and-leak operations, are among the factors that can significantly mitigate the impact of large-scale election interference campaigns.

Germany 2015 and 2021

In 2015, the German Bundestag became the target of a massive cyberattack in which 16 GB of data were stolen, including from the computers of Chancellor Angela Merkel and numerous other members of parliament. Russia, in particular the military intelligence service GRU, was quickly suspected as the perpetrator, but it was not until 2020 that the Federal Prosecutor General issued the first arrest warrant against a suspected Russian cyber spy for the hack.

The 2021 German election was also characterized by a sophisticated interference campaign, including at least three cyberattacks on parliamentarians at the federal and state levels, as well as on the office of the Federal Election Commissioner and civil society organizations. In addition, the disinformation spread by the Russian state media outlet "RT DE" (formerly Russia Today Germany) was primarily directed against the Green Party and its candidate Annalena Baerbock, and to a lesser extent against the Christian Democratic Union and its frontrunner Armin Laschet. These narratives focused in particular on Covid-19, the legitimacy of postal voting, and a major flood in the summer of 2021. As with other disinformation campaigns, the narratives disseminated by foreign actors picked up on divisive and polarizing issues in the target country's public debate. It is noteworthy, however, that the campaign did not appear to have any significant impact on the course of the election campaign or the election outcome.

Ukraine 2014

Long before the 2016 U.S. election, Russia's massive interference in the democratic elections of another sovereign state could be observed in Ukraine. In the final days before the 2014 national elections, a sophisticated, multi-layered cyberattack was carried out in an attempt to influence the outcome of the Ukrainian vote. The country's central election system was infiltrated and the vote-counting system rendered inoperable. Malicious software was also installed that was designed to display fake results declaring the ultra-nationalist candidate Dmytro Yarosh the winner. Finally, the website of the Central Election Commission was shut down. Following the attack, Ukrainian authorities took steps to ensure that a free and fair election could take place, from which Petro Poroshenko, a candidate critical of Russia, ultimately emerged victorious. Interference in Ukraine's elections is part of a broader strategy of hybrid warfare waged by the Russian government against the country, culminating in the illegal annexation of Crimea.

Estonia 2007

In 2007, the websites of the Estonian Prime Minister, President and Parliament were repeatedly attacked over a period of three weeks, limiting their functionality, disrupting the country's political system and thereby depriving Estonia of its ability to exercise its sovereign functions. While no definitive attribution could be made, it is broadly assumed that the Russian Federation was the perpetrator and that the attacks were carried out in response to the Estonian government's plan to relocate a Soviet-era war memorial of historical significance to Russia. This case represents one of the first digital interference attempts by the Russian government against the integrity of a democratic state.

Conclusion & Outlook: The Digital Power Struggle and How Democracies Should Prepare For It

Most election interference playbooks – whether domestic or foreign – have one thing in common: they are not aimed exclusively at influencing specific election results. Their ultimate goal is to undermine and destroy citizens' long-term confidence in the legitimacy of their democratic institutions (government, parliament and political parties) as well as their democratic processes (elections). Consequently, providing fact-based information and strengthening both the digital resilience of state institutions and the digital media literacy of their representatives and citizens should be a top priority for all democratic governments around the world. To support these efforts, we offer the following basic recommendations as a starting point for national discourses on how best to protect the integrity of each country's democracy.

Debunking and countering disinformation

Countering disinformation is essential in order to push back against malign actors' efforts to undermine democracy. A multi-pronged, simultaneously implemented whole-of-government approach is needed. Regulating digital campaigning to ensure free and fair elections, regulating social media platforms so that disinformation does not spread unchecked online, enhancing citizens' media literacy so they can navigate the information jungle, and uniting democracies at the geopolitical level to deter malign actors and expose networks that seek to sow distrust in the integrity of elections – these are just some of the efforts needed to develop long-term, holistic, and effective strategies against the spread of disinformation.

Enhancing media literacy

Enhancing the digital media and news literacy of citizens and policymakers alike indirectly reduces the effectiveness of false narratives and disinformation spread through digital media, as individuals become better able to assess the veracity of a given piece of information. Where does the information come from? Who published it? What was the publisher's intention? These simple questions can equip readers with the ability to distinguish true from false, and disinformation from information. However, since the digital component of media literacy has not received sufficient attention to date, extensive investments must be made. The establishment of national digital media literacy authorities, the inclusion of digital media literacy in school curricula, and targeted approaches for older people who have not grown up with social media are possible measures to improve the public's media literacy.

Regulating the digital space in democracies

Social media have become fundamental components of the global infrastructure of our information societies. Most of these companies are based in the U.S., but users around the world interact with each other in numerous languages – this is the reality of the 21st-century global digital society. Opinions on how to regulate social media companies or the Internet currently differ widely, in some cases along typical geopolitical fault lines. And yet, all agree that the abuse of freedom of expression, the deliberate spread of disinformation, or even complete control of the Internet are major threats to democracies around the world. Legislative initiatives such as the Digital Services Act (DSA), presented by the European Commission and recently adopted by the European Parliament, represent a step in the right direction toward holding social media companies accountable for possible election interference through their platforms. The new legislation departs from the previous legal approach, under which social media companies were not liable for content published by their users. Ultimately, however, the international discourse on preventing election interference that is currently in full swing among democracies around the world is part of a broader debate about how to deal with the steady rise of digital authoritarianism.

Enhancing cyber defense capabilities

Cyber disruption and cyberattacks are among the offensive tactics used in foreign election interference operations. Protecting elections and democratic institutions against such malicious interference by digital means is therefore of utmost importance for all democratic states. Elections over the past years have proven vulnerable to such cyberattacks, the most famous case being the hack of servers affiliated with the Democratic Party in the run-up to the 2016 US elections. In a democratic election in the digital age, not only critical infrastructure, such as vote tallying and the certification of results, is a potential target, but also the political candidates themselves, for example as targets of ransomware attacks, malware spam, or phishing operations. Enhanced technical capabilities, coordination among government institutions and the intelligence community, and oversight by the legislative branch are indispensable. Recent examples from Germany show a shift in strategy. Part of this new approach is to coordinate a collective effort of the different authorities responsible for protecting against cyberattacks, including the Federal Office for the Protection of the Constitution, the Federal Intelligence Service, and the Military Counterintelligence Service. The Federal Foreign Office is the coordinating institution and has the authority to publicly attribute a cyberattack to a foreign government. Together with the public attribution of attacks to the respective aggressors, this strategy could serve as a template for other democratic countries on how to respond to and deter election interference in the future.


[1] Scheidt, M. (2019). The European Union versus External Disinformation Campaigns in the Midst of Information Warfare: Ready for the Battle? EU Diplomacy Paper, College of Europe, Bruges.

[3] EUvsDisinfo, "Figure of the Week: 100,000," September 20, 2021, https://bit.ly/3xe7fK0.

[4] Tagesschau, "Vermehrte Cyberattacken aus Russland," June 15, 2021, https://bit.ly/3CLzFvS.

[5] Special Counsel Robert S. Mueller, III, March 2019, “Report On The Investigation Into Russian Interference In The 2016 Presidential Election”, Volume I of II, p. 14; see: https://www.justice.gov/storage/report.pdf.

[6] BBC, "The disinformation tactics used by China," March 12, 2021, https://www.bbc.com/news/56364952.

[7] Ibid.

[8] Time, "Exclusive: Iran Steps up Efforts to Sow Discord Inside U.S.", June 9, 2021, https://time.com/6071615/iran-disinformation-united-states/.

[9] Ibid.

[10] Aljazeera, "US Seizes Three Dozen Websites used for Iranian Disinformation," June 23, 2021, https://www.aljazeera.com/news/2021/6/23/us-seizes-three-dozen-websites…

[11] Ibid.

[12] Avaaz, "Deutschlands Desinformations-Dilemma 2021," September 6, 2021, https://secure.avaaz.org/campaign/de/bundestagswahl_2021/.

[13] Institute for Strategic Dialogue, "Desinformationskampagnen gegen die Wahl: Befunde aus Sachsen-Anhalt," June 21, 2021, https://www.isdglobal.org/wp-content/uploads/2021/06/Bericht-Landtagswahlen-Sachsen-Anhalt-Final.pdf.

[14] New York Times, "The Bolsonaro-Trump Connection Threatening Brazil's Elections," November 11, 2021, https://www.nytimes.com/2021/11/11/world/americas/bolsonaro-trump-brazi….

[15] Office of the Director of National Intelligence, January 6, 2017, “Assessing Russian Activities and Intentions in Recent US Elections,” p. ii; see: https://www.dni.gov/files/documents/ICA_2017_01.pdf.


The opinions expressed in this text are solely those of the author/s and do not necessarily reflect the views of the Heinrich Böll Stiftung Tel Aviv and/or its partners.