Foreign Affairs Subcommittee on Europe, Eurasia, Energy, and the Environment
“Russian Disinformation Attacks on Elections: Lessons from Europe”
July 16, 2019
Testimony by Jakub Kalenský, Senior Fellow at the Atlantic Council
Dear Chairman Keating, Ranking Member Kinzinger, Distinguished Members of the Subcommittee, thank you for the invitation to speak before you today; it is an honor.
I will try to describe the threat that the Kremlin's disinformation campaigns pose to electoral processes in Europe, as well as the various solutions that have been undertaken to counter this threat. Let me state at the outset that the picture I will paint may at times look pessimistic. However, I firmly believe that the West, both Europe and the United States, has all the necessary tools and capabilities to counter this threat successfully. We in Europe just need to do much more, and to be much more robust and much more determined, in order to defend ourselves against the information aggression that the current regime in Moscow is conducting. It is my sincere hope that through my testimony here today, I can help ensure that the United States does not repeat Europe's mistakes.
Currently, I find the European response insufficient, and my fear is that because of this, the organizers of the disinformation campaigns are winning. In other words, the Western world is currently losing the information war that the Kremlin is waging, mostly because we in the West do not realize that we are indeed in such a war, what this war has already cost us and what it will cost us in the future, and that we need to fight back to defend our values against an aggressor that is trying to undermine us.
I will try to describe why and against which threats we need to defend ourselves, and how that might be done.
The infrastructure of the disinformation ecosystem
The massive export of Kremlin disinformation began with Russia's invasion of Ukraine. There had been disinformation campaigns aimed at audiences inside Russia before, and there had been isolated disinformation incidents targeting audiences abroad; but the massive export of the Kremlin's disinformation beyond Russia's borders, in dozens of languages and through hundreds and thousands of channels, is new since 2014 and Russia's annexation of Crimea and military aggression in eastern Ukraine. NATO's then-top military commander, General Philip Breedlove, called it the "most amazing information warfare blitzkrieg we have ever seen in the history of information warfare." It has been ongoing every day since that moment. Thus, the initial blitzkrieg has evolved into a sustained campaign of long-term aggression.
The confrontational approach in which information aggression is used, regardless of whether there is peace or war, is codified in many official documents of the Russian Federation. Theoretical articles by Russia's military leadership, which discuss "leaking false data" and "destabilizing propaganda" as parts of their toolkit, also help to enshrine information aggression in Russia's geopolitical strategy.
This attitude is publicly pushed by the very top echelons of the Kremlin. Vladimir Putin’s spokesperson, Dmitry Peskov, stated on Russian TV, “we are in a state of an information war. (…) First of all with the Anglo-Saxons.”
The pseudo-journalists dutifully serving the regime go along with this agenda. Margarita Simonyan, the head of the Russia Today (RT) TV channel, describes her network as an “information weapon,” a parallel to the Ministry of Defense. The Kremlin’s chief propagandist Dmitry Kiselyov (in 2019, still the only Russian pseudo-journalist on the EU’s sanctions list) has been even more explicit: “Today, it is much more costly to kill one enemy soldier than during World War II, World War I, or in the Middle Ages. (…) [But] if you can persuade a person, you don’t need to kill him.” Russia is probably the only country in the world where the regime’s “journalists” justify their job as a less costly alternative to killing people.
Their subordinates follow these instructions and, day after day, keep spreading lies. No matter how many facts are presented about the Russian invasion of Ukraine; Russian war crimes in Syria; the murder of nearly three hundred civilians on flight MH17 by Russian-made and Russian-operated weapons; Russian assassinations in Europe, like the one in Salisbury, England; state-sponsored doping in sports events; the Kremlin’s information operations and cyberattacks targeting elections all around the globe; or any other event that is of importance to the Kremlin, its disinformation ecosystem will continue lying, misleading audiences, and spreading disinformation stories and false counter-accusations.
And the Kremlin rewards these lies. Three hundred pseudo-journalists who were spreading false stories that there are no Russian troops in Crimea, and thus weakened and slowed down the Western reaction and, in effect, facilitated the annexation of the peninsula, received medals from President Putin for their “objective” coverage. Sixty Russian journalists received military awards for participating in the war in Syria. The Kremlin sees these “journalists” as part of the rank and file of its military.
The messages spread by the outlets directly controlled by the Kremlin are carried into other languages via local-language versions of RT and Sputnik. These messages then merge into a much larger ecosystem consisting of various "alternative" media, which hide their affiliation with the Kremlin and pretend to be totally independent; in fact, they frequently parrot the same lies broadcast by Russian state media. Most recently, the Slovak intelligence service identified these so-called alternative media as the most important tool for delivering propaganda campaigns that undermine the EU and NATO, spread mistrust about official sources of information, and exacerbate divisions in society. This is a pattern that can be observed in many other European countries.
This synergy between the Kremlin-controlled and Kremlin-influenced ecosystem is further amplified by well-organized operations on social media, by official Russian representatives, and unfortunately also by non-Russian actors that advance the Kremlin’s interests, either intentionally or unwittingly. These can be politicians, academics, journalists, or other influencers who spread Kremlin-originating disinformation for various reasons, including corruption, ignorance, or the simple need to attract attention or challenge authority. Several recent reports indicate that Russian disinformation operatives are now increasingly focusing on domestic actors who would spread Kremlin-originated disinformation for them, thereby laundering the information and blurring its source.
To give an example, reporting about the Ukrainian presidential election that took place in April, the New York Times wrote: “Unlike the 2016 interference in the United States, which centered on fake Facebook pages created by Russians in faraway St. Petersburg, the operation in Ukraine this year had a clever twist. It tried to circumvent Facebook’s new safeguards by paying Ukrainian citizens to give a Russian agent access to their personal pages.” It is the same tactic that the Soviets used during the legendary Operation Infektion, when they planted the disinformation that AIDS had been created by the CIA into an Indian newspaper to obscure the KGB origin of the disinformation.
The aim is to maximize the number of possible sources spreading the same disinformation messages as often as possible, in order to create an impression of seemingly independent sources confirming each other’s message. The repetition of the message leads to familiarity with the message, and the familiarity leads to acceptance.
The messages and the effect of the disinformation ecosystem
The strategic objective of the overall disinformation effort is very simple: to weaken and destabilize the West at every level. These levels include intergovernmental organizations, such as NATO and the EU, individual states, regional administrations, governing coalitions, political parties, and all the way down to groups within society. Vladimir Putin is unable to make Russia more competitive on the global stage, and weakening Russia's adversaries is the only way the Kremlin can advance in what it treats as a zero-sum game.
In this effort, the disinformers are spreading heavily polarized messages that trigger strong emotions and sow discord. They spread conspiracies that undermine trust in reliable sources of information; support radical and anti-Western elements in the targeted societies; promote anti-Western, anti-liberal, and anti-democratic politicians; and denigrate politicians who defend Western, liberal, and democratic values, because democracy and the rule of law threaten the survival of the current regime in the Kremlin.
The disinformation campaign must also protect itself, which often leads to the disinformers attacking those who uncover their information aggression and raise awareness about it, whether they are journalists, NGOs, civil servants, or politicians.
The precise content of the messages varies with the audience that is targeted. The disinformers have one set of disinformation messages for people in the east of Europe, where you can read that necrophilia is an accepted norm in the EU, and a different set for people in Western Europe, where nobody would believe inventions about widespread sexual perversion but many could believe that Ukrainians are Nazis simply because they wear the Ukrainian national symbol.
Often, there are differences even within one country, because different socioeconomic groups have different sensitivity to various topics. It is easier to stoke irrational fear of migrants in the mind of a lonely pensioner living in the countryside than to do so with a diplomat living in the capital. Similarly, the tools and channels used to deliver the disinformation to an audience will be different, and social media is not always the most important channel.
The aim is to find those topics that stimulate the strongest emotions, as an audience driven by strong emotions will become irrational and more vulnerable to disinformation. Therefore, the disinformation machine focuses on the most polarizing topics such as immigration, LGBTQ issues, and the grievances of and prejudices against national and racial minorities. As the Czechoslovak defector Ladislav Bittman wrote, a disinformer is akin to an evil doctor who makes a precise diagnosis of the maladies afflicting his "patients," but then tries to make those weaknesses and illnesses worse.
The disinformation campaign also spreads wild accusations targeted at individuals, organizations, and states that the Kremlin perceives as adversaries. Nordic countries are accused of genocide against Russian children. The French, Americans, Belgians, Germans, British, Ukrainians, and all of Europe are accused of conducting terror attacks against their own citizens. The Baltic countries, Germany, the United States, and Europe are regularly accused of Nazism. The presenter who won the most prestigious Russian TV award for "best educational program" is the same man who is behind other programs claiming that Europe is a kingdom of gays who are trying to break children's psyches and force them to change sexes.
The United States is frequently demonized as a fascist dictatorship, a power occupying or controlling Europe, orchestrating color revolutions all over the world, a warmonger unleashing conflict practically on a weekly basis, and an existential threat to Russia.
Along with Ukraine, the United States is the top target of the pro-Kremlin disinformation campaign. Damaging the image of these countries has wider consequences. It will be harder to negotiate Ukraine's accession to the EU or to NATO if large parts of our populations believe the Kremlin's lies about that country. The derogatory campaign against the United States damages the American image in Europe, weakens Transatlantic relations, and makes it harder for the United States to further its interests in an environment that has been manipulated into hostility.
Despite the absurdity of some of these messages, there is evidence that they gain traction among certain target audiences. Throughout Europe and also in North America, there are documented cases where local actors, including high-level politicians, have repeated, and thus further amplified, messages from the Kremlin's disinformation ecosystem. Even in the Netherlands, the country that lost the most citizens in the downing of flight MH17, you find influencers repeating Kremlin-originated lies whitewashing the real culprit of this horrible tragedy, such as the far-right leader Thierry Baudet.
Some opinion polls show that a significant portion of the population can be vulnerable to a sustained disinformation campaign. According to one poll, 80 percent of Bulgarians did not believe that Moscow orchestrated the poisoning of the Skripals in Salisbury, England. According to another poll, half of the Czech population does not recognize that the claim that the EU is supposedly organizing illegal migration is a lie.
We need more opinion polls to show us the precise scope of the damage and the vulnerabilities that exist. If conducted regularly, polls could also show us whether the problem is getting better or worse, and whether our counter-measures are effective. Without measuring the damage and the impact of our efforts, we are just shooting in the dark. Unfortunately, I am not aware of any organization in the world that focuses systematically on such mapping.
Apart from manipulating public opinion, there are even more dramatic results of disinformation campaigns. The man who fired a rifle in a Washington restaurant in 2017 believed he was saving children from the pedophile conspiracy known as Pizzagate, a disinformation campaign amplified by Russian trolls. Earlier this year, a Czech pensioner was convicted of terrorism because he caused rail crashes that were intended to resemble jihadist attacks. The man was brainwashed by anti-Muslim propaganda that was amplified by groups that take pro-Kremlin stances, including extremist political parties, in the Czech Republic.
Importantly, these information operations go hand in hand with other influence operations and active measures: supporting the European far-left and far-right, supporting paramilitary and martial arts groups, recruiting fighters for the war in Ukraine from European countries, and aiding other similar activities. Again, all this is done in order to advance the main goal: weaken the West. Manipulating people with guns, or people who are prepared to use physical violence, and misleading them so that they believe they are under threat and have to use every means possible to defend themselves and their “in-group” – this is one of the more reliable ways to destabilize a society.
It is necessary to keep this larger background in mind in order to fully appraise information operations targeting democratic processes, including elections and referenda. Looking at only the last few weeks before elections is like looking at the last five minutes of a basketball game in which one side is already thirty points ahead; the game is already decided, and there is not much that is relevant to be seen in the last five minutes.
If there is a long, ongoing, well-targeted disinformation campaign focusing, for example, on migration, which in some cases has spread lies like the one that migrants have made nine Italian nuns pregnant, this shapes the information environment in a way that helps political actors who are using the fear of migration in their own campaigns. This was the case in Italy, where the Russian state media outlets Sputnik and RT boosted their anti-immigration content a full year before the 2018 parliamentary elections, with messages like, “in 2065, quota immigrants in Italy could exceed 40% of the total population,” as shown by research done by Alto Analytics. The disinformation campaign that potentially influences the outcome of a particular election is not an isolated event; it goes on for months or even years before the election itself, to sow the seeds of vulnerability.
Various researchers and journalists have identified pro-Kremlin disinformation campaigns (to a greater or lesser degree) in the following elections and referenda:
- Scottish independence referendum in 2014
- Ukrainian elections in 2014
- Bulgarian local elections in 2015
- Dutch referendum about the Association Agreement between the EU and Ukraine in 2016
- Brexit referendum in 2016
- Austrian presidential elections in 2016
- Italian constitutional referendum in 2016
- French elections in 2017
- German elections in 2017
- Catalan referendum in 2017
- Czech presidential elections in 2018
- Italian parliamentary elections in 2018
- Macedonian name referendum in 2018, and the Russian activities connected to that in Greece
- Ukrainian presidential elections in 2019
- Slovakian presidential elections in 2019
- European parliament elections in 2019
I cannot guarantee that this list is exhaustive. But just from this brief overview, we can see that pro-Kremlin disinformation activity is definitely not becoming less aggressive.
Apart from the effect on public opinion that could be quantified if measured properly, there are two more effects that worry me.
The first one is that the disinformers are gaining new knowledge about our audiences every single day. They gain new knowledge about who buys into disinformation messaging, who advocates it, and who spreads it. From this point of view, they already know our audiences better than we know them. The disinformers have also built a robust infrastructure for spreading disinformation, enhancing it and cultivating it on a daily basis. They are able to regularly identify new channels and new individuals who will spread disinformation for them. This infrastructure and this experience can be used for any purpose in the future, and I am afraid we are not prepared for such a reality.
The second is that the longer the disinformation campaign is in effect, the more people begin to perceive it as the new normal. We saw this effect around the recent EU Parliament elections, when some observers even claimed that Russian disinformation was in retreat. The only relevant data, from the European Union's East StratCom Task Force, show that rather the opposite is the case: the number of disinformation cases the team identified this year doubled compared to the same period in 2018. But some European observers have grown so used to the previous level of Russian disinformation that they no longer perceive it as something unusual. I find this trend very worrying, and I am sure that the Kremlin is quite happy about it, since it whitewashes their aggression.
What we see from the Russian side is a very clear understanding of the battlefield they have chosen, what their goals are, and a very high level of determination to achieve them. Unfortunately, this level of clarity does not exist in most Western democracies.
Defense and countermeasures
There are various approaches used to counter the Kremlin’s hostile information operations in almost every European country. Among them, I see four basic lines of effort:
- documenting the threat, getting a better understanding of what is happening,
- raising awareness – which means exposing the threat, communicating it to audiences in order to educate them, inoculating them to some degree, and attracting other actors who can join in the effort and help educate new audiences,
- mitigating the weaknesses that the aggressor exploits,
- challenging the aggressors and punishing them by making them pay a serious price for their efforts to undermine our societies. This is perhaps the most sensitive area, and the most frequently overlooked, but, unlike all the others, it may provide the best chance to actually stop the information aggression.
In each of these four lines of effort, multiple tactical measures can be undertaken. Some of them are better undertaken by governments, which can commit significant amounts of funding, focus on a topic even if the media loses interest, and coordinate actions nationally. Other efforts are better undertaken by civil society, which does not operate under the constraints of government and can more nimbly and aggressively communicate with a respective audience. Some efforts are best undertaken by the media, while others are best done by private businesses, including the social media platforms. And, obviously, different societies will pursue different approaches, because countries have different legal environments, differing sensitivity to the topic, different levels of media literacy, and so on.
It is necessary to pursue all of these lines of effort, ideally at the same time and in a coordinated way. Picking just some of these solutions and ignoring the others is unlikely to result in success.
- Documenting the threat and gaining better understanding
This is a task that the team in Brussels, where I used to work, focuses on. The EU’s East StratCom Task Force collects and documents cases of pro-Kremlin disinformation from Russian state media and other outlets in the pro-Kremlin ecosystem. In various European countries, different departments and agencies are concerned with this threat – typically the intelligence services, but also various StratCom teams, which may be located in Foreign Ministries, Defense Ministries, or Interior Ministries.
Documenting the threat is a necessary first step without which it is close to impossible to do anything else properly.
The ideal result of this activity is to learn how many channels are spreading disinformation, how many messages per day they spread, how many people they target, and how many people they persuade. For that, we would need to have an extraordinarily robust monitoring structure for both traditional and new media, and we would need to conduct regular opinion polls measuring the appeal of the disinformation messages.
Having a proper monitoring system that spots disinformation messages in real time would also enable us to build an early warning system for newly emerging disinformation attacks.
So far, the EU still does not have answers to questions such as how many disinformation channels there are and how many messages they spread. Put simply, there are not enough resources for such comprehensive monitoring.
Comprehensive monitoring tasks are probably best done by a governmental body or a government-funded agency. Private companies do not have the necessary funding or reach, and this task is closely connected to security concerns.
- Raising the level of awareness about the threat
This is also something that the East StratCom Task Force is involved in: publishing materials on pro-Kremlin disinformation, delivering speeches at conferences, conducting training for governments, and briefing journalists and other critical audiences. NATO's StratCom Center of Excellence in Riga is involved in similar activities.
There are several bodies in Europe that are active in this area. One of the best examples is in Sweden, where the Security Service and the Civil Contingencies Agency have educated politicians, media, and other actors in the national system about the problem of hostile disinformation. The Czech Center Against Terrorism and Hybrid Threats also focuses on raising awareness within its government.
Another example is the Lithuanian Armed Forces StratCom Department. Some of you might have heard about the Lisa case in Germany, in which, at the beginning of 2016, thousands of people protested in the streets against Angela Merkel’s refugee policy. The story, a lie amplified by Kremlin media and officials, involved a young girl who falsely claimed to have been raped by men who appeared to be immigrants. A similar disinformation story surfaced in Lithuania a year later, but it received close to no traction because the authorities were properly trained for such situations and because the StratCom team anticipated the situation, countering the false claim swiftly. They warned stakeholders about the disinformation claim when it first appeared and, as a result, the first story in the media was not that a little girl had allegedly been raped, but that there had been another disinformation attack on Lithuania. This case is a brilliant example of neutralizing disinformation before it has time to spread.
However, for this to occur, a very high level of awareness is needed, plus excellent real-time monitoring of the information space and expert knowledge about potential disinformation claims. None of this happens overnight. Others must work diligently to raise and maintain a high level of awareness over a long period of time in order to reach the point where their systems can respond in the exemplary way the Lithuanians did.
I am afraid that the EU as a whole does not have a level of awareness as high as that of some of its Member States, such as Lithuania. It would be necessary to have a much larger and better-resourced campaign than is possible under current circumstances. Current EU communications on this issue, including those from the East StratCom Task Force, often offer excellent quality content, but do not yet have sufficient reach.
It is also necessary to bear in mind that it is not enough to focus just on government efforts. It is also important to engage with other audiences that are critical in combating disinformation, including politicians, journalists, and academics.
It is crucial to look for actors outside government because not everyone trusts what governments say. We need other opinion leaders to act as trusted messengers on these issues to their own audiences, which government often cannot reach.
A good example is, again, Lithuania, where there is a news-comedy program, similar to Last Week Tonight with John Oliver, that makes fun of Russian propaganda. Or a young and influential Czech YouTuber who educates hundreds of thousands of his followers about media literacy and fake news. These actors can address audiences that governments and other men in suits hardly reach. It is necessary to raise awareness of this problem very broadly throughout society, and for that, a variety of actors is needed.
Governments can also support these other actors who are doing similar work by funding quality media and independent journalists covering these topics, as well as NGOs working in this area. It is worrying to hear the European anti-disinformation community complain about a lack of funding for their activities.
And it is important to focus not only on one’s own country, but also to raise awareness about what is happening elsewhere. As mentioned in the recent Atlantic Council report on the disinformation attacks surrounding the 2017 elections in France, apart from structural reasons and luck, a big role in the successful defense against the disinformation was learning from others, which raised awareness about what might happen and permitted pre-planning for contingencies. Thus, when a hack-and-leak operation similar to the one in the United States in 2016 appeared on the eve of the election, the Macron campaign was very well prepared.
This case also reminds us about the adaptability of the disinformers. While the hack-and-leak operation succeeded in the United States, it failed in France. Therefore, the disinformers did not try the same strategy in Germany in the 2017 Bundestag election, despite the fact that Russia already had the hacked content that could be used for such a purpose.
- Repairing the systemic weaknesses, building up our defense
Kremlin disinformation rarely seeks to create new divisions and weaknesses in society; instead, it exploits divisions and weaknesses that already exist. Trying to mitigate these weaknesses is one of the ways to make our societies less vulnerable.
A big part of building up our defenses is done by raising awareness of the threat. In order to solve a problem, you have to know about it. Therefore, a good communication campaign about the threat posed by disinformation can be a very good first step to repair some of the weaknesses in our societies and information systems.
However, we cannot rely only on communication experts. Structural weaknesses require the involvement of more specialized professionals.
“Media literacy” is often mentioned as a way to protect against disinformation. In the Nordic countries, which are frequently cited as examples of highly media-literate societies, the local versions of Russian disinformation outlet Sputnik had to shut down in a fairly short amount of time because they did not attract enough readers. In particular, Finland is often cited as one of the best examples of a highly media-literate society resisting fake news.
However, the results that can be expected from improving media literacy are often overstated. First, this will require concerted campaigns across numerous educational systems that will have to last for decades to have the desired effect. This is a massive effort with very long lead times. And we need solutions more urgently.
In addition, what we have observed in Europe is that if the information aggressors cannot exploit one weakness, they simply move on to exploit different ones. In the case of the Nordic states, this could be cyberattacks or online trolling. The worst case of personalized online bullying that we know about in the entire EU targeted the Finnish journalist Jessikka Aro, who was exposing Kremlin influence operations in Finland. This case has already led to criminal convictions.
Trying to raise the level of media literacy in any society is certainly something that can only help to counter disinformation. But it is a very long-term task, which should not be undertaken by communication teams, but by education experts, academia, and education ministries.
Another weakness to be mitigated is the social media environment. While not the only channel responsible for the dissemination of disinformation, social media platforms allow disinformation to spread virally. Social media platforms, however, cannot solve the entire problem of online disinformation. They do not produce the malicious content; they are just used and abused to spread it. Social media may be a very powerful weapon, but the platforms are not the ones pulling the trigger.
Social media platforms can be pushed to de-rank and clearly label content from outlets notorious for spreading disinformation. A similar approach was used against tobacco: smokers can still smoke, but they are clearly warned that they are consuming a harmful substance. Similarly, disinformation outlets could be labelled as harmful to one's mental health.
If social media executives plead that their platforms are not there to determine which information is true and false, that is an excuse. If the companies do not know how to identify disinformation, they can ask the multiple organizations that have been working on this topic in recent years. In 2019, it is an inexcusable shame that some of the platforms still recommend known disinformation content at some of the highest positions in their search results.
However, I would agree with a recent statement by Facebook CEO Mark Zuckerberg, that the social media companies cannot handle this crisis on their own and that they should not have the final word. They cannot force the information aggressors to stop their aggression; that is already a task for someone else.
The European Union is working with the industry through a voluntary Code of Practice on Disinformation. Several platforms like Google, Facebook, and Twitter have agreed to self-regulatory standards to fight disinformation. However, the EU Commissioners are still not fully satisfied with the progress so far and have threatened a regulatory approach.
And, despite the fact that Facebook has closed down over two billion fake accounts, the number of disinformation cases identified by the East StratCom doubled in 2019 compared to the same period in 2018. That could indicate that the disinformers have adapted to the new environment, for example by using real people to spread disinformation instead of fake accounts.
The traditional media can also do more to fix their own weaknesses. Five years ago, Peter Pomerantsev and Michael Weiss proposed that a Disinformation Charter for media and bloggers be formulated in order to identify which behavior is acceptable and which is not. They also recommended that media outlets hire specialized disinformation editors who could prevent the media from becoming an inadvertent purveyor of disinformation. As far as I know, there has been no progress on this in the past five years.
Another action that needs to be taken consists of focusing on groups that are most vulnerable to disinformation campaigns. We need to know how many people are influenced by various disinformation campaigns, and who they are. Once we know this, we know where the biggest problems are. If we conclude that pensioners are spreading disinformation in part because they feel lonely, we can try to address this problem. If political parties notice that former high-level politicians crave their former recognition and acclaim—and the disinformers are often the first ones to exploit such cravings—we can try to engage retired politicians more and thereby mitigate their vulnerability to be exploited.
Another weakness that is often exploited is the set of tensions among different socioeconomic groups: between the younger and older generations, urban and rural areas, higher and lower income brackets, and between the majority group and various religious, racial, national, or sexual minorities. While overcoming these tensions should be part of a sensible policy in any society regardless of the danger of disinformation, it will also help to reduce the vulnerabilities that disinformers can exploit. And the opposite holds: worsening these tensions and divisions will provide the disinformers with more fertile ground for their operations.
In the area of mitigating weaknesses that can be exploited by disinformation, almost every part of our society could do more: governments, NGOs, media, both traditional and new, politicians, influential opinion leaders and opinion makers, tech companies, academia and schools, etc.
Punishing the aggressor
All three areas above are necessary, but they are not enough to stop information aggression. We can document information attacks, but that will not make them stop. We can try to prepare our populations for information attacks and do our best to mitigate weaknesses that can be exploited, but there will always be weaknesses and fissures in every society. In addition, disinformation, like a virus, mutates and adapts to new environments, and will always find new weaknesses and targets.
It is in the nature of the aggressor to be aggressive. If we want to stop aggression, we must punish it and do our best to deter any further incidents. This is not an appeal to create new rules or new laws. In many cases, we just need to use the already existing ones.
It is necessary to name and shame those who are part of pro-Kremlin disinformation campaigns, either wittingly or unwittingly. It should not be perceived as normal or acceptable to repeat Kremlin lies about Ukraine, Syria, MH17, or about Russia being the supposed protector of traditional values against the decaying West. Individuals who are helping the Kremlin to spread these lies should be named and shamed – by the media, politicians, NGOs, academics, and others. Some European NGOs are doing this, but, unfortunately, this is not a usual part of mainstream media reporting, and it is almost never done by governments or civil servants.
The most aggressive and most visible propagandists should be sanctioned. It is a shame that, to this day, the EU has sanctioned only Dmitry Kiselyov, who serves as something like Vladimir Putin’s Joseph Goebbels. Another of Putin’s pet journalists, Vladimir Solovyov, uses his show to spread hatred against the West several times a week, yet freely enjoys the pleasures of luxurious villas at Lago di Como in Italy. And there are dozens more who deserve to be on the sanctions list. Punishing the most visible propagandists and periodically adding new individuals who participate in Kremlin disinformation would send a clear signal that the West does not tolerate the spreading of lies and hatred. Those who propagate lies and hatred about our world in order to break it down simply should not enjoy all the benefits our system and our values offer.
Similarly, Western companies should pull their ads from disinformation outlets, both in Russia and in Russian media publishing abroad. It is mind-blowing to see Western companies among the top advertisers on Russian TV. A quote often ascribed to Vladimir Lenin goes, “The capitalists will sell us the rope with which we will hang them.” Those Western companies that are buying advertising time in Russian media that is used as a weapon against the West are doing exactly that. Sanctioning not only the individuals but also the companies involved in spreading disinformation could help to cut off this flow of money.
Western countries and politicians should limit access to disinformation-oriented outlets and cut them off, with no accreditation, no access to press conferences, no statements for them, and no answers to their questions. These restrictions would make it clear that they are not media, as they themselves admit, but weapons in an information war, as noted above. Estonia made the correct decision not to allow Russian pseudo-reporters to cover an EU foreign ministers’ meeting in 2017, and I find it horrible that the OSCE and the European Federation of Journalists reproached the Estonians for this. This is the equivalent of professional medical doctors defending the right of quacks and charlatans to harm people with bogus treatments.
Fortunately, the UK Foreign Office followed Estonia’s example very recently and also refused accreditation to RT and Sputnik, thereby effectively banning them from attending a conference on media freedom. European countries should also be inspired by the US example and have such media register as foreign agents.
In many countries, it is possible to use existing laws and regulations to force pro-Kremlin pseudo-media to adhere to industry standards. In 2016, Lithuanian authorities punished a Russian TV channel for inciting hatred based on nationality. At the beginning of this year, a Latvian broadcast regulator temporarily restricted a Russian TV channel because of hate speech and incitement of war. This May, Lithuania expelled the head of Lithuanian Sputnik, whom it considers a threat to national security. Britain’s media regulator, Ofcom, has punished RT several times already, primarily for failing to uphold media impartiality. The pro-Kremlin disinformation ecosystem regularly spreads lies, defamation, false accusations, and false alarms—I believe there are many cases in which they may violate the laws or regulations of different countries.
In order to be able to identify those who deserve to be punished, it is also necessary to conduct official investigations, similar to the one conducted by Special Counsel Robert Mueller. This is an area where the United States is far ahead of Europe—the Americans are investigating the attack on their democracy, and a proper investigation is the necessary prelude to a just punishment. Despite the long list of European elections and referenda that have been targeted by Kremlin disinformation in the past five years, I am not aware of a single similar investigation in Europe. We Europeans are basically saying that we do not care whether someone attacks our democracy, we will not react. As a result, logically, we thereby invite further aggression.
Punishing the information aggressors will have one more desirable effect: it will deter other potential aggressors. We already see that other state and non-state actors are adopting the Kremlin’s playbook, apparently because they have calculated that the weak reaction of Western societies will not deter them. According to some reports, China is especially active in this regard.
A resolute punishment of the number one information criminal would send a clear signal to other potential criminals.
Just recently, a white paper prepared in the Pentagon warned that the United States is still underestimating the scope of Russia’s aggression and the danger posed by their influence operations. I am afraid much the same could be said about Europe and most of its countries.
On the other hand, the Western countries have all the necessary tools in order to win this fight; we are just not using them. Russia is currently besting us only because of its ruthless determination and total lack of morals. They act, while we engage in seemingly endless discussions about whether we should act, how, and toward whom. It does not have to be this way. If we decide we want to win this fight, we will win it. It is only a matter of political will, not knowledge or capabilities.
 “Our main objective was to note and dissect all the enemy’s weaknesses and sensitive or vulnerable spots and to analyze his failures and mistakes in order to exploit them. The formulation of special operations might remind one of a doctor who, in treating the patient entrusted to his care, prolongs his illness and speeds him to an early grave instead of curing him.” Ladislav Bittman, The Deception Game, 1972 (p. 124)
and the cases related to the US in the EUvsDisinfo database: https://euvsdisinfo.eu/disinformation-cases/?text=US&disinfo_issue=&date=
 https://www.theatlantic.com/ideas/archive/2018/08/russia-is-co-opting-angry-young-men/568741/, https://www.dw.com/en/putins-secret-sleepers-waiting-for-a-signal/a-19196685 and https://reportermagazin.cz/a/pnscW/kdyz-vlastence-vzrusuje-valka
 The closest comparison was probably the UK Parliamentary investigation into disinformation and fake news, which does not have criminal consequences: https://www.parliament.uk/business/committees/committees-a-z/commons-select/digital-culture-media-and-sport-committee/news/fake-news-report-published-17-19/