The EUvsDisinfo database contains plenty of telling examples of disinformation from pro-Kremlin YouTube channels. We did the maths and found that from January to November, 212 such videos gathered 26 million views.
While we can never be sure whether all of the views, likes, comments, or shares come from real people or are part of the manipulation, we do have a research-based list of factors that make disinformation attractive to our brain. The top three are threats, sex, and disgust. In our experience, threats are a constant in pro-Kremlin audiovisual media. The central idea seems to be that Russia is surrounded by an evil, ever-threatening West that does not shy away from historical revisionism and false accusations.
As for the effect of such disinformation, it is worth looking at a poll from this autumn. It turns out that 42% of the 1,600 surveyed Russians said they are constantly afraid of a world war. In this situation, it is natural to ask who might defend Russians from such a fate, but here you are in for a surprise. After 20 years in office, only 39% of Russians trust Vladimir Putin, and 24% of polled Russians could not name a single trustworthy person in Russian politics.
And just like politicians, the media are losing believers. Only 1 in 3 Russians say they trust the news programmes on the state-controlled TV channels Pervyi Kanal and Rossiya 1.
Despite that – or maybe exactly because of that – the Russian federal budget draft for 2020 increased subsidies for state-owned media by one third, to 1.3 billion euros.
Amongst other things, those funds will be used to manipulate the way we see events in the world. A telling example: RT’s YouTube channels published only 10 videos about election-related protests in Moscow, but a hefty 1,973 videos (that’s 6.6 per day) about the Yellow Vests.
To make those videos fly even higher among audiences, YouTube’s recommendation system is exploited. For example, it turned out that RT’s video about the Mueller report, which calls journalists “Russiagate conspiracy theorists”, was recommended over 400,000 times across at least 236 YouTube channels.
Thanks to YouTube videos alone, pro-Kremlin outlets earned at least 6 million USD between 2017 and 2018.
And pro-Kremlin narratives spread in a similarly coordinated way across many social media platforms. An analysis by De Groene Amsterdammer showed that in three days, the Internet Research Agency produced as many as 111,486 tweets, almost half of them focused on blaming Ukraine for downing flight MH17.
Twitter itself identified over 9 million tweets linked to the Russian Internet Research Agency. We had a look at those tweets and found 692 accounts mentioning MH17. They became active after the crash and again every time a report about it was published. Such accounts often shared links containing disinformation.
We were able to find 49 different language versions of Sputnik web portals around the globe, and 31 of them also appeared in our disinformation cases database. It is therefore understandable that 82% of polled Germans are concerned that political disinformation campaigns can manipulate elections, and that 81% of respondents think politically motivated disinformation threatens democracy.
But can citizens identify manipulation of information? A recent study from the US suggests that, by and large, they cannot: 90% of surveyed US high school students failed at least two-thirds of a digital literacy assessment. For example, 52% of students believed that a silent, grainy video claiming to show ballot-stuffing during the 2016 US Democratic primaries constituted “strong evidence” of voter fraud. In fact, the video was shot in Russia, and a quick online search produces numerous articles debunking the lie.
And it only gets worse from here: five Nordic and Baltic security services have warned Europeans about the threat of pro-Kremlin disinformation, influence operations, and election meddling attempts.
The experience of private companies supports this warning. Microsoft alone discovered 104 cyberattacks against democratic institutions, think tanks, and non-profit organisations in Europe. The spear-phishing campaigns, designed to gain access to employee credentials and deliver malware, originated from a group known as APT28, believed to be associated with Russia’s military intelligence agency, the GRU.
But the most blatant example of election interference came from Madagascar. There, at least six presidential candidates were offered money by Russians, a BBC investigation revealed.
At least 43 countries have taken similar interference threats seriously and proposed or implemented measures aimed at combating influence campaigns on social media since 2016.
As all the examples above show, we have to continue the fight against disinformation to safeguard the democratic process and Western values.