Disinformation works, and pro-Kremlin disinformation is often able to influence people’s opinions. This is the conclusion drawn from recent research on “online astroturfing”.
The researchers investigated the psychology behind effective mitigation strategies and the effectiveness of three “vaccines” against online disinformation.
But first, what is online astroturfing?
Astroturfing, a form of manipulation, is the craft of masking the sponsors of a message so that it appears to originate from, and be supported by, grassroots participants. The aim is to lend extra credibility to statements or organisations. The trick is probably as old as politics itself: in Shakespeare’s Julius Caesar, Cassius writes forged letters from “the public” to encourage Brutus to assassinate Caesar.

Like viruses, disinformation can be stopped through isolation. But that comes at a price open societies are reluctant to pay: economic decline and the obstruction of the free flow of information. So it is hardly surprising that many hope a vaccine will be found that beats both COVID-19 and the infodemic.

How does online inoculation work, then? According to the study, it aims to inhibit or even prevent the impact of persuasive attacks by administering a weakened version of “the virus” to the audience. Effective vaccination generally consists of two elements. The first is “the threat”: the individual receives a warning about a pending persuasive attack that will challenge their existing attitudes. The second is refutational preemption: the individual is exposed to a weakened version of the attack, together with arguments refuting it, so that resistance can build before the real attack arrives.
So, what did the researchers find?
The bad news: online astroturfing works.
Why are these results important?
The experiment was limited to a specific form of disinformation, and the results were gathered in an experimental setting, not in real life. Still, the psychological mechanisms at work here may be similar to those at work when people are exposed to other forms of disinformation. That would suggest that only counter-narratives which debunk the exact argumentation of a piece of disinformation actually work. And the counter-narrative has to be timely, probably delivered right before the disinformation reaches its targets. These notions, although they require further validation, could prove valuable in strategic thinking on disinformation.

A further question, however, is how specific a vaccine needs to be in order to be effective. Disinformation often uses recurring narratives and arguments. For example, pro-Kremlin outlets regularly employ the concept of “Cui Bono” – who benefits? – virtually any time Russian state operatives have been caught red-handed. This argument was used regarding MH17, and extensively on almost anything going on in Syria. Even when the Dutch police arrested a group of GRU officers attempting to hack the computers of the OPCW in The Hague, pro-Kremlin narratives tried to convince the audience that it was all a set-up to justify aggression against Russia.

So, if audiences are inoculated against a certain forged narrative and are then infected by a slightly different one, how effective will the vaccine be? It is important to identify that variable. Valuable research has already been done on this in the context of deconstructing climate misinformation.
Read also:
- Secret labs and George Soros: Kremlin's COVID-19 disinfo
- Disinformation that can kill: coronavirus-related narratives of Russian propaganda
- Pro-Kremlin disinformation about the protests in Belarus
- EU watchdog for disinformation debunks 8000 cases in 6 years