A new analysis of 32 Russia-linked Facebook and Instagram accounts – which have since been removed – reveals how the Internet Research Agency’s tactics are evolving to circumvent the platforms’ nascent defenses against manipulation.

Combining old and new to evade detection
Some of the IRA’s strategies and tactics were the same as before: impersonating Americans, including political groups and candidates, and attempting to sow partisan division by targeting both sides of the political spectrum with posts designed to incite outrage, fear, and hostility. Much of the activity was also aimed at discouraging certain groups from voting – a common tactic also used in other elections – and focused on swing states (namely Michigan, Wisconsin, Florida, Ohio, and Arizona). This voter suppression strategy notably used same-side candidate attacks, which aim to splinter the coalition of a given side, as well as the promotion of “third candidates” (e.g., Rep. Tulsi Gabbard).

But the posts also show how the IRA’s methods are growing increasingly sophisticated and audacious. Its trolls have gotten better at impersonating candidates and parties, for example by mimicking the logos of official campaigns with greater precision. They have also moved away from creating their own fake advocacy groups toward imitating and appropriating the names of actual American groups. Finally, they have increased their use of apparently apolitical and commercial content in an effort to obscure their attempts at political manipulation. These efforts to better imitate authentic user behavior appear designed to evade detection as platforms adopt greater transparency measures and defenses against manipulation, with a focus on coordinated inauthentic behavior. Improvements in operational security also helped the IRA accounts appear less conspicuous.
Protecting elections is a matter of urgency
In light of these findings, as well as new revelations that the Internet Research Agency has been outsourcing its work to troll farms in Africa in pursuit of plausible deniability, it is clear that the challenges of safeguarding our electoral processes from foreign intervention are more grave and urgent than ever. Ensuring electoral integrity in the face of these evolving manipulation tactics requires comprehensive regulatory solutions for digital political campaigning and distortive practices like astroturfing, as well as more stringent transparency requirements for online platforms. Such measures could include thorough cross-platform archives of political campaigns to help researchers and public authorities track digital political advertising and expose malign influence efforts before they can gain traction.

Read also:
- EU elections update: Russian propaganda’s long game
- Poll: 82% of Germans concerned that political disinformation is able to impact elections
- Distraction as a tool of Russian propaganda
- Ten embarrassing moments in Russian disinformation of 2019
- Russia accountable for 72% of foreign disinformation operations
- Heavyweight megaphones of Russian disinformation in Eastern Ukraine