The operation was characterized by greater tactical sophistication aimed at circumventing Facebook’s monitoring capabilities and obscuring attribution – similar to the tactics observed in the Russian Internet Research Agency’s operation across eight African countries exposed last October. In particular, the individuals behind this latest network posed as locals and used fake accounts to manage several groups and pages, as well as to post content and comment on it. Notably, much of this activity occurred throughout 2016 and 2017, and while Facebook’s automated systems detected and disabled some of the accounts in question, many remained active until last week’s takedown.
Some of the accounts claimed to be citizen journalists and used these fake personas to try to contact policymakers, journalists, and other prominent public figures in Ukraine and other countries to plant false stories about politically divisive issues. The accounts also posted content – usually in Russian, English, or Ukrainian – that focused on local news as well as major international topics like the downing of flight MH17, Russian military engagement in Syria, and the alleged leaks from Ukraine’s state security service (SBU) related to ethnic tensions in Crimea. Examples of this content provided by Facebook are consistent with pro-Kremlin disinformation narratives on these topics, for instance attempting to cast doubt on the findings of the Joint Investigation Team (JIT) that found Russia responsible for the MH17 crash, attacking politicians and public figures advocating for closer ties with the West, and attacking humanitarian groups working to document war crimes in Syria.
Lessons for the Future
The operation was examined by Graphika, a social media analytics company, prior to Facebook’s takedown of the accounts. This analysis highlights three main takeaways:
The ascendance of private messaging: beyond making public posts, these accounts engaged – and sometimes entrapped – their targets via private messaging. Such use of direct messages and emails to approach journalists and political figures has featured in several other information operations and is of growing tactical significance.
Coordinated activity across platforms: Graphika found that the Russian operation went beyond the Facebook accounts in question to a number of other smaller online platforms and blogs, which can be more easily leveraged to “launder” content and obscure its origins on social media.
Media organizations are targets too: agents of disinformation seek to legitimize their content by having it re-published by credible outlets to which they can outsource “the narrative baton.”
In addition to the Russian operation, Facebook also removed two further networks as part of this latest takedown: one originating in Iran that primarily targeted the US, and one originating in Myanmar and Vietnam that targeted audiences in Myanmar. These operations were considerably smaller than the Russian effort: the Iranian network comprised six Facebook accounts and five Instagram accounts, while the Myanmar-oriented operation involved 13 Facebook accounts and 10 pages. Read more about Facebook’s prior shutdowns of Russian operations here and here.
Further Reading
- Facebook removed 364 propaganda pages and accounts of Russian origin
- Why Facebook is more revealing on nudity than on Russian disinformation
- Facebook takes down Ukrainian troll farm pages. Here is how they worked
- NATO CoE’s experiment: 54,000 inauthentic social media interactions for 300 euros
- How Zelenskyy “hacked” Ukraine’s elections
- Analysis: Among presidential candidates Poroshenko is the key target of hate speech on VK
- Facebook to counter Kremlin-style info interventions in election campaigns
- Russian troll farms behind campaign to topple Ukraine’s govt
- How Russian trolls are recruited, trained and deployed
- How pro-Kremlin think tanks spread propaganda in the West