Why Facebook is more revealing on nudity than on Russian disinformation

Researchers, election observers, and governments alike struggle to measure the scope and impact of Russian disinformation campaigns. So far, sporadic press releases by online platforms constitute the bulk of first-hand evidence of these campaigns, as is the case with Facebook’s latest takedown of Russian operations targeting African countries. More systematic access to this evidence will be key to tackling disinformation and election meddling.

Even though Facebook and the Russian government have their Brussels offices next door to each other, they appear to live worlds apart.

Last week, Russia’s state-operated news agency RIA Novosti proclaimed that “Russia does not interfere in internal affairs of African States.” Just two days later, Facebook announced the takedown of a major disinformation campaign targeting at least eight African countries. On its news blog, the company explicitly attributed these campaigns to “entities associated with Russian financier Yevgeniy Prigozhin,” popularly known as Putin’s chef, who has previously been indicted by the US Justice Department for involvement in Russia’s interference in the 2016 US election. In a sign of tactical evolution, Russian operatives also worked with locals in the targeted African countries to set up Facebook accounts disguised as authentic in order to avoid detection.

But Facebook’s press release is interesting for another reason: for the first time on its news blog, the company explicitly refers to a Russian campaign as “foreign interference.”

So far, experts are divided about whether Russia’s disinformation campaigns constitute foreign interference. In line with international law, this would boil down to classifying the Kremlin’s operations as acts of coercion. Some say this is a stretch, given that Russian social media operations merely “impact people’s opinions, which may or may not have impacted subsequent votes.” Others are more hawkish in their assessment, noting that the Kremlin’s efforts indeed constitute coercion insofar as they are “purposively designed to exert control over a sovereign matter.”

Scholars of international law are not the only ones struggling to assess the scope and impact of disinformation campaigns. By refusing to grant systematic access to public-interest data, the leading online platforms currently monopolize the ability to assess whether elections may have been compromised by manipulative campaigns.

For the EU, for instance, election observation missions are a key tool for supporting democracy and promoting human rights around the world, including in African countries. Yet with these companies failing to systematically provide evidence of malicious activity on their platforms, it is virtually impossible to assess the degree to which the Kremlin’s operations may have distorted electoral processes in African countries, or violated national or international electoral laws. Sporadic updates on the news blogs of online platforms are an insufficient basis for democratic actors to do their job.

As The Guardian recently noted, fewer than 10% of Facebook users live in the US; to protect the remaining 90% from harm, the paper argued, the company should tailor its transparency and integrity policies to the respective social and political contexts. In countries targeted by Russian disinformation campaigns, this also implies reporting regularly and comprehensively on the extent of information operations across the different Facebook services.

Currently, Facebook systematically reports on violations of its adult nudity policy, but nowhere discloses the full volume and extent of foreign interference campaigns on its platform. This means that when platforms take down malicious networks emanating from Russia or Iran, for instance, they don’t do so on grounds of foreign interference, but based on other provisions of their terms of service, such as those relating to fake accounts or so-called “coordinated inauthentic behavior.”

Besides systematic self-reporting, researchers have also called for insights into accounts that platforms themselves have taken down and attributed to foreign actors. This would help researchers identify behavioral patterns, and thus detect future disinformation campaigns faster.

Edited by: Yuri Zoria

Source: EU vs Disinfo
