43 countries have taken steps against online manipulation in the past two years


Hybrid War

Edited by: Yuri Zoria
Since 2016, at least 43 countries have proposed or implemented measures aimed at combating influence campaigns on social media.

This is according to a study by the NATO Strategic Communications Center of Excellence, whose authors note that new approaches to tackling disinformation are “flourishing” as online efforts to manipulate public opinion become an increasingly “pressing policy concern.”

The study, titled “Government Responses to Malicious Use of Social Media,” breaks down the new regulations into 10 categories: content takedowns by social media platforms, transparency of online ads, data protection, criminalization of disinformation, expanding the definition of illegal content, media literacy and watchdogs, journalistic controls, parliamentary inquiries, creation of cybersecurity units, and monitoring initiatives.

Ireland, Italy, and Australia, for instance, are among the countries that introduced criminal penalties for producing or sharing disinformation, or for organizing a bot campaign targeting a political issue.

Among other measures, Croatia recently funded a new media literacy initiative, the US Congress is investigating Russian interference in the 2016 US presidential election, and G7 countries are developing a Rapid Response Mechanism to fight disinformation and foreign interference in elections.

The authors, however, caution that the countermeasures adopted over the past two years are often “fragmentary, heavy-handed, and ill-equipped” to curb harmful content online.

They point out that most government initiatives so far have focused chiefly on regulating speech on social media rather than on addressing the deeper systemic problems that underlie attempts to influence public opinion online.

Some authoritarian governments, they say, have also co-opted the fight against disinformation to introduce legislation aimed at tightening their grip on the digital sphere and legitimizing censorship online.

Instead, the report urges policymakers to demand greater accountability and cooperation from social media platforms.

“A core issue is a lack of willingness of the social media platforms to engage in constructive dialogue as technology becomes more complex,” the authors note.

The report encourages governments to shift away from measures aimed at controlling online content and instead work together to “develop global standards and best practices for data protection, algorithmic transparency, and ethical product design.”

The European Union has stepped up its own efforts to counter disinformation and in December presented an Action Plan aimed at tackling online disinformation in EU countries and beyond.

The Action Plan will also ensure that tech companies comply with the European Commission’s Code of Practice, a document that commits online platforms to increase transparency for political advertising and to reduce the number of fake accounts.

The platforms are required to report to the Commission on a monthly basis ahead of the European elections in May and face regulatory action if they fail to meet their commitments.
