Facebook Accused Of Neglecting Covid-19 Misinformation In Europe

FILE - AP Photo/Amr Alfiky, File

Associated Press

Facebook is said to act on less than half of fact-checked misinformation in major non-English European languages - half the rate achieved when the content is in English.

The Avaaz campaign group analyzed misinformation about Covid-19 released between December 7, 2020 and February 7, 2021 and verified by Facebook’s fact-checking partners or other reputable organizations. Material was selected that was deemed “false” or “misleading” and could cause public harm.

It found that 56 percent of this misinformation in major non-English European languages went unaddressed by Facebook, compared with just 26 percent of English-language content debunked by U.S. fact-checkers.

“Facebook has a huge, Europe-sized blind spot for Covid/anti-vax misinformation,” says Andy Legon, a senior campaigner at the group, “just as the EU faces a deadly third wave.”

According to the report, Italian speakers are the least protected from misinformation, with no action taken on 69 percent of Italian-language content. Spanish speakers were best protected, with only 33 percent of misinformation in Spanish going unaddressed.

On average, Facebook took almost a week longer to act on non-English false content: 30 days, compared with 24 days for English-language false content.

The biggest misinformation topic was vaccine side effects - including the claim that Bill Gates had warned of hundreds of thousands of deaths. Second came false claims about official measures or warnings, while the third most common claim was that masks were either dangerous or useless.

While Facebook claims to apply the same misinformation policies regardless of language, Avaaz found that when the same post circulated in more than one language, the English version was more likely to be removed.

Avaaz urges the EU to do more to force Facebook to eradicate misinformation related to Covid-19 and vaccines in Europe.

“The current EU code of conduct on disinformation does not cover the failings identified in this report,” the group said.

“So we urgently need a revised version that urges social media giants to disclose the amount of misinformation on their platforms and set clear targets for reducing it, monitored by an independent regulator.”

Vera Jourova, Vice President for Values and Transparency at the European Commission, tweeted: “Despite improvements, FB and other platforms need to do more to ensure their policies are vigorously enforced around the world. We are working to revise the code of conduct against #disinformation.”
