Facebook is said to be acting on less than half of fact-checked misinformation in non-English European languages, leaving such content unaddressed at roughly twice the rate of English-language content.
Campaign group Avaaz analyzed misinformation about Covid-19 posted between December 7, 2020, and February 7, 2021, that was fact-checked by Facebook’s third-party fact-checking partners or other reputable organisations. It selected material that was rated ‘false’ or ‘misleading’ and could cause public harm.
And, it found, 56 per cent of this misinformation in major non-English European languages was not acted upon by Facebook, compared with only 26 per cent of English-language content debunked by US-based fact checkers.
“Facebook has a huge Europe-sized blindspot on Covid/anti-vax misinformation,” says senior global campaigner Andy Legon. “Just as the EU faces a deadly third wave.”
According to the report, Italian speakers are least protected from misinformation, with no measures taken for 69 per cent of Italian content. Spanish speakers were best protected, with only 33 per cent of Spanish language misinformation left unacted on.
On average, Facebook took almost a week longer to label non-English false content, taking 30 days to act, compared with 24 days for English-language false content.
The biggest misinformation theme was vaccination side effects – including the claim that Bill Gates had warned of hundreds of thousands of deaths. Second was false claims about official measures or warnings, while the third most popular claimed that masks were either dangerous or useless.
While Facebook says it uses the same approaches to misinformation whatever the language, Avaaz found that where posts were translated into more than one language, the English version was far more likely to be removed.
Avaaz is calling on the EU to do more to force Facebook to eradicate Covid-19- and vaccine-related misinformation in Europe.
“The current EU Code of Practice on Disinformation does not cover the failures identified in this report,” it says.
“That is why we urgently need a revised version that pushes social media giants to disclose the amount of misinformation on their platforms and set clear goals for its reduction, monitored by an independent regulator.”
And it’s looking set to get its way, with Vera Jourova, vice-president for values and transparency at the European Commission, tweeting: “Despite improvements, FB & other platforms must do more to ensure their policies are vigorously enforced across the globe. Hence we’re working to revamp Code of Practice against #disinformation.”