
Facebook’s Biden defence lacks key data on spread of Covid lies

Tan KW
Publish date: Wed, 21 Jul 2021, 12:01 PM

When US President Joe Biden said Friday that social networks like Facebook are “killing people” with the viral spread of Covid-19 misinformation, the company tried to defend itself. In a strongly worded blog post, an executive attempted to redirect attention to more positive data about the ways Facebook has spread good information.

But that didn’t address the critique. While it’s impossible to say whether misinformation on Facebook is actually “killing people” - and Biden walked back his comments on Monday - the problem he was flagging is real: Covid-19 misinformation is a big issue on Facebook, and one that hasn’t been fixed. Only Facebook knows how big.

The company says it has labelled (but not removed) 167 million posts containing Covid-19 misinformation since the pandemic began. Outside research from Avaaz, a non-profit group that has studied misinformation on the service, found that internet users still encounter and engage with Covid-19 disinformation on Facebook more than anywhere else.

Facebook’s blog post, entitled “Moving Past the Finger Pointing”, argued Biden couldn’t back up his claims with facts. The company even took a shot at the President’s lofty but failed goal of getting 70% of Americans vaccinated by July 4, pointing to its own data showing that Facebook users are increasingly interested in getting the vaccine. “The data shows that 85% of Facebook users in the US have been or want to be vaccinated against Covid-19,” Guy Rosen, vice president of integrity, wrote. “Facebook is not the reason this goal was missed.”

While Facebook has done a lot to try to combat pandemic misinformation, it hasn’t done enough to convince Biden - or a lot of other critics - that its positive efforts have outweighed the negative force of its algorithm and its potential to spread lies and sensational claims. Facebook’s defence failed to include the one statistic that might actually take the target off its back: just how many people are exposed to vaccine misinformation on the service, and whether the problem is getting any better.

Measuring the impact of misinformation on social media has always been a major challenge, in part because nobody agrees on what counts as harmful. Even Facebook says there is “no standard definition for vaccine misinformation”, according to a statement. But it’s also been challenging because companies like Facebook have never disclosed the full scope of the problem.

The social network offers glimpses in its blog post, saying it has removed over 18 million instances of Covid-19 misinformation, plus the 167 million posts that were flagged by company fact checkers. It’s unclear how much problematic content escaped Facebook’s enforcement.

Either way, the disclosed numbers are a small fraction of the total posts shared on Facebook’s services. Including private messages, Facebook users created 100 billion “pieces of content” per day, CEO Mark Zuckerberg told employees last year.

Of course, hundreds of millions of posts still amount to a lot of misinformation, and they likely garnered billions of views. And not all views are created equal: Facebook’s algorithms often show people posts that are likely to resonate with them, meaning Covid-19 misinformation is more likely to reach people willing to believe it, magnifying its impact.

The only insight comes from third-party estimates. Avaaz published a June study that looked at which platforms generate the most user interactions on pandemic disinformation. Of the four sites examined - Facebook, Instagram, YouTube and Twitter - Facebook accounted for 68% of all the user interactions on posts with Covid-19 disinformation.

While Facebook was much better at labelling and fact-checking disinformation than Google’s YouTube and Twitter, according to Avaaz, its sheer size means it still generated more user interactions on posts that slipped through the company’s fact-checking apparatus. Of all the interactions on disinformation posts that went unlabelled, Facebook and Instagram accounted for 50%, compared with 20% for YouTube and 30% for Twitter.

Facebook wants to be known as part of the solution to the problem. The company says more than two billion people “have viewed authoritative information about Covid-19 and vaccines” on the social network after Facebook created a special section of the service populated with reliable vaccine information. More than 3.3 million people have used Facebook to look up where to get a vaccine shot.

But that doesn’t address what people are seeing in their Facebook and Instagram feeds, or in their private WhatsApp chats. People spend the majority of their time on the apps scrolling through updates, not visiting special information sections devised by the company for their education. Understanding whether Facebook is causing more harm than good may never be possible, but it is certainly not possible without a more holistic view of the company’s internal data.

A Facebook spokesperson declined to share data outlining how many people see Covid-19 misinformation through its services, and until the company does, all the other statistics it shares aren’t likely to change anyone’s opinion about how much good Facebook is doing - especially President Biden’s.

 - Bloomberg
