
How social media filter bubbles work

Tan KW
Publish date: Fri, 24 Sep 2021, 10:33 AM

On the surface, this seems reasonable. If people like credible news, expert opinions and fun videos, engagement-based algorithms should identify such high-quality content. But the wisdom of the crowds makes a key assumption here: that recommending what is popular will help high-quality content "bubble up."

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.
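
To make this concrete, here is a minimal simulation sketch in Python. It is a toy illustration, not the algorithm studied above: it assumes each item has a hidden quality between 0 and 1, that a user engages with a recommended item with probability equal to that quality, and that the platform ranks items by blending an engagement-based quality estimate with raw popularity. The popularity_weight parameter and all numbers are illustrative assumptions.

```python
import random

def average_consumed_quality(popularity_weight, n_items=50,
                             n_views=20000, seed=7):
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_items)]   # hidden from the platform
    impressions = [0] * n_items
    engagements = [0] * n_items
    consumed = []

    for _ in range(n_views):
        max_pop = max(engagements) or 1
        scores = []
        for i in range(n_items):
            # Engagement rate is a noisy quality estimate, especially when
            # an item has had few impressions (Laplace smoothing avoids 0/0).
            est_quality = (engagements[i] + 1) / (impressions[i] + 2)
            popularity = engagements[i] / max_pop
            scores.append((1 - popularity_weight) * est_quality
                          + popularity_weight * popularity)
        # Recommend an item with probability proportional to its score.
        item = rng.choices(range(n_items), weights=scores)[0]
        impressions[item] += 1
        consumed.append(quality[item])
        # The user engages with probability equal to the item's true quality.
        if rng.random() < quality[item]:
            engagements[item] += 1

    return sum(consumed) / len(consumed)

for w in (0.0, 0.5, 0.9):
    print(f"popularity weight {w:.1f}: "
          f"average consumed quality = {average_consumed_quality(w):.3f}")
```

Running this with a higher popularity weight tends to lower the average quality of what gets consumed: early, noisy engagement on a mediocre item gets locked in and amplified, which is exactly the failure mode described above. Turning the weight down corresponds to the remedy suggested at the end of this article.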

Algorithms aren't the only things affected by engagement bias; it can affect people too. Evidence shows that information is transmitted via "complex contagion," meaning the more times people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.
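
The dynamic of complex contagion is easy to sketch. In the toy model below (an illustrative assumption, not data from the research cited above), each person's chance of adopting and resharing an idea grows with the number of their friends who have already done so.

```python
import random

def spread(per_exposure_boost=0.15, n_people=200, n_friends=6,
           rounds=10, seed=3):
    rng = random.Random(seed)
    # Give each person a handful of random friends (directed, for simplicity).
    friends = {p: rng.sample(range(n_people), n_friends)
               for p in range(n_people)}
    adopted = {0, 1, 2}                       # a few initial sharers

    for _ in range(rounds):
        newly_adopted = set()
        for person in range(n_people):
            if person in adopted:
                continue
            exposures = sum(f in adopted for f in friends[person])
            # Complex contagion: every exposed friend raises the chance
            # that this person adopts and reshares the idea.
            if rng.random() < per_exposure_boost * exposures:
                newly_adopted.add(person)
        adopted |= newly_adopted
    return len(adopted)

print("adopters after 10 rounds:", spread())
```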

Not-so-wise crowds

We recently ran an experiment using a news literacy app called Fakey, a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.
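
As a rough illustration, the game's scoring rule might look like the sketch below. The exact point values, and the penalty for wrong guesses, are hypothetical; the description above only says that players earn points for engaging with reliable sources and for flagging low-credibility ones.

```python
# Hypothetical point values; the real Fakey scoring may differ.
def fakey_points(action: str, source_is_reliable: bool) -> int:
    if action in ("share", "like"):
        return 1 if source_is_reliable else -1
    if action == "flag":
        return -1 if source_is_reliable else 1
    return 0                                  # e.g. skipping an article

print(fakey_points("share", source_is_reliable=True))   # +1: amplified a reliable source
print(fakey_points("flag", source_is_reliable=False))   # +1: caught a dubious one
```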

We found that players are more likely to like or share, and less likely to flag, articles from low-credibility sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

The wisdom of the crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.

First, because of people's tendency to associate with similar people, their online neighbourhoods are not very diverse. The ease with which social media users can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people's friends are friends of one another, they influence one another. A famous experiment demonstrated that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called "link farms" and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organised fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people's cognitive biases at once. They have even altered the structure of social networks to create illusions about majority opinions.

Dialing down engagement

What to do? Technology platforms are currently on the defensive. During elections they have become more aggressive about taking down fake accounts and harmful misinformation. But these efforts can be akin to a game of whack-a-mole.

A different, preventive approach would be to add friction: in other words, to slow down the process of spreading information. High-frequency behaviours such as automated liking and sharing could be inhibited by CAPTCHA tests or fees. Not only would this decrease opportunities for manipulation, but with less information people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people's decisions.
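
One simple way to implement such friction is a per-user sliding-window rate limiter, sketched below. The FrictionGate name, the threshold and the window are illustrative assumptions; a real platform might respond to a tripped limit with a CAPTCHA test or a small fee rather than a flat refusal.

```python
import time

class FrictionGate:
    """Allow at most max_actions shares per user within a sliding window."""

    def __init__(self, max_actions=5, window_seconds=60.0):
        self.max_actions = max_actions
        self.window = window_seconds
        self.history = {}                     # user -> recent share timestamps

    def allow_share(self, user: str) -> bool:
        now = time.monotonic()
        recent = [t for t in self.history.get(user, [])
                  if now - t < self.window]
        if len(recent) >= self.max_actions:
            # Over the limit: this is where a real platform might demand
            # a CAPTCHA or charge a small fee instead of refusing outright.
            self.history[user] = recent
            return False
        recent.append(now)
        self.history[user] = recent
        return True

gate = FrictionGate(max_actions=5)
print([gate.allow_share("user123") for _ in range(7)])
# -> [True, True, True, True, True, False, False]
```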

It would also help if social media companies adjusted their algorithms to rely less on engagement when determining the content they serve you. Perhaps the revelation that Facebook knew troll farms were exploiting engagement will provide the necessary impetus.


 - TNS
