Facebook has been quietly experimenting with reducing the amount of political content it puts in users' news feeds.

The move is a tacit acknowledgment that the way the company's algorithms work can be a problem.

The heart of the matter is the distinction between provoking a response and providing content people want.


I also see substantial pitfalls in how social media companies rely on the wisdom of crowds in practice.

For example, collective predictions are normally more accurate than individual ones.

Collective intelligence is used to predict financial markets, sports, elections and even disease outbreaks.
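The statistical intuition behind these collective predictions can be seen in a minimal simulation. The numbers below (true value, noise level, crowd size) are illustrative assumptions, not figures from any study: each person makes an independent, noisy guess, and averaging cancels much of the noise.

```python
import random

random.seed(42)
TRUE_VALUE = 100.0

# Each "person" makes an independent guess: the true value plus random noise.
guesses = [TRUE_VALUE + random.gauss(0, 20) for _ in range(1000)]

# The crowd's estimate is the average of all individual guesses.
crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

# Compare against how far off a typical individual is on their own.
individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)

print(f"typical individual error: {individual_error:.1f}")
print(f"crowd average error:      {crowd_error:.1f}")
```

The crowd's error is a small fraction of a typical individual's, which is exactly the effect the prediction markets above exploit. Crucially, this only works when the guesses are independent, a point the article returns to below.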


You may not know why, but it's wiser to ask questions later.

These rules work remarkably well in typical situations because they are based on sound assumptions.

On the surface this seems reasonable.

If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content.

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity.

We found that in general, popularity bias is more likely to lower the overall quality of content.

Once the popularity of a low-quality item is large enough, it will keep getting amplified.
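This feedback loop can be illustrated with a toy simulation. To be clear, this is not the model from the study above: the scoring weights, share counts and update rule here are all invented for illustration. A low-quality item that gets an early popularity head start outranks a high-quality one, attracts the next share, and so keeps pulling further ahead.

```python
def rank_score(quality, popularity, alpha=0.8):
    # Weighted mix of intrinsic quality and a normalized popularity signal.
    # alpha controls how much the ranking leans on popularity (assumed value).
    return (1 - alpha) * quality + alpha * popularity

# Two items: one high quality, one low quality with an early popularity lead.
items = {
    "high_quality": {"quality": 0.9, "shares": 5},
    "low_quality": {"quality": 0.2, "shares": 50},
}

for step in range(200):
    total = sum(it["shares"] for it in items.values())
    # The top-ranked item is the one users see, and therefore reshare.
    best = max(
        items.values(),
        key=lambda it: rank_score(it["quality"], it["shares"] / total),
    )
    best["shares"] += 1

print(items)
```

In this run the low-quality item captures every new share while the high-quality item never gains another one: once popularity dominates the score, the loop is self-reinforcing.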

Algorithms aren't the only thing affected by engagement bias; it can affect people, too.

Not-so-wise crowds

We recently ran an experiment using a news literacy app called Fakey.

Players get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.

Exposure to the engagement metrics thus creates a vulnerability.

There may be several reasons this is not the case.

First, because of people's tendency to associate with similar people, their online neighborhoods are not very diverse.

Second, because many people's friends are friends of each other, they influence each other.

A famous experiment demonstrated that knowing what music your friends like affects your own stated preferences.

Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed.

Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks.

They have even altered the structure of social networks to create illusions about majority opinions.

Dialing down engagement

What to do?

Technology platforms are currently on the defensive.

They are becoming more aggressive during elections in taking down fake accounts and harmful misinformation.

But these efforts can be akin to a game of whack-a-mole.

A different, preventive approach would be to add friction.

In other words, to slow down the process of spreading information.

High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests or fees.

It would leave less room for engagement bias to affect people's decisions.
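One common way to implement this kind of friction is a sliding-window rate limiter. The sketch below is a hypothetical illustration, not any platform's actual mechanism; the class name, threshold and window size are all assumed. Actions beyond the limit are simply rejected, standing in for the point where a real system would demand a CAPTCHA or a fee.

```python
import time

class FrictionLimiter:
    """Toy sliding-window limiter: allow at most max_actions shares
    per window seconds. Beyond that, the action is blocked (where a
    real platform might instead require a CAPTCHA or charge a fee)."""

    def __init__(self, max_actions=5, window=60.0):
        self.max_actions = max_actions
        self.window = window
        self.timestamps = []

    def allow_share(self, now=None):
        now = time.monotonic() if now is None else now
        # Forget actions that have fallen outside the sliding window.
        self.timestamps = [t for t in self.timestamps if now - t < self.window]
        if len(self.timestamps) >= self.max_actions:
            return False  # friction kicks in: block the automated burst
        self.timestamps.append(now)
        return True

# Five share attempts one second apart, with a limit of 3 per minute.
limiter = FrictionLimiter(max_actions=3, window=60.0)
results = [limiter.allow_share(now=float(i)) for i in range(5)]
print(results)  # first 3 allowed, rest blocked within the window
```

Human-paced sharing rarely hits such a limit, so the friction falls almost entirely on the automated, high-frequency behavior it is meant to slow down.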
