
Threads Works to Address ‘Borderline’ Content Recommendations In-Stream


Seeing more junk recommendations in your “For You” feed on Threads?

You’re not alone. According to Instagram Chief Adam Mosseri, this has become a problem for the app, and the Threads team is working to fix it.

As Mosseri explains, Threads users have been seeing more borderline content in their feeds, and the team is working to address it as it continues to refine the six-month-old platform.

Borderline content is not a new problem for social apps, though.

Back in 2018, Meta chief Mark Zuckerberg provided a broad overview of the ongoing issues with content consumption, and how controversial content inevitably gains more traction.

As per Zuckerberg:

One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services.

(Chart: Zuckerberg’s engagement curve, showing engagement rising as content approaches the policy line.)

Zuckerberg further noted that this is a difficult challenge to solve, because “no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average – even when they tell us afterwards they don’t like the content.”

It seems that Threads is now falling into the same trap, possibly due to its rapid growth, possibly due to the real-time refinement of its systems. But this is how all social networks evolve, with controversial content getting a bigger push, because that is actually what a lot of people are going to engage with.

Though you’d hope that Meta, having worked on platform algorithms longer than anyone, would have a better system in place to deal with this by now.

In his 2018 overview, Zuckerberg identified de-amplification as the best way to address this element.

“This is a basic incentive problem that we can address by penalizing borderline content so it gets less distribution and engagement. [That means that] distribution declines as content gets more sensational, and people are therefore disincentivized from creating provocative content that is as close to the line as possible.”
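Meta hasn’t published the actual formula, but the incentive flip Zuckerberg describes can be sketched in a few lines. This is purely illustrative: the function name, the quadratic penalty, and the assumption that both engagement and “borderline-ness” are scored in [0, 1] are all hypothetical.

```python
def ranked_score(engagement: float, borderline: float) -> float:
    """Toy de-amplification: discount a post's engagement score more
    steeply as its borderline-ness approaches the policy line (1.0).

    Both inputs are assumed to be in [0, 1]; the quadratic penalty
    curve is an illustrative choice, not Meta's actual formula.
    """
    penalty = 1.0 - borderline ** 2  # shrinks toward 0 near the line
    return engagement * penalty

# Under this scheme, a highly engaging post right at the line can
# rank below a milder, safer one -- reversing the usual incentive:
safe = ranked_score(engagement=0.6, borderline=0.1)  # 0.6 * 0.99 = 0.594
edgy = ranked_score(engagement=0.9, borderline=0.9)  # 0.9 * 0.19 = 0.171
```

The design point is that the penalty must grow faster than the engagement bonus near the line, otherwise sensational content still wins the ranking.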

In theory, this should work, but evidently it hasn’t on Threads, which is still working out how to provide the optimal user experience by showing users the most engaging, interesting content.

It’s a difficult balance, because as Zuckerberg notes, often users will engage with this type of material even if they say they don’t like it. That means that it’s generally a process of trial and error, in showing users more borderline stuff to see how they react, then reducing it, almost on a user-by-user basis.

Essentially, this is not a simple problem to solve on a broad scale, but the Threads team is working to improve the algorithm to highlight more relevant, less controversial content, while also maximizing retention and engagement.

My guess is that the increase in this content has been partly a test of whether that’s what more people want, and partly the result of an influx of new users probing the algorithm to find out what works. Either way, the team is now working to correct the balance.

So if you’re seeing more junk, this is why, and you should now, according to Mosseri, be seeing less.

