YouTube experiments with ways to make its algorithm more addictive

28 September 2019, by Horst Buchwald

New York, 28.9.2019

Recommendation algorithms are among the most powerful machine learning systems because they shape the information we consume. YouTube's algorithm in particular has an outsized influence: 70% of what users watch is served to them by its recommendations.

That influence has come under scrutiny in recent years. Because the algorithm is optimized to keep people engaging with videos, it tends to offer choices that reinforce what someone already likes or believes, creating an experience that can be described as "addictive" and that crowds out other views. It often surfaces the most extreme and controversial videos, which studies have shown can quickly push people toward political radicalization.

While YouTube has said publicly that it is working to solve these problems, a new paper from Google, which owns YouTube, and which has been reviewed by MIT Technology Review, seems to tell a different story. It proposes updating the platform's algorithm so that it recommends even more targeted content to users in the interest of greater engagement.

Currently, according to the paper, YouTube's recommendation system works like this: to fill the recommended-video sidebar, it first compiles a candidate list of several hundred videos by finding those that match the topic and other features of the one you are watching. It then ranks that list according to the user's preferences, which it learns by feeding all of your clicks, likes and other interactions into a machine-learning model.
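As a rough illustration only (the data structures and scoring heuristic below are assumptions made for the sketch, not YouTube's actual code, which relies on large learned models), the two-stage pipeline described above looks something like this:

```python
# Illustrative two-stage recommender: candidate generation, then ranking.
# All names, data structures and the scoring heuristic are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str

def generate_candidates(current: Video, catalog: list[Video], limit: int = 300) -> list[Video]:
    """Stage 1: gather a few hundred videos related to the one being watched."""
    related = [v for v in catalog
               if v.topic == current.topic and v.video_id != current.video_id]
    return related[:limit]

def preference_score(watched_topics: set[str], video: Video) -> float:
    """Stage 2 stand-in for the learned model that scores a candidate
    from the user's past clicks and other interactions."""
    return 1.0 if video.topic in watched_topics else 0.1

def rank_sidebar(current: Video, catalog: list[Video], watched_topics: set[str]) -> list[Video]:
    """Fill the sidebar: generate candidates, then sort by predicted preference."""
    candidates = generate_candidates(current, catalog)
    return sorted(candidates, key=lambda v: preference_score(watched_topics, v), reverse=True)
```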

Among the proposed updates, the researchers specifically target a problem they identify as "implicit bias". It refers to the way recommendations themselves can affect user behavior, making it difficult to tell whether you clicked on a video because you liked it or because it was recommended near the top. The effect is that, over time, the system can drift users further and further away from the videos they actually want to watch.
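A toy calculation (with invented numbers, purely to illustrate the confound) shows why raw click logs are ambiguous: a mediocre video placed at the top of the sidebar can collect more clicks than a better video buried further down.

```python
# Toy illustration of implicit (position) bias: observed clicks mix a
# video's intrinsic appeal with the attention its slot receives.
# The exposure figures are invented for the example.
exposure_by_rank = {1: 0.9, 2: 0.5, 3: 0.3, 10: 0.05}

def expected_clicks(appeal: float, rank: int, impressions: int = 1000) -> float:
    return appeal * exposure_by_rank[rank] * impressions

# A weaker video at rank 1 out-clicks a stronger one at rank 10, so the
# log alone cannot distinguish "liked it" from "it was just on top".
print(expected_clicks(appeal=0.2, rank=1))   # 180.0
print(expected_clicks(appeal=0.6, rank=10))  # 30.0
```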

To reduce this bias, the researchers propose a tweak to the algorithm: each time a user clicks on a video, the system also takes into account the video's rank in the recommendation sidebar. Videos near the top of the sidebar are given less weight when fed into the machine-learning model; videos ranked low, which a user has to scroll to reach, are given more. When the researchers tested the change live on YouTube, they found significantly higher user engagement.
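In spirit, the correction amounts to re-weighting clicks by the slot they came from. The sketch below uses a simple inverse-exposure weight, which is an assumption made for illustration, not the formula from the paper.

```python
# Sketch of position-debiased training weights: clicks on top-ranked
# slots count less, clicks that required scrolling count more.
# Exposure estimates and the inverse weighting are illustrative assumptions.
exposure_by_rank = {1: 0.9, 2: 0.5, 3: 0.3, 10: 0.05}

def training_weight(rank: int) -> float:
    """Down-weight clicks the user was pushed toward; up-weight clicks sought out."""
    return 1.0 / exposure_by_rank.get(rank, 0.05)

clicks = [{"video_id": "a", "rank": 1}, {"video_id": "b", "rank": 10}]
weighted_examples = [{**c, "weight": round(training_weight(c["rank"]), 2)} for c in clicks]
# The rank-10 click carries roughly 18x the weight of the rank-1 click,
# nudging the ranking model toward what users sought out, not what was served.
print(weighted_examples)
```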

Although the paper doesn't say whether the new system will be deployed permanently, Guillaume Chaslot, a former YouTube engineer who now runs AlgoTransparency.org, said he was "pretty confident" it would happen relatively quickly: "They said it increased watch time by 0.24%. If you calculate the amount, I think that's maybe tens of billions of dollars."

Several experts who reviewed the paper said the changes could have perverse effects. "In our research, we found that YouTube's algorithms created an isolated right-wing extremist community, pushed users toward videos of children and encouraged misinformation," said Jonas Kaiser, an affiliate at the Berkman Klein Center for Internet & Society. "At the margins, this change could encourage the formation of more isolated communities than we have seen before." Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, said that while "reducing positional bias is a good way to start slowing down the feedback loop for inferior content," the change could, in theory, further encourage extreme content.

Becca Lewis, a former Data & Society researcher who studies online extremism, said it was difficult to know what impact the changes would have. "There are so many different communities on YouTube, so many different ways to use YouTube and so many different types of content that the impact will differ from case to case. We will be YouTube's test subjects."

Asked for comment, a YouTube spokesperson said its engineers and product teams had found that the changes would not lead to filter bubbles. On the contrary, the company expects the changes to reduce them and to diversify recommendations overall.

All three external researchers contacted by MIT Technology Review recommend that YouTube spend more time investigating the impact of algorithmic changes through methods such as interviews, surveys and user feedback. The spokesperson said YouTube has done this to some extent in its work to remove extreme content, such as hate speech, from the platform.

"YouTube should devote more energy to understanding which actors its algorithms favor and amplify, rather than to how to keep users on the platform," Kaiser said.

"The frustrating thing is that it's not in YouTube's business interest to do this," Lewis added. "But there is an ethical imperative."