The change to YouTube's algorithm worries experts

By:

Gabriel E. Levy B.

As YouTube evolved, its owner, Alphabet Inc. (Google), focused on perfecting the platform's algorithm, giving it the reach of a powerful social network. 70% of the content users watch originates from the platform's automatic recommendations, generating an unprecedented kind of adherence and addiction in online video consumption, a situation that has not gone unnoticed and whose impact has prompted multiple analyses and controversies.

Recently, information was leaked indicating that YouTube is working on a supposed improvement to its recommendation algorithm, triggering concern among specialized circles that have been monitoring the platform's behavior.

Why are experts worried about the possible change to YouTube's algorithm?

The issue in context

When the three former PayPal employees Chad Hurley, Steve Chen and Jawed Karim founded YouTube in 2005, they never imagined how successful it would become over the years. Today the video platform has nearly 1.8 billion active users, is available in over 80 languages, receives over 300 hours of video uploads per minute, and its users consume about 3 billion hours of video per month. It is the second largest search engine in the world and the third most visited site, after Google and Facebook. In an average month, 8 out of 10 people between the ages of 18 and 49 worldwide consume YouTube content [1].

For Alphabet Inc. (Google), owner of YouTube since 2006, the main priority over the years has been to improve the algorithm that powers search and video recommendations on YouTube, and it has been so successful that, according to a report in the Massachusetts Institute of Technology's popular science magazine:

YouTube's recommendation algorithms are some of the most powerful machine-learning systems today because of their ability to shape the information we consume. YouTube's algorithm, especially, has an outsize influence. The platform is estimated to be second only to Google in web traffic, and 70% of what users watch is fed to them through recommendations [2].

MIT Technology Review

However, as the MIT report itself notes, "this influence has been the subject of intense analysis," since the algorithm is optimized so that users "get hooked on videos," reinforcing what they already like for obvious reasons and creating a kind of addiction that, on one hand, narrows video consumption to a single theme and, on the other, prevents the offering from being diverse and plural.

None of this would be a major problem, except that some analyses suggest this approach generally privileges the more "extreme and controversial" videos, which can push people down a deep content rabbit hole and toward political radicalization, according to a study published by Data & Society [3].

What Google proposes

Aware of the impact its algorithm is having, a group of Google researchers working on the issue recently published a study titled "Recommending what video to watch next: a multitask ranking system" [4], in which they propose a solution to the current problems: an update to YouTube's base algorithm to recommend even more specific content to users, achieving a higher level of specificity and detail in the recommendation process based on individual profiling.

Currently, the algorithm's logic rests on a recommendation system built from a list of several hundred candidate videos that are related in some way to the topic the user is viewing or searching for, including certain specific characteristics. The list is then re-ranked according to the user's registered preferences, based on their historical behavior: clicks, "likes" and other interactions. These variables, combined, make up the inventory of future recommendations.
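
As a rough illustration of the two-stage logic described above, the sketch below first retrieves a few hundred candidate videos related to the current topic and then re-ranks them with a simple weighted score over the user's historical signals. The function names, fields and linear scoring are illustrative assumptions, not YouTube's actual implementation, which relies on large neural ranking models.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str
    features: dict  # e.g. {"clicks": ..., "likes": ..., "watch_time": ...}

def retrieve_candidates(catalog, current_topic, limit=300):
    """Stage 1: pull a few hundred videos loosely matching the topic being watched."""
    return [v for v in catalog if v.topic == current_topic][:limit]

def rank_for_user(candidates, user_history_weights):
    """Stage 2: re-rank candidates by the user's historical engagement signals.

    user_history_weights is an illustrative per-user weighting of signals,
    e.g. {"clicks": 0.5, "likes": 0.3, "watch_time": 0.2}.
    """
    def score(video):
        return sum(user_history_weights.get(name, 0.0) * value
                   for name, value in video.features.items())
    return sorted(candidates, key=score, reverse=True)

# Illustrative usage
catalog = [
    Video("a1", "cooking", {"clicks": 120, "likes": 30, "watch_time": 400}),
    Video("b2", "cooking", {"clicks": 900, "likes": 15, "watch_time": 150}),
    Video("c3", "music",   {"clicks": 500, "likes": 80, "watch_time": 700}),
]
candidates = retrieve_candidates(catalog, "cooking")
recommendations = rank_for_user(candidates, {"clicks": 0.5, "likes": 0.3, "watch_time": 0.2})
print([v.video_id for v in recommendations])
```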

The new update proposed by the Google researchers keeps the same operating principle but concentrates on avoiding what they call "implicit bias," a concept that refers to "the influence of the recommendations on user behavior" [5].

According to the researchers' results, "the implicit bias makes it difficult to find out if the user clicked on a video because he or she liked it or because it was highly recommended" [6]. The possible effect is that, over time, the algorithm ends up distancing users further and further from the videos that would actually be useful to them, falling into a spiral similar to what happens with industrially produced music: it is consumed because it is fashionable, many people have seen it and it resembles other things already seen, but it is not necessarily what the user wants to watch.

As a solution to reduce this bias, the study proposes that every time a YouTube user clicks on a video, the position it occupied in the sidebar recommendations is recorded and used as a primary source of information. In this way, videos that were clicked mainly because they appeared as the first option carry less weight and relevance in the offering presented to the user.
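
A minimal sketch of that idea follows, using inverse-propensity weighting: each logged click carries the sidebar position at which the video was shown, and clicks on top positions earn less credit because they are more likely to be a product of the recommendation itself. This is one common way to downweight position-biased clicks, not necessarily the exact mechanism of the Google paper, and the propensity model and names are illustrative assumptions.

```python
def position_propensity(position, eta=1.0):
    """Estimated chance that a video at this sidebar position gets examined at all,
    regardless of its content. Top slots are examined far more often.
    The 1/(position**eta) form is an illustrative model, not YouTube's.
    """
    return 1.0 / (position ** eta)

def debiased_click_credit(click_log):
    """Turn raw click logs into weights that give less credit to clicks on videos
    shown in the first positions of the recommendation sidebar.

    click_log: list of dicts like {"video_id": ..., "position": 1-based int, "clicked": bool}
    Returns {video_id: accumulated de-biased click credit}.
    """
    credit = {}
    for event in click_log:
        if not event["clicked"]:
            continue
        # Inverse-propensity weighting: a click at position 1 was "easy",
        # so it counts for less than a click found further down the list.
        weight = 1.0 / position_propensity(event["position"])
        credit[event["video_id"]] = credit.get(event["video_id"], 0.0) + weight
    return credit

# Illustrative usage: the same number of clicks, but the video clicked at
# position 8 earns more credit than the one always clicked at position 1.
log = [
    {"video_id": "a1", "position": 1, "clicked": True},
    {"video_id": "a1", "position": 1, "clicked": True},
    {"video_id": "z9", "position": 8, "clicked": True},
    {"video_id": "z9", "position": 8, "clicked": True},
]
print(debiased_click_credit(log))  # z9 ends up with more credit than a1
```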

With this methodology, the researchers seek to avoid the contagion effect generated by other users' experience and thus deliver a much more personalized experience, in line with the interests of each visitor.

The first tests of this algorithm change showed higher interaction and interest from users, along with evidence suggesting that the methodology produces less of a contagion effect and a better reading of the viewer's real interests.

What are the experts worried about?

According to the analysis by MIT Technology Review, the modifications proposed in the Google study are not as beneficial as intended and could, on the contrary, trigger "perverse effects" [7]: although reducing the bias is important, the change may promote a much greater risk, namely polarization and isolation.

For Jonas Kaiser, a member of the Berkman Klein Center for Internet & Society cited in the MIT analysis, the change could have serious consequences:

"In our research, we have found that YouTube's algorithms have created an isolated right-wing extremist community, pushed other users into videos with children, and promoted misinformation. This change could […] encourage the formation of even more isolated communities."

Jonas Kaiser, member of the Berkman Klein Center for Internet and Society

For his part, Jonathan Albright, Director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, told the MIT magazine that although "reducing the position bias is a good start to stop the feedback loop of low-quality content," this change might further favor extreme content [8].

Much more than political correctness

While some might see it as splitting hairs to debate something as specific as the programming of an algorithm, and argue that any impact is minor compared with all the benefits a platform like YouTube provides, it is important not to lose sight of the fact that a major risk factor is involved: the hate speech that gives rise to certain extremist acts. It should be remembered that in many of the notorious violent acts of recent years, YouTube has emerged as their ideological epicenter, and not long ago large commercial companies decided to suspend their advertising on the platform, considering that Google had not done enough to prevent racist, xenophobic and extremist content.

In conclusion, YouTube has become the most successful online video platform in the world, and this success is largely due to the effectiveness of its content recommendation algorithm. However, over the years it has become apparent that YouTube may be promoting extremist positions that carry no small social risk, while at the same time generating a kind of contagion in content consumption by homogenizing viewer behavior. To address this problem, YouTube will be implementing changes to its algorithm in the coming months, seeking to reduce what it calls "implicit bias," but the change does not convince the experts, who consider that this supposed level of personalization will end up causing greater polarization and will exponentially aggravate the problems already identified.

[1] Official Figures from the Statistic Brain Research Institute

[2] Specialized article from the MIT Technology Review Magazine

[3] Published study on Political Radicalization

[4] Study published by Google

[5] Study published by Google researchers with proposal for improving its algorithm

[6] Study published by Google researchers with proposal for improving its algorithm

[7] Specialized article from the MIT Technology Review Magazine

[8] Specialized article from the MIT Technology Review Magazine