August 6, 2021
2 mins read

YouTube's algorithm recommends videos that violate its policies

Photo by Adam Fejes on Pexels.com

A report by the Mozilla Foundation shows that a large number of videos recommended by YouTube violate the platform's own usage policies. The algorithm responsible for suggesting videos may not be well suited to offering the content you actually want to see.

The research found that the algorithm YouTube uses to recommend videos does not seem to check whether the videos it suggests comply with the platform's own rules. One recommendation leads to another similar one, and you end up in a maelstrom of videos that may be offensive or that you simply do not want to watch.

YouTube and its recommendation algorithm

YouTube is one of the most sophisticated and largest-scale platforms for watching and posting videos. Its recommendation algorithm is a high-level system whose ongoing improvements are built on deep learning.

In a research paper published in 2016, a group of Google engineers described how videos are surfaced through YouTube's recommendation engine to improve the user experience.

This work is highly relevant: according to YouTube's product manager, 70% of YouTube views now come from this recommendation engine. In other words, you go to watch a video on YouTube and the platform offers you other videos it recommends precisely because of what you are currently watching.
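
To make the idea concrete, here is a minimal, hypothetical sketch of the two-stage design the 2016 paper describes: a candidate-generation step narrows millions of videos down to a short list, and a ranking step orders that list for the user. The embeddings, scores, and video names below are invented for illustration; this is not YouTube's actual code.

```python
# Toy two-stage recommender: candidate generation, then ranking.
# All data and names here are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    embedding: tuple  # stand-in for a learned deep-network embedding

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def generate_candidates(user_embedding, corpus, k=3):
    # Stage 1: narrow the whole corpus to the k videos whose embeddings
    # are closest to the user's embedding.
    return sorted(corpus, key=lambda v: -dot(user_embedding, v.embedding))[:k]

def rank(candidates, watch_context):
    # Stage 2: re-score the small candidate set using richer context
    # (here, a toy signal for the video currently being watched).
    return sorted(candidates,
                  key=lambda v: dot(watch_context, v.embedding),
                  reverse=True)

corpus = [
    Video("cats", (0.9, 0.1)),
    Video("dogs", (0.8, 0.2)),
    Video("news", (0.1, 0.9)),
    Video("science", (0.2, 0.8)),
]
user = (0.7, 0.3)     # hypothetical user embedding
context = (0.6, 0.4)  # hypothetical "currently watching" context

for v in rank(generate_candidates(user, corpus), context):
    print(v.video_id)
```

The point of the split is scale: the cheap first stage can search millions of videos, while the expensive second stage only has to rank a few hundred candidates.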

Disturbing videos you don’t want to watch

The Mozilla Foundation's report includes many examples of people complaining that YouTube's algorithm offers them videos they do not want to see, or videos that are not appropriate for their children. Often you search for a specific topic, such as videos of a drag queen, and the recommendations serve you videos full of anti-LGBT hatred.

There is also the case where you look for information about science and end up being recommended videos about conspiracy theories, the Illuminati, alien stories, pyramid scams, and so on (see Ads on YouTube that lead to fake pages).

Recommendations made by the Mozilla Foundation

Based on this work, the Mozilla Foundation has drawn up a series of recommendations for YouTube and similar platforms to improve their video recommendation processes:

  1. Allow independent audits of recommendation systems. Having a third party examine how the system works provides an objective point of view for improving it.
  2. Publish information on how recommendation systems work, together with transparency reports that give sufficient information about problem areas and progress over time. This point is hard to achieve: if you had the Coca-Cola formula in your possession, you would be unlikely to share it with others. YouTube's transparency reports should provide meaningful information about the interaction between content moderation measures and recommendation systems. Because recommendation systems can contribute to the spread of harmful content, it should be independently verified whether the steps YouTube is taking to reduce harmful recommendations work as claimed.
  3. Give people more control over how their data is used as input to recommendations and over the outcome of those recommendations. Since you hand over your personal data and YouTube knows a lot about you, it would make sense to be able to control what content you are shown, for example by excluding topics that do not interest you. These controls should be accessible through centralized user settings and also built into the interface that displays algorithmic recommendations.
  4. Implement rigorous, recurring risk management programs dedicated to recommendation systems. Platforms should systematically identify, assess, manage, and substantiate, on an ongoing basis, the risks to individuals and the public interest that may arise from the design, operation, or use of the recommendation system.
  5. Let people opt out of personalization. Platforms, including YouTube, should give people the option to opt out of personalized recommendations in favor of receiving chronological, contextual, or search-based recommendations (a minimal sketch of this idea follows the list).
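
As a rough illustration of the last point, here is a hypothetical sketch of what an opt-out might look like on the platform side: when the user disables personalization, the feed falls back to a plain chronological ordering instead of a model-ranked one. Everything here is invented for illustration; it is not YouTube's API.

```python
# Toy feed selector: personalized ranking vs. chronological fallback.
# All fields and scores are hypothetical.

from datetime import datetime

videos = [
    {"id": "a", "published": datetime(2021, 8, 1), "personal_score": 0.2},
    {"id": "b", "published": datetime(2021, 8, 5), "personal_score": 0.9},
    {"id": "c", "published": datetime(2021, 8, 3), "personal_score": 0.5},
]

def feed(videos, personalized: bool):
    if personalized:
        # Personalized: order by the model's per-user score.
        return sorted(videos, key=lambda v: v["personal_score"], reverse=True)
    # Opted out: reverse-chronological order, identical for every user.
    return sorted(videos, key=lambda v: v["published"], reverse=True)

print([v["id"] for v in feed(videos, personalized=False)])  # ['b', 'c', 'a']
```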


Avelino Dominguez

Biologist · Teacher · Technologist · Statistician · #SEO #SocialNetwork #Web #Data · Chess · Galician
