
YouTube's algorithm recommends videos that violate its own policies

A report by the Mozilla Foundation shows that a huge number of videos recommended by YouTube violate the platform's own usage policies. The algorithm responsible for offering you videos may not be well equipped to offer the content you actually want to see.

The research found that YouTube's recommendation algorithm seems to pay no attention to whether one recommended video will trigger the recommendation of another similar, but not identical, video. You can end up in a maelstrom of videos that turn offensive or that you simply don't want to watch.

YouTube and its recommendation algorithm

YouTube is one of the most sophisticated and large-scale platforms for watching and posting videos. Its recommendation algorithm is a high-level system that has been refined through successive deep-learning improvements.

In a research paper published in 2016, a group of Google engineers described how videos surface through YouTube's recommendation engine to deliver a better user experience.
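
That 2016 paper describes a two-stage design: a cheap candidate-generation step narrows millions of videos to a few hundred, and a more expensive ranking step orders those candidates for the user. As a rough illustration only, here is a minimal Python sketch of that two-stage idea; every name and scoring rule below is a hypothetical placeholder, not YouTube's actual code or API:

```python
# Minimal two-stage recommender sketch. All names, data structures and
# scoring rules are hypothetical placeholders, not YouTube's real system.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str
    upload_time: float  # Unix timestamp

def generate_candidates(user_history: list[Video], corpus: list[Video],
                        limit: int = 100) -> list[Video]:
    """Stage 1: cheaply narrow a huge corpus to a few hundred candidates.
    Here we just match topics the user has watched; the real system uses
    a deep neural network over watch history and context."""
    watched_topics = {v.topic for v in user_history}
    matches = [v for v in corpus if v.topic in watched_topics]
    return matches[:limit]

def rank(candidates: list[Video], user_history: list[Video]) -> list[Video]:
    """Stage 2: score each candidate more carefully and sort.
    Toy score: topic-affinity count plus a small recency bonus."""
    topic_counts: dict[str, int] = {}
    for v in user_history:
        topic_counts[v.topic] = topic_counts.get(v.topic, 0) + 1

    def score(v: Video) -> float:
        return topic_counts.get(v.topic, 0) + 1e-9 * v.upload_time

    return sorted(candidates, key=score, reverse=True)

def recommend(user_history: list[Video], corpus: list[Video],
              n: int = 10) -> list[Video]:
    return rank(generate_candidates(user_history, corpus), user_history)[:n]
```

Note how a purely engagement-driven score like this one never asks whether a candidate violates any policy; that gap is exactly what the Mozilla report criticizes.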

This work is quite relevant: according to YouTube's product manager, 70% of YouTube views now come from this recommendation engine. That is, you go to watch a video on YouTube and the platform offers you, as a bonus, other videos it recommends precisely because you are watching the current one.

Disturbing videos you don’t want to watch

The Mozilla Foundation has presented a report with quite a few examples of people complaining that YouTube's algorithm offers them videos they don't want to see, or that are not appropriate for their children. On many occasions you search for a specific topic, such as drag queen videos, and the recommendations serve you videos full of anti-LGBT hatred.

There is also the case where you look for information about science and end up being recommended videos about conspiracies, Illuminati theories, alien stories, pyramid scams, and so on (see Ads on YouTube that lead to fake pages).

Recommendations made by the Mozilla Foundation

In view of the work done, the Mozilla Foundation has drawn up a series of tips for YouTube and other platforms to improve their video recommendation processes. For YouTube and similar platforms, these tips are:

  1. Allow independent audits of recommendation systems. Having a third party examine how the system works provides an objective point of view for improving it.
  2. Publish information on how recommendation systems work, along with transparency reports that provide sufficient information about problem areas and progress over time. This point is hard to achieve: if you held the formula for Coca-Cola, you would be unlikely to share it with anyone else. Still, YouTube's transparency reports should provide meaningful information about the interaction between content moderation measures and recommendation systems. Recommendation systems can contribute to the spread of harmful content, and it should be independently verified whether the steps YouTube is taking to reduce harmful recommendations work as claimed.
  3. Give people more control over how their data is used as input for recommendations, and over the results those recommendations produce. Since you are handing over your personal data and YouTube knows a great deal about you, it would be convenient to be able to control in some way what content you want to see, for example by excluding videos on topics that don't interest you. These controls should be accessible through centralized user settings and also built into the interface that displays algorithmic recommendations (see the sketch after this list).
  4. Implement rigorous and recurring risk management programs dedicated to recommendation systems. Platforms should systematically and continuously identify, assess, manage, and substantiate the risks to individuals and the public interest that may arise from the design, operation, or use of the recommendation system.
  5. Let people opt out of personalization. Platforms, including YouTube, should give people the option to opt out of personalized recommendations in favor of chronological, contextual, or search-based recommendations (also covered in the sketch after this list).
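
To make points 3 and 5 more concrete, here is a minimal sketch of how topic-exclusion controls and an opt-out of personalization might sit on top of a feed. This is purely illustrative: `Preferences`, `build_feed`, and the fallback ordering are assumptions of mine, not an existing YouTube feature or API:

```python
# Hypothetical sketch of points 3 and 5: user-controlled topic exclusion
# and an opt-out that falls back to a chronological feed. None of this
# reflects YouTube's real settings or API.

from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    topic: str
    upload_time: float  # Unix timestamp

@dataclass
class Preferences:
    excluded_topics: set[str] = field(default_factory=set)  # point 3
    personalized: bool = True                                # point 5

def build_feed(videos: list[Video], prefs: Preferences,
               personalized_rank) -> list[Video]:
    """Apply the user's controls before any ranking is done."""
    # Point 3: honor explicit topic exclusions up front, regardless of
    # how much engagement those videos might generate.
    allowed = [v for v in videos if v.topic not in prefs.excluded_topics]
    if prefs.personalized:
        return personalized_rank(allowed)
    # Point 5: opted out of personalization -> reverse-chronological feed.
    return sorted(allowed, key=lambda v: v.upload_time, reverse=True)
```

The design choice worth noting is that the user's controls filter the pool before any ranking model runs, so an exclusion cannot be overridden by an engagement score.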
