A report by the Mozilla Foundation shows that a large number of the videos YouTube recommends violate the platform's own usage policies. The algorithm in charge of suggesting videos may not be well tuned to offer the content you actually want to see.
The research confirms that the algorithm YouTube uses to recommend videos pays little attention to what each recommendation triggers next: one video leads to another that is similar but not the same, and you can end up in a maelstrom of videos that turn offensive or that you never wanted to watch.
YouTube and its recommendation algorithm
YouTube is one of the most sophisticated, large-scale platforms for watching and publishing videos. Its recommendation algorithm is a high-level system built on continuous improvements to its deep learning models.
In a research paper published in 2016, a group of Google engineers shared how videos are selected by YouTube's recommendation engine to improve the user experience.
This work is quite relevant because, according to YouTube's product manager, 70% of YouTube views now come from this recommendation engine. In other words, you go to YouTube to watch one video and the platform serves up, unasked, other videos it recommends precisely because you are watching the current one.
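To make the idea concrete: the 2016 paper ("Deep Neural Networks for YouTube Recommendations") describes a two-stage design in which a candidate-generation network narrows millions of videos down to a few hundred, and a ranking network orders them. Below is a minimal sketch of the candidate-generation step only, assuming user and video embeddings have already been learned; all names, dimensions, and the random data are illustrative, not YouTube's actual code.

```python
import numpy as np

EMBEDDING_DIM = 256  # assumption: a fixed-width dense embedding

def candidate_generation(user_embedding: np.ndarray,
                         video_embeddings: np.ndarray,
                         k: int = 10) -> np.ndarray:
    """Return indices of the k videos whose embeddings score highest
    against the user embedding. A dot-product nearest-neighbor search
    stands in for the softmax scoring used at training time."""
    scores = video_embeddings @ user_embedding   # one score per video
    return np.argsort(scores)[::-1][:k]          # top-k candidates

# Toy usage: random embeddings in place of the learned ones.
rng = np.random.default_rng(0)
user = rng.normal(size=EMBEDDING_DIM)
videos = rng.normal(size=(1_000, EMBEDDING_DIM))
print(candidate_generation(user, videos, k=5))
```

The key point is that the system optimizes for predicted engagement with the candidate pool, not for whether the candidates comply with the platform's policies, which is exactly the gap the Mozilla report highlights.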
Disturbing videos you don’t want to watch
The Mozilla Foundation has presented a report with plenty of examples of people complaining that YouTube's algorithm offers them videos they do not want to see, or that are not appropriate for their children. Often you search for a specific topic, such as drag queen videos, and the recommendations serve you videos full of anti-LGBT hatred.
There is also the case where you look for information about science and end up being recommended videos about conspiracy theories, the Illuminati, alien stories, pyramid scams (see Ads on YouTube that lead to fake pages), and so on.
Recommendations made by the Mozilla Foundation
In view of this work, the Mozilla Foundation has drawn up a series of recommendations for YouTube and similar platforms to improve their video recommendation processes:
- Allow independent audits of recommendation systems. Having a third party check how the system works provides an objective point of view for improving it.
- Publish information on how recommendation systems work, along with transparency reports that give sufficient detail about problem areas and progress over time. This is a complicated ask: if you held the formula for Coca-Cola, you would be unlikely to share it with anyone. Still, YouTube's transparency reports should provide meaningful information about how content moderation measures interact with the recommendation system. Recommendation systems can contribute to the spread of harmful content, and whether the steps YouTube is taking to reduce harmful recommendations work as claimed should be independently verified.
- Give people more control over how their data is used as input to recommendations and over the outcome of those recommendations. Since you are handing over your personal data and YouTube knows a great deal about you, you should be able to control what content you are shown, for example by excluding videos on topics that do not interest you. These controls should be accessible through centralized user settings and also built into the interface that displays algorithmic recommendations.
- Implement rigorous, recurring risk management programs dedicated to recommendation systems. Platforms should systematically and continuously identify, assess, manage, and substantiate the risks to individuals and to the public interest that may arise from the design, operation, or use of the recommendation system.
- Let people opt out of personalization. Platforms, including YouTube, should give people the option to turn off personalized recommendations in favor of chronological, contextual, or search-based ones (a sketch of how this and the data-control point could work appears after this list).
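As a thought experiment, here is a minimal sketch of the two user-facing controls above: a per-user settings object that excludes chosen topics from recommendations and an opt-out switch that falls back to a plain chronological feed. The `Video` and `UserSettings` types, field names, and scores are hypothetical, not part of any YouTube API.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Video:
    title: str
    topic: str
    published: datetime
    personal_score: float  # output of a personalized ranker, assumed given

@dataclass
class UserSettings:
    excluded_topics: set = field(default_factory=set)
    personalized: bool = True  # the opt-out switch

def recommend(videos: list, settings: UserSettings, k: int = 5) -> list:
    # Honor the exclusion list regardless of ranking mode.
    pool = [v for v in videos if v.topic not in settings.excluded_topics]
    if settings.personalized:
        pool.sort(key=lambda v: v.personal_score, reverse=True)
    else:
        # Opted out: reverse-chronological ordering, no profiling involved.
        pool.sort(key=lambda v: v.published, reverse=True)
    return pool[:k]

# Toy usage: exclude a topic and turn personalization off.
vids = [
    Video("Moon landing hoax?", "conspiracy", datetime(2021, 5, 1), 0.9),
    Video("Intro to astronomy", "science", datetime(2021, 6, 1), 0.4),
]
prefs = UserSettings(excluded_topics={"conspiracy"}, personalized=False)
print([v.title for v in recommend(vids, prefs)])  # only the science video
```

The point of the sketch is that both controls are cheap to implement once they exist as explicit settings; the hard part Mozilla identifies is making platforms expose them at all.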
References
- https://foundation.mozilla.org/es/blog/mozilla-investigation-youtube-algorithm-recommends-videos-that-violate-the-platforms-very-own-policies/
- https://assets.mofoprod.net/network/documents/Mozilla_YouTube_Regrets_Report.pdf
- https://foundation.mozilla.org/es/campaigns/youtube-regrets/
- https://www.shopify.com/blog/youtube-algorithm
- https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45530.pdf