2019-10-15 07:00
Mozilla just launched a site featuring 28 user-submitted stories, detailing incidents where YouTube’s recommendation algorithm served bizarre and horrifying videos the users had shown no interest in.
This included recommendations featuring racism, conspiracy theories, and violence. YouTube's recommendation algorithm has faced heavy scrutiny this year over radicalization, pedophilia, and general "toxicity," which is especially problematic because 70 percent of the platform's viewing time comes from recommendations. That's why Mozilla launched the #YouTubeRegrets project: to highlight the issue and urge YouTube to change its practices. The stories of the darker sides of YouTube's recommendations are chilling, and put the…
This story continues at The Next Web.