
Mozilla’s RegretsReporter data shows YouTube keeps recommending harmful videos
Reported today on The Verge
For the full article visit: https://www.theverge.com/2021/7/7/22567640/youtube-algorithm-suggestions-radicalization-mozilla
That the machine learning-driven feed of YouTube recommendations can frequently surface results of an edgy or even radicalizing bent isn't much of a question anymore. YouTube itself has pushed tools that it says could give users more control over their feed and transparency about certain recommendations, but it's difficult for outsiders to know what kind of impact they're having. Now, after spending much of the last year collecting data from the RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has more information on what people see when the algorithm makes the wrong choice and has released a detailed report (pdf).
The extension launched in September 2020, taking a crowdsourced approach to finding "regrettable" content that people encounter via the recommendation engine. After receiving 3,362 reports (along with data from people who installed the extension but did not submit reports), trends in the data show the danger of YouTube's approach.
While the foundation says it kept the concept of a "regret" vague on purpose, it judged that 12.2 percent of reported videos violated YouTube's own content rules, and noted that about nine percent of them (nearly 200 in total) have since been removed from YouTube, after accruing over 160 million views. As for why those videos were recommended in the first place, a possible explanation is that they're popular: Mozilla noted that reported videos averaged 70 percent more views per day than other videos watched by volunteers.
Mozilla senior director of advocacy Brandy Guerkink says "YouTube needs to admit their algorithm is designed in a way that harms and misinforms people." Still, two stats in pa