YouTube’s algorithm amplifies violent videos and misinformation despite the platform’s efforts to limit their spread, according to a report released on July 7th by the Mozilla Foundation.
The foundation – a non-profit that advocates on privacy issues – found that 71 percent of all videos volunteers reported as regrettable had been recommended by the video-sharing platform’s algorithm. These included conspiracy theories about 9/11 and the coronavirus pandemic, as well as content promoting white supremacy.
The researchers also found that people in non-English-speaking countries were more likely to come across videos they regretted watching, suggesting that YouTube’s efforts to police its platform have been applied inconsistently.
“YouTube’s algorithm is working to amplify some really harmful content and push people down disturbing paths,” said Brandi Geurkink, the foundation’s senior manager of advocacy. “It really shows that its recommendation algorithm isn’t even working to support its own platform guidelines; it’s actually going off the rails.”
A YouTube spokesperson said, “The goal of our recommendation system is to connect viewers with content they love, and we recommend more than 200 million videos daily on the homepage alone.”
The spokesperson said YouTube is “constantly” working to improve the user experience and has made 30 changes over the past year to reduce recommendations of harmful content.
YouTube, which is owned by Google, and similar platforms have long refused to share information about their algorithms, arguing that disclosure would compromise trade secrets and user privacy.
But a growing body of evidence implicates social media recommendation algorithms in the spread of misinformation and violent content. Researchers coined the term “algorithmic radicalization” to describe how recommendation algorithms steer users toward ever more extreme content. Such studies have prompted lawmakers to draft rules aimed at opening up the opaque algorithms of technology platforms, often described as “black boxes”. Governments are also pushing for new laws that would force social media companies to better police their platforms rather than simply rely on their own content policies.
YouTube is the most visited website in the world after Google. Users watch around a billion hours of video on the platform every day.
Mozilla’s research is based on a browser extension that allowed more than 37,000 users in 91 countries to report “regrettable recommendations” and donate data on how they spent their time on YouTube over a period of 10 months. Mozilla described it as the largest crowdsourced study of the YouTube algorithm to date.