From the 6th Annual Shorty Impact Awards

YouTube Regrets Reporter

Winner in Grassroots Efforts

Objectives

Got YouTube Regrets? Thousands of people do. Countless news stories have detailed how YouTube's recommendation engine suggests people watch harmful videos. Even former Google engineers who worked on YouTube's algorithm felt compelled to speak out about it in the news and in documentaries like The Social Dilemma, yet YouTube continued to avoid the topic and shirk responsibility. We sent out a call to our supporters in 2019 asking them to share stories about the videos they wished they had never clicked, videos that sent them down rabbit holes they regretted. We received more than 2,500 harrowing stories about YouTube videos driving radicalisation, eating disorders, conspiracy theories and more.

We wanted to learn more, so we launched a browser extension called RegretsReporter to crowdsource data that we could use to investigate what YouTube was recommending to people around the world and the impact it was having on them. Our call to action was to install the browser extension on your desktop computer and send us a "report" every time you watched a video you regretted.

It was a lot to ask, but the cause was an important one. YouTube's algorithm drives 700 million hours of watch time on the platform every day, yet it remains a complete black box, and YouTube intends to keep it that way.

Our goal was to generate media and policymaker attention that would help pressure YouTube to take people's safety on its platform more seriously, and to be more transparent about how its algorithm works.

Strategy and Execution

To crowdsource enough data to scrutinise YouTube's recommendation engine, we launched the RegretsReporter extension, which let users take action immediately by sending us recommended videos that they regretted watching, such as pseudoscience, vaccine misinformation or anti-LGBTQ+ content. Since the algorithm is a total black box, getting people to directly share the recommendations they are seeing is one of the only ways we can study it.
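
As a rough illustration of the kind of data donation this involves, a WebExtension content script running on YouTube watch pages could snapshot the recommendations shown next to the video a person is watching. The sketch below is hypothetical and is not RegretsReporter's actual implementation; the selector and message names are invented for illustration.

    // Hypothetical content-script sketch: record which videos YouTube is
    // recommending alongside the video currently being watched.
    // Selectors and message names are illustrative, not RegretsReporter's own.

    // `browser` is the WebExtensions messaging API global (Firefox); Chrome exposes `chrome`.
    declare const browser: { runtime: { sendMessage(message: unknown): Promise<unknown> } };

    interface RecommendedVideo {
      videoId: string;
      title: string;
    }

    function collectRecommendations(): RecommendedVideo[] {
      // Sidebar recommendations are rendered as links to /watch?v=<id>;
      // YouTube's DOM changes often, so treat this selector as a placeholder.
      const links = document.querySelectorAll<HTMLAnchorElement>('a[href^="/watch?v="]');
      const seen = new Set<string>();
      const recommendations: RecommendedVideo[] = [];
      links.forEach((link) => {
        const videoId = new URL(link.href, location.origin).searchParams.get("v");
        if (videoId && !seen.has(videoId)) {
          seen.add(videoId);
          recommendations.push({ videoId, title: link.textContent?.trim() ?? "" });
        }
      });
      return recommendations;
    }

    // Hand the snapshot to the extension's background script; it would only be
    // sent onward if the user later files a regret report for this video.
    browser.runtime.sendMessage({
      type: "recommendation-snapshot",
      watchedUrl: location.href,
      recommendations: collectRecommendations(),
    });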

When data donors reported a video, the report form asked them to tell us more about their YouTube Regret and collected information about the recommendations that led them to the video. By sharing their experiences, donors could help us answer questions like: what kinds of recommended videos do users regret watching? What are the pathways that lead them down a rabbit hole? Are some countries, or people who watch videos in certain languages, more affected than others?
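
To make the idea concrete, the information a single report might bundle together can be sketched as a data structure. The field names below are hypothetical, chosen for illustration rather than taken from the actual RegretsReporter report schema.

    // Hypothetical shape of one crowdsourced regret report.
    interface RegretReport {
      // The video the donor regretted watching.
      regrettedVideo: {
        videoId: string;
        title: string;
        channel: string;
      };
      // Free-text description the donor adds on the report form.
      donorDescription: string;
      // The chain of recommendations that led to the video, if known,
      // which is what lets researchers study "rabbit hole" pathways.
      recommendationTrail: Array<{
        videoId: string;
        recommendedFrom: string; // e.g. "watch-next sidebar" or "home page"
      }>;
      // Context for comparing countries and languages.
      videoLanguage?: string;
      donorCountry?: string;
      reportedAt: string; // ISO 8601 timestamp
    }

Optional fields like language and country are what would make the cross-country and cross-language comparisons described above possible.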

We knew we needed a lot of data to draw conclusions that YouTube would take seriously. We also knew we needed to communicate the impact people's data donations could make. We framed the problem in simple terms that everyone could relate to: Are your YouTube recommendations sometimes lies? Conspiracy theories? Or just weird as hell? You're not alone, and together we can change that.

We shared the browser extension with our supporters on Instagram, Twitter, LinkedIn, YouTube and our email list. We also shared it with partner organisations and asked them to spread the word.

In addition, we launched an advertising campaign on Twitter, YouTube, and an ad display network that specialises in reaching marginalised communities. We targeted our ads exclusively to people on desktop, so they could install the extension as soon as they clicked the ad.

We ultimately recruited more than 37,000 people across 190 countries (that's nearly every country in the world!) to download RegretsReporter and donate their data, resulting in the largest-ever crowdsourced investigation of YouTube's algorithm.

After delving into the data with a team of research assistants from the University of Exeter, Mozilla came away with three major findings:

1. YouTube recommended videos that violated its own policies and were later removed from the platform (after racking up millions of views).
2. YouTube routinely recommends misinformation, scams, and graphic and violent content; even disturbing and sexual "children's" content was flagged to us.
3. The problem is worse in non-English-speaking countries, especially when it comes to pandemic-related videos.

We released the data as a report on our website and as a PDF, which we shared with journalists under embargo ahead of launch day, alongside a video explaining our findings to the general public. We also included three sets of recommendations for what can be done to fix the problem, addressed to YouTube, to policymakers, and to people who use the platform.

Results

In total, 37,380 people installed the extension and submitted 3,362 reports of regrettable content from 91 countries, resulting in the largest-ever crowdsourced investigation of YouTube's algorithm. Reports were submitted between July 2020 and May 2021.

In total we had...

- 177 news articles from 30 countries, including Wall Street Journal, NBC News, Vanity Fair, and The Verge
- 2,398 mentions on Twitter in the month of July 2021
- 400,000+ views of our video reporting results back to our supporters and the general public
- 15,646 visitors to our website
- Our research was cited in the EU's Digital Services Act (DSA) impact assessment
- The Washington Post Editorial Board wrote an opinion piece calling for social media platforms to be more transparent about their algorithms, which centred on our research
- YouTube released a series of blog posts with more information about their algorithm, including what they are doing (and will do in the future) to reduce recommendations of harmful content
- European policymakers invited us to input on amendments to the EU's Digital Services Act intended to address transparency in recommender systems

Media

Video for YouTube Regrets Reporter

Entrant Company / Organization Name

Mozilla
