Biases in algorithms and datasets result in unfair practices toward communities of color, often without their knowledge. In May 2023 Consumer Reports, in partnership with the Kapor Foundation, released BAD INPUT, a series of short films investigating how discriminatory technology affects people’s health, finances, and privacy. Our goals were to create public awareness and dialogue around how algorithmic bias impacts people’s lives and to bring individuals together to take action to address these harms. We connected partner organizations, subject matter experts, advocates, and concerned citizens for this public education campaign anchored in the documentary film series BAD INPUT.
“Algorithms are now present in our daily lives and can have detrimental effects to communities of color and marginalized groups in areas such as mortgage lending, medical devices, and facial recognition technology,” said Amira Dhalla, director of impact partnerships and programs at Consumer Reports. “BAD INPUT explores these challenges and how we can begin addressing potential harms with greater transparency and accountability so that no one is negatively impacted by these hidden, prejudiced systems.”
BAD INPUT is part of Consumer Reports’ continuing work to use research and storytelling to harness collective consumer power to create societal change and push for more industry accountability and governmental action.
To shine a spotlight on how algorithmic bias impacts people’s daily lives and long-term outcomes, we identified key areas where the risks hidden in seemingly “neutral” technologies can negatively impact health, wealth building, and privacy.
In partnership with Kapor Foundation, Consumer Reports produced BAD INPUT, a three-part video series that profiles individual experiences and research from experts, who distill nuanced topics into bite-sized information so people can learn the risks and collectively urge companies to end these harmful practices. BAD INPUT compiles a diverse mix of perspectives from different fields, including medicine, finance, law, and technology.
The series addresses three areas where algorithmic bias is present:
“Medical Devices” explores how poorly designed sensors made the pandemic more deadly for people of color. Healthcare professionals rely on pulse oximeters to measure oxygen levels, but the devices are less accurate for patients with darker skin.
“Mortgage Lending” delves into discriminatory redlining practices that have prevented generations of people of color from accumulating wealth typically facilitated by homeownership. Years later, computer algorithms may continue to perpetuate these biased practices.
“Facial Recognition” explores the implications of widespread digital surveillance enabled by our connected consumer devices, and what happens when innocent people are caught up in digital dragnets.
CR created a landing page, badinput.org, where anyone could watch the films and help take action to combat discriminatory practices.
To get the word out about the launch of BAD INPUT, CR hosted film screenings in multiple cities, reached out to media, and promoted the films on social media. We partnered with organizations including the American Civil Liberties Union, Girls Who Code, Mozilla, All Tech is Human, the Algorithmic Justice League, the National Fair Housing Alliance, and The Markup to reach the individuals most likely to be affected by these practices, and we encouraged people to act by demanding algorithmic audits and accountability for stronger consumer protections.
We launched a petition and invited thousands of people to join us in urging companies to stop algorithmic bias.
One challenge we faced is that algorithmic discrimination can be hard to see and understand. For instance, you might be rejected for a mortgage and never know that an old redlining practice influenced the decision, because the lack of transparency gives individuals no insight into how their data is used. To overcome this challenge, the films use visual storytelling to make these invisible practices concrete and break the issues down into easily understandable terms. We then amplified this message through interviews with local and national media, including podcasts and Spanish-language outlets.
What makes the BAD INPUT film series unique is that it brings together subject matter experts, advocates, organizations, and individuals to address algorithmic bias and its impact on people’s everyday lives. CR is uniquely positioned as an independent nonprofit that draws from its rigorous testing and research to expose the hidden dangers that affect society at large. In our approach to social issues and impact, we aim to both educate and offer the public ways to raise their collective voice and push for change.
The launch of BAD INPUT and the public education campaign around it raised awareness of algorithmic bias and empowered people to take action.
More than 15,000 people have joined Consumer Reports’ petition to urge companies to stop algorithmic bias.
We have continued to engage this group of passionate individuals and spur them to take actions such as contacting their senators to support the Algorithmic Justice and Online Platform Transparency Act.
More than 1,000 people attended our events held with partners including Mozilla, All Tech is Human, Kapor Foundation, and RightsCon.
The launch of BAD INPUT and the issues it addresses were covered by 40 TV stations across the U.S., including Spanish-language outlets such as Telemundo Chicago and Telemundo Phoenix and community television such as BronxNet. CR experts gave live interviews about the video series.
The films have received 2.5 million views on social media, and there have been 26,000 visits to the BAD INPUT website.