How Can We Become Advocates Against Coded Bias?
Now on Netflix, Coded Bias is an American documentary film that premiered at the 2020 Sundance Film Festival.
It was directed by Shalini Kantayya and follows researchers and advocates, principally MIT computer scientist Joy Buolamwini, as they explore how algorithms encode and propagate bias.
The documentary unveils the many biases that exist within technology systems that we are exposed to in our everyday lives. It’s an eye-opening experience detailing just how damaging technological bias can be, from vetting loan offers to job applications and inaccurately profiling criminals based on facial recognition.
Our educator Facebook group (RISE) hosted a screening and group discussion that dove into the ways the film's themes can inform classroom teaching.
Perhaps the biggest takeaway from the documentary is the sheer size of technological influence on our everyday lives and how, when used incorrectly or inequitably, it can alter a person’s life forever.
One RISE community member shared that they had always perceived Artificial Intelligence as a simple, unbiased technological tool, when in fact this is not the case.
“The past dwells within our algorithms.”
- Cathy O’Neil, author of Weapons of Math Destruction
The group interpreted this quote to mean that the biases that existed, and still exist, in society are being replicated in technology. Just look at the profiles of the leading technology inventors. As facial recognition becomes a powerful tool, it also carries harmful biases against people of color, informed by the conscious or unconscious biases of those who created the algorithms. As a result, Ghanaian-American MIT computer scientist Joy Buolamwini had to put on a white face mask to complete her facial recognition studies. She is now the founder of the Algorithmic Justice League.
Another troubling scene in the documentary shows a high school teacher being evaluated by Artificial Intelligence software to determine his qualifications. Despite his largely successful and decorated teaching career, his job was threatened by the findings of a biased job-vetting system.
How Educators Think We Should Handle Coded Bias Going Forward
Our RISE community members agreed that to best prepare students to advocate for themselves if technological bias strikes, educators must approach the topic with openness and vulnerability. Giving examples and discussing ways in which one may have experienced technological bias is a way to connect with and show compassion for students. Where there is space for it, setting aside time for critical thinking and discussions about ethics in technology courses would educate students about the algorithmic world being created around them, largely without their input.
We all recognized that being unaware of these issues often results from never having been directly exposed to the problem or told about it. It is time we bring these stories to the table so that we may change the course of our code.
Here at I Am A Scientist, we’ll be thinking of ways in which we can offer resources or guidelines to help you in your conversations about technological bias with your students.
If you haven't yet watched Coded Bias, it is streaming on Netflix and playing in virtual cinemas.