🐱Andrew Chrzanowski🐱’s review published on Letterboxd:
☆"When I put on the white mask: detected. When I take off the white mask: not so much."☆
Film Independent screener.
Most believe that traits like racism and discrimination aren't inherited. They're taught, groomed, or developed in poisonous environments. They seep into our psyches subconsciously, but only after years of being shaped by the media. They're products of family upbringing and misguided perceptions formed early in life. What about technology or artificial intelligence, though? Surely that's objective, fact-based, or -- pardon the pun -- black and white.
But what about the creators of those algorithms? They're human, subject to the same influences as anyone else, and their underlying behaviors and feelings end up embedded in the background of these programs. Director Shalini Kantayya's Coded Bias examines this concept, beginning with the startling discovery of MIT Media Lab researcher Joy Buolamwini, explored in her lauded piece "AI, Ain't I a Woman?", that facial recognition software cannot see dark-skinned faces with accuracy.
Sure, we laugh a bit at faulty tech like racist soap dispensers. That's funny, if also sad. But this kind of technology is increasingly part of our daily lives, which makes the research featured here so fascinating: a concise look at AI and at the ever-expanding use of these algorithms, to the point of becoming invasive and insidious. And what Coded Bias should be lauded for is the comprehensive manner in which this documentary peels back the veneer of optimism around all-encompassing technology and exposes its deep side effects and malignancies.
Buolamwini, PhD candidate and founder of the Algorithmic Justice League, found that she literally had to wear a white mask for a facial recognition program to see her at all. She and other experts in the film detail how "narrow AI" is all that actual AI is today, crafted by a small faction of scientists and trained on the "average faces" of a homogenous group of people: White men. These skewed pieces of data yield skewed results, "showing us inequalities that have been here," as Buolamwini says. Combined with the social science research of author Cathy O'Neil, much is revealed about the blind faith in Big Data that has consumed the security and technology industries.
The fears of a surveillance state aren't entirely unwarranted, say the experts in this film, not because the rich can afford the best tech but because the poor are subjected to the punitive use of biometric cameras and facial recognition devices. Naturally, these systems aren't even very accurate. Working-class Americans, people of color, and women have all been subjected to the mathematical decisions of machine models, from healthcare to education, employment to criminal justice, housing to free speech. And these algorithms -- while now inundated with data from billions of people every day via their own smartphones -- were first created by those in power, for their own needs and benefit.
Where the well-meaning but poorly made Netflix documentary The Social Dilemma failed, Coded Bias succeeds, and much of that is due to its star of sorts in Buolamwini, who talks about her own career and the discrimination she's faced in her field. She is presumed to be incorrect, unintelligent, uninformed, simply because of her gender and race. Her experiences are vital examples in the film.
But for this reviewer, the most damning case against algorithms is the use of recidivism profiles: AI codes that label criminal offenders from high to low risk before, during, and after imprisonment. This was used against me, but also for me. My two-year sentence was unquestionably shorter because of my name, race, and education, while I met many men with similar charges who were in jail for a decade or two. You can guess why.
A nice documentary, despite covering much info that I had read before, compiled in a very interesting way with dozens of examples that people from all sides of the political spectrum should absorb and consider. Sure, it's one-sided in the way it attacks technology, but it does so with regard to the free-market, unregulated nature of this new normal. Still, it's far from fear-mongering, and maintains a real focus on what to do with this information. Recommended.