Review: Coded Bias Examines Racism Programmed into AI and the Far-Reaching Impact of Flawed Facial Recognition Software

An official selection at this year’s Sundance Film Festival (yes, there was one of those back in January), Coded Bias brings to light an aspect of artificial intelligence that gets far less attention than computers winning “Jeopardy!” or solving equations that would take humans millennia to complete. The issue lies in an algorithm’s ability to perceive human faces of varying races (known, appropriately, as facial recognition); study after study concludes there is an inherent bias in this type of software, which discerns the features of white men nearly all of the time (and white women most of the time), while Black people, particularly those with darker skin, are barely recognized, if at all. Since computers are only as intelligent as their programmers, the problem points to a deeper bias in technology, an industry largely built (like the rest of the Western world) by white men for white men.

Coded Bias

Image courtesy of Music Box Theatre

Filmmaker Shalini Kantayya (Catching the Sun) follows MIT researcher Joy Buolamwini, who explains how she discovered the issue while building an interactive mirror, one that would superimpose filters on your face the way Snapchat does. The problem was, the facial recognition software Buolamwini used couldn’t see her dark-skinned visage; it only recognized her as a person when she put on a blank, white mask. It’s troubling to say the least, and something more people encounter, more often, than we realize. Twitter was recently in hot water over an apparently racist algorithm it used to crop images in tweets for display in someone’s feed; tall vertical images with a white person at one end and a Black person at the other were consistently cropped so the white person appeared in the preview, regardless of positioning. It’s not okay by any means, but also…it’s just Twitter.

The issue becomes far more significant when applied, as the documentary goes on to explain, to facial recognition software in use by police forces in the US, the UK and elsewhere. In London, activists keep vigil next to a police van with facial recognition cameras scanning every pedestrian, delivery person and commuter making their way through a certain neighborhood. When the police actually stop a teen student (yes, he’s dark-skinned) under the guise of a hit on the recognition software, the activists immediately interject on his behalf to make him aware of his rights. It’s just one example Kantayya includes in a film filled with myriad ways this technology touches our lives every day (do you have FaceID active on your mobile phone?). Eventually, the film zooms out to discuss both the global impact of this type of software (China’s “social credit score” comes up) and the ways in which technology continues to outpace society’s ability to understand and, where appropriate, regulate it.

With a finely honed sense of economy (the film is a crisp 90 minutes long), Kantayya weaves together Buolamwini’s experiences—from discovering the issue to testifying before Congress about it—with interviews with technology experts and anecdotes about this “smart” technology gone terribly wrong (the bit about one AI “learning” from the public—and how quickly that went south—is a treat). Coded Bias is at its best when, as it often does, it helps each of us understand just how far-reaching the technology is and just how much we (both the average citizen and those in technology, government and law enforcement) don’t know about it. It’s a fascinating look at a very real aspect of our day-to-day lives, one that affects some of us far more than others, and in far more detrimental ways. Smile, you’re on facial recognition software!

Coded Bias is now streaming in virtual cinemas, including at Music Box Theatre. A portion of your rental goes to support the cinema while it’s closed.

