Facial analysis tool on an iPhone for an iPhone user to create a cartoon of themselves

Facial Analysis


A.I. systems are shaped by the priorities and prejudices — conscious and unconscious — of the people who design them, a phenomenon that I refer to as “the coded gaze.”

In her article, Joy Buolamwini discusses how AI facial analysis technology fails on darker skin tones. Reading this article, it was sad to come across a piece about an issue I had no idea existed before this class. But coming across this piece makes me more aware and gives me a better understanding of just how embedded racism actually is; it goes far beyond the ordinary human prejudices that exist today. It’s sad, actually, that because of the color of your skin, a tool that is supposed to benefit you doesn’t benefit you at all and is instead a reminder of the continuing prejudice that exists today. Isn’t technology supposed to make our lives easier, not traumatize us? And yet it continues to do so.

Reading this article brought me back to the discussion in Dr. Friend’s class about how much of the technology made today, if it’s made by a white man or woman, is going to work best on white people. I never realized this, but once we dove further into the conversation about the invention of the camera and the way it was never able to properly capture Black people, it became apparent that even technology can be prejudiced, because it is made, most likely, by a prejudiced person.

Racism is a global issue, and it is sad to see it translate into the tools we use today, from Noble’s book on algorithms to Buolamwini’s article on facial analysis. There is a bigger issue at stake than privacy and security: how can we make technology less prejudiced and racist? I guess we have to start at the source.

