Artificial Intelligence

Dangers of AI


I did not expect this week’s reading to be so short. But from the few paragraphs I read, and from my own research for my presentation, I have come to the conclusion that AI systems are inherently racist. Yes, a non-living thing cannot form bias or prejudice on its own. But the programs are coded by biased human beings, after all. From what we read and understand, companies are trying not to produce technology with discriminatory components, but they are failing to do so.

In Joy Buolamwini’s “The Hidden Dangers of Facial Analysis [Op-Ed],” she mentions a term called “the coded gaze.” From my understanding, the coded gaze describes how programmers with the power to create technological advancements embed their biased views in those advancements. We are faced with companies that know their products discriminate against groups who are already oppressed, yet they continue to embed their own viewpoints in the system. Buolamwini also writes, “The risks of biased facial analysis technology extend beyond hiring. According to the Center on Privacy and Technology at Georgetown Law, the faces of half of all adults in the United States — over 117 million people — are currently in face recognition database networks that can be searched by police departments without warrant” (2). Once again, we are being surveilled without our knowledge.

This is similar to what individuals with Islamic last names face when trying to board a plane. They have no control over what the system produces; they simply have to comply with the TSA workers while being profiled. It’s easier said than done, but things must change. As Buolamwini says, we need to bring more attention to this matter before it produces even more social inequality.

