Artificial Intelligence System Learns Racism and Bigotry [Video]

Artificial intelligence is often assumed to be highly logical and objective. However, a new study shows that even AI carries the biases of its human creators. These biases range from mundane preferences to more sensitive issues such as gender and race.

Arvind Narayanan, a faculty member at Princeton University's Center for Information Technology Policy, said it is important to identify and address the biases present in machine learning because people now rely on computers for many activities, such as communication and online searches, EurekAlert reported. Narayanan, who is also an affiliate scholar at Stanford Law School's Center for Internet and Society, said these artificial intelligence systems may have acquired the socially unacceptable biases that humans are trying to move away from.

The paper, published in "Science" on April 14, is titled "Semantics derived automatically from language corpora contain human-like biases." Its lead author, Princeton University's Aylin Caliskan, and her team adapted the Implicit Association Test to create a textual analysis tool called the Word-Embedding Association Test, GeekWire reported. The system examines how given words are associated with the words that surround them in text to determine whether a term carries a pleasant or unpleasant connotation.
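To give a rough sense of how such a test works, the sketch below compares how close a word sits to "pleasant" versus "unpleasant" words in a vector space, using cosine similarity. This is only an illustrative toy example, not the researchers' actual code: the three-dimensional vectors and word lists here are made up, whereas the real study used large pretrained word embeddings and carefully chosen word sets.

```python
import numpy as np

# Toy "embeddings" with made-up values, purely for illustration.
# Real experiments use pretrained vectors (e.g., 300-dimensional GloVe).
embeddings = {
    "flower":     np.array([0.9, 0.1, 0.2]),
    "insect":     np.array([0.1, 0.8, 0.3]),
    "pleasant":   np.array([0.8, 0.2, 0.1]),
    "unpleasant": np.array([0.2, 0.9, 0.2]),
}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(word, pleasant_words, unpleasant_words):
    """Mean similarity to the pleasant set minus mean similarity to the unpleasant set."""
    w = embeddings[word]
    pos = np.mean([cosine(w, embeddings[p]) for p in pleasant_words])
    neg = np.mean([cosine(w, embeddings[n]) for n in unpleasant_words])
    return pos - neg

# A positive score means the word sits closer to the "pleasant" words in the vector space.
print("flower:", association("flower", ["pleasant"], ["unpleasant"]))
print("insect:", association("insect", ["pleasant"], ["unpleasant"]))
```

In the actual study, the same kind of comparison was run over many target words (names, occupations, and so on) and statistically tested, but the underlying idea is the one shown here: bias appears as a systematic difference in how close two groups of words sit to pleasant versus unpleasant terms.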

The analysis showed that the AI associated flowers with more pleasant words than insects, and musical instruments with more pleasant words than weapons. These are mundane, fairly harmless biases. But when the researchers examined word embeddings drawn from a corpus of about 2.2 million words, they found that the AI regarded European-American names as more pleasant than African-American names.

When it comes to gender, the researchers found that the AI tends to associate female words with domestic terms such as "family" and "wedding," while male words are associated with career terms such as "salary" and "profession." The study also revealed that female words were more closely linked to the arts, while male words were associated with science and math. The findings show that artificial intelligence is not as objective and unbiased as people believe it to be.
