
Apr 16, 2017 02:08 AM EDT

Artificial intelligence is often assumed to be highly logical and objective. However, a new study shows that AI can carry the same biases as its human creators. These subjectivities range from mundane preferences to more sensitive issues like gender and race.

Arvind Narayanan, a faculty member at Princeton University's Center for Information Technology Policy, said it is important to identify and address the biases present in machine learning, since people now rely on computers for many activities, such as communication and online searches, EurekAlert reported. Narayanan, who is also an affiliate scholar at Stanford Law School's Center for Internet and Society, said these artificial intelligence systems may have acquired the socially unacceptable biases that humans are trying to move away from.

The paper, published in "Science" on April 14, is titled "Semantics derived automatically from language corpora contain human-like biases." Its lead author, Princeton University's Aylin Caliskan, and her team adapted the Implicit Association Test into a textual analysis tool called the Word-Embedding Association Test, GeekWire reported. The system examines how given words are associated with the words surrounding them to determine whether they carry a pleasant or unpleasant connotation.

The analysis showed that flowers are regarded as more pleasant than insects, and musical instruments as more pleasant than weapons. These are mundane, ordinary biases. The researchers also analyzed 2.2 million European-American and African-American names and found that the AI regarded European-American names as more pleasant.
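The association test described above can be illustrated with a small sketch. This is not the study's actual WEAT implementation — real experiments use word embeddings trained on large text corpora — and the tiny two-dimensional vectors below are invented purely for illustration. The idea it demonstrates: a word's association score is its average cosine similarity to a set of "pleasant" words minus its average similarity to a set of "unpleasant" words, so a positive score means the word sits closer to the pleasant cluster.

```python
import math

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of vector lengths
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word_vec, pleasant, unpleasant):
    # Mean similarity to pleasant words minus mean similarity to unpleasant words
    pleasant_mean = sum(cosine(word_vec, p) for p in pleasant) / len(pleasant)
    unpleasant_mean = sum(cosine(word_vec, u) for u in unpleasant) / len(unpleasant)
    return pleasant_mean - unpleasant_mean

# Toy hand-made "embeddings" (invented for this sketch, not real data)
pleasant = [[1.0, 0.1], [0.9, 0.2]]    # stand-ins for words like "love", "peace"
unpleasant = [[0.1, 1.0], [0.2, 0.9]]  # stand-ins for words like "pain", "hatred"

flower = [0.95, 0.15]  # placed near the pleasant cluster
insect = [0.15, 0.95]  # placed near the unpleasant cluster

print(association(flower, pleasant, unpleasant) > 0)  # True: flower leans pleasant
print(association(insect, pleasant, unpleasant) < 0)  # True: insect leans unpleasant
```

With vectors learned from real text rather than toy values, the same arithmetic is what surfaces the flower/insect and name-pleasantness patterns the researchers reported.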

When it comes to gender, the researchers found that the AI usually associated female words with domestic terms like "family" and "wedding," while male words were associated with career terms such as "salary" and "profession." The study also revealed that female words were linked with the arts, while male words were associated with science and math. This study shows that artificial intelligence is not as objective and unbiased as people believe it to be.


