
Siri Mistakenly Led Prostitute Seekers To eSports Bar; Voice Commands Instill Rude Language In Children


It turns out that even artificial intelligence can mishear things. Apple's voice assistant, Siri, mistakenly led escort service seekers to an eSports bar.

An eSports bar named Meltdown Toronto, at 686 College St. in Toronto, Canada, received a steady stream of calls from people asking about escort services due to Siri's blunder. The bar's co-owner, Alvin Acyatan, recounted that one caller even explicitly asked the establishment for "prostitutes," Global News reported.

Acyatan was initially indifferent to the calls, attributing them to misdialed numbers. However, he was forced to investigate when the situation grew too bizarre and the calls seeking escort services didn't stop.

Finally, one caller provided an answer to the mystery. Apparently, Siri mistook "escorts" for "eSports" because the two words sound alike. Acyatan believes the voice assistant may have searched the internet and returned his eSports bar as the closest match for "escorts."

What exactly is an eSports bar, though? These bars cater to competitive video gaming, hosting weekly tournaments whose matches are usually streamed online.

Meltdown Toronto has reached out to Apple via Twitter to fix the issue. The tech giant hasn't responded to the bar's concern yet.

This isn't the first time a voice assistant has misheard a user's request. In the United Kingdom, Amazon's AI, Alexa, misheard a little boy's request to "play 'Digger Digger.'" Alexa responded with a stream of pornographic words and terms because it mistakenly matched the children's song to an explicit Spotify track.

In November 2016, Google Assistant came under fire after it featured a sketchy WordPress blog when asked "who won the popular vote?" The numbers displayed by the 70 News WordPress blog were inaccurate, which was alarming at the time because the site appeared at the top of Google's search results, The Verge reported.

Voice assistants are also worrying artificial intelligence experts for instilling rude language in children. Kids apparently get used to demanding answers from gadgets without using pleasantries such as "thank you," "sorry" and "please," and they carry this behavior into their conversations with actual people.

Meanwhile, Apple may be planning big things for Siri. A concept render integrated the voice assistant with the iPhone 8's augmented reality technology.

The render showed the smartphone with a translucent body that displays the scenery and objects behind it via the rear camera, according to Apple Insider. By pressing the in-display home button, users could ask Siri questions and issue commands based on the images shown on the screen.
