May 04, 2017 09:33 AM EDT
Artificial intelligence and machine learning have progressed in leaps and bounds, making way for new technologies to develop. AI has greatly benefited research that has made, and is still making, human lives better. However, there is a dark side: artificial intelligence can also be used to commit sophisticated fraud.
Researchers at University College London developed an AI algorithm that can copy a person's handwriting. All the AI needs is a few lines of someone's writing to replicate it. The tool's intended uses are benign, such as helping stroke victims or translating comic books while preserving the author's original lettering.
However, if the technology falls into the wrong hands, it could be used to forge legal and financial documents, or even the handwriting of famous historical figures. Forensic experts can currently distinguish the fakes from the real thing, but that may become much more difficult as the technology is refined.
Lyrebird and Google's WaveNet are speech-synthesis technologies that can reproduce a specific person's voice, and the results sound uncannily real. Lyrebird's output is still rudimentary, while Google's neural-network approach produces more natural speech. Lyrebird has warned that convincingly copying another person's voice will be possible in the near future, which could make audio recordings an unreliable source of evidence.
Artificial intelligence has also produced chatbots that offer an increasingly natural conversational experience. One example is the messaging app Luka, which created bots that sounded and talked like the fictional characters from the HBO hit series Silicon Valley. The company did this by feeding the show's first two seasons to the app's neural network.
The company's next step was to digitally resurrect its late CEO by feeding the neural network his social media interactions and other sources of information. Before long, the AI had begun to talk like Luka's deceased CEO.
Combined, these three technologies could become a potent tool for fraud, making data less safe and turning its protection into a critical concern for society.
© 2017 University Herald, All rights reserved. Do not reproduce without permission.