

May 03, 2017 07:06 AM EDT

Tech Firms Around The World Race To Develop AI Software That Will Detect Video Violence [VIDEO]

Preventing video violence: the Facebook app logo displayed on an iPad next to a picture of the Facebook logo on an iPhone on August 3, 2016 in London, England. (Photo: Carl Court/Getty Images)

Following the Facebook Live video of a Thai man killing his 11-month-old daughter, the pressure is not falling on Facebook alone. Tech firms around the world are racing to build AI software that can detect violent video before it goes viral.

Companies from Asia to Europe are working to improve the artificial intelligence in their software so that similar footage can be caught before it spreads. Google and Facebook are developing their own solutions, since both face the same problem on their services.

Most of these companies, such as Sightengine, a Paris-based image and video analysis firm, propose using deep learning, which relies on computerized neural networks. According to David Lissmyr, CEO and founder of Sightengine, the approach dates back to the 1950s, when researchers began building machines that mimic how neurons interact in the brain.

Matt Zeiler, founder and CEO of the New York-based tech company Clarifai, said there is now enough data and computing power to train these systems to become much more efficient and accurate.

The training process starts by feeding the machine images associated with violence, such as a violent scene from a video, a knife, a gun, hacking motions, or blood. There are limits, however: the software can only identify what it has been trained on.
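The train-on-labelled-examples process described above, and its blind spot, can be illustrated with a deliberately tiny sketch. This is not Sightengine's or Clarifai's actual system: the feature scores, labels, and the `flag_frame` helper are all invented here, and a real detector would be a deep network over raw pixels rather than a linear model over three hand-picked scores.

```python
import numpy as np

# Toy stand-in for the training loop: each frame is summarized by
# invented detector scores [knife, gun, blood]; label 1 = violent.
X = np.array([
    [0.9, 0.1, 0.8],   # knife and blood visible
    [0.1, 0.9, 0.7],   # gun and blood visible
    [0.0, 0.1, 0.1],   # ordinary footage
    [0.1, 0.0, 0.0],   # ordinary footage
], dtype=float)
y = np.array([1, 1, 0, 0], dtype=float)

w = np.zeros(3)  # learned weights
b = 0.0          # learned bias
lr = 0.5         # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Simple gradient descent on a logistic-regression objective.
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * float(np.mean(p - y))

def flag_frame(features):
    """Return True if the frame looks violent to the trained model."""
    return bool(sigmoid(features @ w + b) > 0.5)

print(flag_frame(np.array([0.8, 0.2, 0.9])))  # knife-like frame: True
print(flag_frame(np.array([0.0, 0.0, 0.1])))  # benign frame: False
print(flag_frame(np.array([0.0, 0.0, 0.0])))  # weaponless violence: missed
```

The last call shows the limitation the article describes: a violent act that triggers none of the trained cues (no weapon, no blood) sails past the model, because the software can only recognize what it was taught.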

Abhijit Shanbhag, CEO of the Singapore-based tech company Graymatics, said people can become ever more creative in how they carry out violent acts, so the machines must be retrained regularly to keep them up to date.

Aside from this, video violence that involves no weapons or blood can be difficult to identify, as can psychological torture.

One way around these limitations is an algorithm that detects how viewers react to a scene. Another indicator could be a sudden increase in reports and shares.
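The second indicator lends itself to a simple sketch: flag a video when its rate of user reports suddenly jumps above its recent baseline. The window size, threshold, and `spike_detected` helper are assumptions for illustration, not anything the companies in this article have described.

```python
from statistics import mean, stdev

def spike_detected(reports_per_minute, window=5, z_threshold=3.0):
    """Flag when the latest per-minute report count rises far above
    the rolling baseline of the preceding `window` minutes."""
    if len(reports_per_minute) <= window:
        return False  # not enough history to judge
    baseline = reports_per_minute[-window - 1:-1]
    mu = mean(baseline)
    sigma = stdev(baseline) or 1.0  # avoid dividing by zero
    return (reports_per_minute[-1] - mu) / sigma > z_threshold

steady = [2, 3, 2, 3, 2, 3]
print(spike_detected(steady))    # False: normal fluctuation

spiking = [2, 3, 2, 3, 2, 40]
print(spike_detected(spiking))   # True: sudden surge in reports
```

A z-score against a short rolling window is a deliberately crude choice; it catches the "goes viral for the wrong reason" pattern without needing to understand the video's content at all, which is exactly why it complements a trained classifier.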

For these changes to take hold, stricter regulation of companies that handle user-generated content will be needed.

© 2017 University Herald, All rights reserved. Do not reproduce without permission.
