Google offers AI tool to identify child sex abuse images

Published on September 14, 2018

Numerous organizations have taken on the task of rooting out pedophilic images on the internet, but reviewing the vast amounts of horrific content involved is difficult and harrowing work. Now the tech giant is taking a significant step to help.


Google has launched an AI toolkit that helps organizations review vast amounts of child sexual abuse material more quickly while minimizing the need for human inspection. It uses deep neural networks to scan images for abusive content and prioritize the likeliest candidates for review.
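Google has not published the toolkit's internals, but the workflow it describes (score each image with a neural network, then surface the highest-scoring items first) can be sketched roughly as follows. The names and the dummy classifier here are hypothetical stand-ins for illustration, not Google's actual API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FlaggedItem:
    item_id: str
    image_bytes: bytes
    score: float = 0.0  # classifier's estimated likelihood of abusive content

def prioritize_for_review(items: List[FlaggedItem],
                          classify: Callable[[bytes], float]) -> List[FlaggedItem]:
    """Score every flagged item, then return the queue sorted so that
    human moderators see the likeliest abusive material first."""
    for item in items:
        item.score = classify(item.image_bytes)
    return sorted(items, key=lambda i: i.score, reverse=True)

# A dummy scoring function stands in for the real neural network here.
if __name__ == "__main__":
    dummy_classify = lambda image: (len(image) % 100) / 100.0  # placeholder only
    queue = [FlaggedItem("a", b"..."), FlaggedItem("b", b"......")]
    for item in prioritize_for_review(queue, dummy_classify):
        print(item.item_id, round(item.score, 2))
```

The point of the design is that moderators still make the final call; the network only decides the order in which they look.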

Existing tech solutions in this domain work by checking images and videos against a catalogue of previously identified abusive material. This kind of software, often paired with a web crawler, is an effective way to stop people from sharing previously identified CSAM. Its disadvantage is that it cannot catch material that has not already been marked as illegal; to identify that, human moderators have to review the content themselves.
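As a rough illustration (not the IWF's or any vendor's actual implementation), matching against a catalogue amounts to a set lookup on image fingerprints. Production systems use perceptual hashes such as Microsoft's PhotoDNA, which survive resizing and re-encoding; the plain SHA-256 below is a simplified stand-in that only matches byte-identical files.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """SHA-256 used here as a simplified stand-in for a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

# Catalogue built from previously identified material (dummy bytes here).
known_catalogue = {fingerprint(b"previously-identified-image")}

def is_known_csam(image_bytes: bytes) -> bool:
    """Match an image against the catalogue. Exact matching means any
    never-before-seen material slips through -- which is exactly the gap
    the new classifier-based triage is meant to narrow."""
    return fingerprint(image_bytes) in known_catalogue

print(is_known_csam(b"previously-identified-image"))  # True
print(is_known_csam(b"never-seen-before-image"))      # False
```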

This is where Google's new AI tool comes in. It assists moderators by sorting flagged images and videos, prioritizing the likeliest CSAM content for review, which makes the review process much quicker. According to a post by Google, the new tool helped one moderator take action on 700 per cent more CSAM content over the same time period, that is, eight times as much.

Fred Langford, deputy CEO of the Internet Watch Foundation (IWF), said the software will help teams like his own deploy their limited resources more effectively. At the moment they rely purely on humans to go through content, so the tool will help with triaging.

The IWF is one of the largest organizations dedicated to stopping the spread of CSAM online. It is based in the UK and funded by contributions from big international tech companies, including Google. It employs human moderators to identify abuse material, and it also carries out investigations to identify sites where CSAM is shared and works to shut them down.
