Google software will help detect child sexual abuse material on the internet

Google’s artificial intelligence will help human moderators become far more effective at removing sexual content involving children from the internet.

Software with this capability already existed, but it was limited. Technology such as Microsoft’s PhotoDNA can raise alerts on photos and videos that have already been identified. However, a human moderator first had to catalogue the material as CSAM (child sexual abuse material) before the system could flag it. As a result, any content that has not previously been marked as illegal is beyond its reach.
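To see why hash matching stops at known material, consider a minimal sketch of the approach. The hash value and helper below are hypothetical, and PhotoDNA itself uses a perceptual hash that survives resizing and re-encoding rather than the exact digest shown here; the key limitation is the same either way:

```python
import hashlib

# Catalogue of hashes for material a human moderator has already
# identified (hypothetical value, for illustration only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c4",
}

def is_known_material(image_bytes: bytes) -> bool:
    """Return True if the image matches previously catalogued material.

    An image that was never catalogued produces no match, no matter
    how clearly illegal it is -- the system cannot flag it.
    """
    digest = hashlib.sha1(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```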

Google proposes a more effective solution. The company intends to use neural networks to scan large numbers of images in a short time and flag the material that most urgently needs review. This helps human moderators reach a verdict faster and remove the content quickly if it is CSAM. In other words: less time spent per item, and more material that moderators can flag and remove.
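Google has not published the system’s internals, but the triage idea can be sketched simply. In the snippet below, `score_fn` is a hypothetical stand-in for the neural-network classifier; the point is only that sorting unreviewed images by a model’s score lets moderators work through the highest-priority material first:

```python
from typing import Callable, Iterable

def triage(images: Iterable[bytes],
           score_fn: Callable[[bytes], float]) -> list[tuple[float, bytes]]:
    """Rank unreviewed images so moderators see the likeliest CSAM first.

    `score_fn` maps an image to a probability-like score between 0 and 1.
    Sorting by that score means reviewers spend their time on the
    highest-priority material instead of scanning everything in
    arrival order.
    """
    scored = ((score_fn(img), img) for img in images)
    return sorted(scored, key=lambda pair: pair[0], reverse=True)
```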

This will be possible thanks to Google’s machine learning work. Having analyzed thousands of images, the AI will be able to tell them apart. According to the company, the tool will increase moderators’ efficiency by 700%; that is, they will be able to process roughly eight times as much material in the same time.

Google clarified that the software will be available through its Content Safety API: “We are making this available for free to NGOs and industry partners through our Content Safety API, a set of tools to increase the capacity to review content in a way that requires fewer people to be exposed to it.”
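As a rough illustration of how a partner organization might call such a service, here is a hypothetical client sketch. Google’s actual Content Safety API interface is not described in this article, so the endpoint, field names, and response shape below are invented for illustration only:

```python
import requests

# Hypothetical endpoint and payload: every name here is invented for
# illustration and is not Google's published interface.
API_URL = "https://example.googleapis.com/v1/images:classify"

def request_review_priority(image_bytes: bytes, api_key: str) -> float:
    """Send one image for classification and return a priority score."""
    response = requests.post(
        API_URL,
        params={"key": api_key},
        files={"image": image_bytes},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response field; a real integration would follow the
    # provider's documented schema.
    return response.json()["priority"]
```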