
Artificial Intelligence in the fight against Child Sexual Abuse Material - Part 2

Last week we looked at how fighting crime is changing in the 21st Century. This week we’re exploring the benefits of using Artificial Intelligence (AI) in the fight against CSAM. Does your business also use AI? What challenges have you experienced? Get involved in the conversation and help us get closer to our goal of an internet free of CSAM.

How is AI being used in the fight against CSAM?

Many global brands are using AI on their systems to remove CSAM faster and more effectively than was previously possible. One example is PhotoDNA, a technology originally developed by Microsoft, which recognises re-uploads of known illegal content even after minor alterations that would once have made an image unrecognisable to a computer. This allows multiple versions of an image to be removed from a platform en masse, rather than each instance requiring individual analysis by a human.
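PhotoDNA itself is proprietary, but the underlying idea of perceptual hashing can be illustrated with a short sketch. Below is a minimal "difference hash" written in Python with Pillow; the file paths and the match threshold are placeholders, and production systems rely on far more robust hashes and carefully curated hash lists.

# Minimal difference-hash (dHash) sketch. This is not PhotoDNA; it only
# illustrates the general idea of perceptual hashing: visually similar images
# produce similar hashes, so a slightly altered copy can still be matched
# against a list of known hashes.

from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Return a 64-bit perceptual hash of the image at image_path."""
    # Shrink to (hash_size + 1) x hash_size greyscale pixels, discarding detail
    # so that small edits (resizing, re-encoding, light cropping) wash out.
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())

    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            # One bit per adjacent-pixel comparison: brighter than its neighbour or not.
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Number of differing bits; a small distance indicates a near-duplicate image."""
    return bin(hash_a ^ hash_b).count("1")

# Hypothetical usage with placeholder file names:
# known_hashes = {dhash("known_image.png")}
# upload_hash = dhash("new_upload.jpg")
# if any(hamming_distance(upload_hash, h) <= 10 for h in known_hashes):
#     print("Likely match with known material: escalate for human review")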

Do you know of other technologies being used to detect illegal content? What were some of the obstacles in developing them? What problems do they solve?

One of the biggest challenges for using AI to detect CSAM is training it to recognise new and previously unseen CSAM. The machine needs to be exposed to huge quantities of images and videos so that it can identify patterns and characteristics of the material. However, storing and sharing CSAM, even if only to help remove more in the long run, gives rise to all sorts of legal and ethical problems.
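To make that data requirement concrete, the sketch below shows in rough outline how any supervised image classifier is trained: it adjusts its parameters against batches of labelled examples, so the volume and accuracy of that labelled data determine what it can later recognise. This assumes PyTorch and uses random placeholder tensors; no real dataset is implied.

# Minimal supervised-training sketch (PyTorch). Random tensors stand in for
# the training data; the point is only that a model learns visual patterns
# from large numbers of labelled examples.

import torch
import torch.nn as nn

# A deliberately tiny convolutional classifier with two output classes.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 2),
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(10):
    # Placeholder batch: 32 random "images" (3x64x64) with random 0/1 labels.
    # In practice, what the model learns depends entirely on the breadth and
    # accuracy of the labelled examples it sees, which is why so much data is needed.
    images = torch.randn(32, 3, 64, 64)
    labels = torch.randint(0, 2, (32,))

    logits = model(images)
    loss = loss_fn(logits, labels)

    optimiser.zero_grad()
    loss.backward()
    optimiser.step()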

One solution is to create dedicated, highly secure institutions where academics and engineers can use the data for training purposes. This is already taking place on the premises of one scientific research institute and within law enforcement offices, but it is not a perfect answer: it still requires additional people to be exposed to the content, at a cost both to the individuals who view it and to the victims of the abuse.

Do you know of a better way this challenge could be overcome? What other challenges associated with the use of AI to detect CSAM do you know of?

Get in touch at communications@inhope.org and become a part of the collaboration between industry, hotlines, and law enforcement fighting CSAM.

Click here to read about the impact of law and policy in the fight against CSAM.

*Source: INHOPE Summit, Breakout Room One

19.11.2020 - by INHOPE
Photo by INHOPE

If you'd like to read more articles like this, then click here to sign up for INHOPE Insights and Events.
