What is child sexual abuse material detection?

Technology companies and online communication services use automated detection to voluntarily identify, evaluate, report, and reduce the scale of harmful content found on their platforms.

Detection is understood in two distinct ways: automated detection and proactive search. What we refer to and support is the use of automated detection, which the European Parliament has endorsed through the interim derogation to the ePrivacy Directive, allowing technology companies to continue detecting and removing online child sexual abuse material (CSAM) using approved hash lists.

References to detection are often misunderstood because automated detection and proactive search are different things. In this brief article, we explain what each term means.

What is detection?

Child sexual abuse material detection is when organisations use hash lists to check their platforms for known CSAM and to add newly identified, classified CSAM to those lists.

A hash is a long number that is created when a mathematical hash algorithm is applied to a file. This number is unique to that file, just like a fingerprint. The hash does not reveal the content or the data contained in a file, and the hash number cannot be used to recreate the original file or its data. Law enforcement creates hash lists of previously investigated and confirmed CSAM. Industry can then use these lists to identify known CSAM on their services through automated detection, without breaching any data privacy regulations.
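To make the idea concrete, the sketch below shows how hash-list matching works in principle, using SHA-256 as a generic cryptographic hash. It is illustrative only: real hash lists are distributed in specific formats and may use other algorithms, and the file name and example hash here are made up.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

def is_known(path: str, hash_list: set) -> bool:
    """Check a file's hash against a list of hashes of confirmed material.

    Only the hash is compared; the file's content is never interpreted,
    and the original file cannot be reconstructed from the hash.
    """
    return file_hash(path) in hash_list

# Hypothetical hash list supplied by law enforcement or a hotline.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

print(is_known("uploaded_image.jpg", known_hashes))
```

An exact match like this only catches files that are bit-for-bit identical to previously confirmed material, which is why perceptual hashing technologies such as PhotoDNA are also used.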

Automated detection using hashing technology is a key tool in the fight against online CSAM, as it can identify content at a scale that would otherwise not be possible. For example, Microsoft’s PhotoDNA is a hash fingerprint technology that allows Microsoft to check content on its services for CSAM without breaching its customers' privacy rights. When matched against a database containing hashes of previously identified illegal images, PhotoDNA is a powerful tool to help detect, disrupt, and report the distribution of child exploitation material.
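PhotoDNA itself is proprietary, but the general idea of a perceptual hash, one that changes only slightly when an image is resized or lightly re-compressed, can be illustrated with a much simpler average hash. The sketch below is a stand-in, not PhotoDNA: the algorithm, threshold, and function names are assumptions for illustration, and it relies on the Pillow imaging library.

```python
from PIL import Image  # Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Compute a simple 64-bit average hash (aHash) of an image.

    The image is shrunk to 8x8 greyscale; each bit records whether a
    pixel is brighter than the mean. Similar images produce similar
    bits, so the hash survives resizing and light re-compression.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(path: str, known_hashes: list, threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

In practice, anything flagged this way would be queued for human review and reporting rather than acted on automatically, and production systems use far more robust perceptual hashes than this sketch.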

Detection vs. scanning

Sometimes the word scanning is mistakenly used; however, we refrain from using it because of its negative association. The term insinuates that a user’s confidential information is being scanned without their consent. What is actually taking place is the detection of CSAM, not a breach of anyone's privacy.

Statistics show that proactive search leads to a substantially higher amount of identified CSAM. For example, WhatsApp proactively scans unencrypted information, such as profile and group photos, for child exploitative imagery (CEI). As a result, WhatsApp bans more than 300,000 accounts per month. In recent years, there has also been an increase in proactive search efforts by hotlines where their national jurisdiction allows it. Within INHOPE’s network, the Internet Watch Foundation (UK) is active in the proactive search for CSAM online.

The use of automated detection and proactive search technology has resulted in the identification and reporting of tens of millions of items of online CSAM, as well as helping to support police investigations, rescue victims, and identify offenders for prosecution.

If you’d like to learn about more topics like this, sign up here to get the latest news from us.

14.12.2021 - by INHOPE