What is Encryption?
End-to-end encryption (E2EE) is a privacy feature used on many messaging services which makes it impossible for anyone except the sender or recipient to view the content of a message. The message is encrypted on the sender's device and can only be decrypted with a key held by the recipient. Because only the sender and recipient hold the keys, not even the owner of the platform or law enforcement can access the content without access to the devices the message was sent or received on.
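The principle can be illustrated with a deliberately simplified sketch. The XOR "cipher" below is a toy, not real cryptography, and the message and key are invented for illustration; it shows only the core idea that the platform relaying the message sees scrambled bytes, while the shared key restores the original.

```python
# Toy illustration of the E2EE principle (NOT real cryptography):
# a key shared only by the two devices scrambles the message, so the
# relaying platform sees only unreadable bytes.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Applying the same XOR key twice restores the original data.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"Meet at noon"
shared_key = secrets.token_bytes(len(message))  # known only to the two devices

ciphertext = xor_bytes(message, shared_key)     # what the platform's servers see
plaintext = xor_bytes(ciphertext, shared_key)   # only a key holder can do this
assert plaintext == message
```

Real messaging services use vetted protocols (such as the Signal protocol) with authenticated key exchange, but the privacy property is the same: without the key, the ciphertext is meaningless.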
How does this relate to child sexual abuse?
The majority of online communication is innocent. However, we know from well-known cases that individuals have used online messaging services to share Child Sexual Abuse Material (CSAM) or to groom children online.
The expansion of E2EE across online messaging services makes the online spread of CSAM and grooming activities harder to detect. This is because current technologies used to detect these behaviours, including photo matching technologies, or machine learning tools which identify patterns of language, do not work on encrypted data.
Electronic Service Providers, including those which operate messaging services, accounted for 21.4 million of the 21.7 million reports of CSAM received by NCMEC in 2020. NCMEC fears that this number will drop drastically if more messaging services implement E2EE. Mark Zuckerberg's announcement that Facebook, which was responsible for over 20.3 million of these reports, will also move towards E2EE is of particular concern across the child protection community.
E2EE also creates challenges for law enforcement in gathering data to prosecute an individual or in obtaining a warrant to search an offender’s device. In a 2019 study conducted by NetClean, nearly half of the surveyed police officers reported that encryption is the biggest challenge they face in child sexual abuse investigations.
What solutions are available?
Encryption provides a valuable tool for protecting individual privacy. However, many fear that the expansion of E2EE tips the balance between privacy and safety in a way that protects the privacy of adult abusers at the expense of children’s safety. This is a highly nuanced debate, but many argue that privacy need not be placed in opposition to safety in this way.
There are three kinds of solutions to this problem, as outlined in a report by the NSPCC:
Device related solutions:
Most image hashing technology currently used to detect CSAM runs on the server. On E2EE-enabled platforms this happens too late, because the data arrives at the server already encrypted. If hashing technology were built into the device itself, CSAM or grooming could be detected before the data is encrypted.
This approach offers a high level of privacy, but risks users subverting or reverse-engineering the detection tools on their devices.
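A minimal sketch of the on-device idea, using an exact SHA-256 match against a hypothetical block-list (the hashes and image bytes here are invented placeholders). Real systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-compression, but the flow is the same: check the content on the device, before encryption.

```python
# Sketch of on-device hash matching, run before a message is encrypted.
# KNOWN_HASHES stands in for a hypothetical database of hashes of known
# illegal images; real deployments use perceptual rather than exact hashes.
import hashlib

KNOWN_HASHES = {
    hashlib.sha256(b"known-abusive-image-bytes").hexdigest(),
}

def flag_before_encryption(image_bytes: bytes) -> bool:
    """Return True if the image matches the block-list.

    Runs on the sender's device, before E2EE is applied."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

assert flag_before_encryption(b"known-abusive-image-bytes") is True
assert flag_before_encryption(b"holiday-photo-bytes") is False
```

Because the matching logic ships on the device, it is also exposed to the subversion and reverse-engineering risk described above.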
Server related solutions:
Another option is to give platforms or law enforcement a “back door” into encrypted services, allowing them to decrypt messages. However, this is not a feasible solution to apply at scale, and it heavily undermines the privacy for which E2EE is championed.
The NSPCC suggests a better option would be for this decryption to occur in a ‘secure enclave’ in the cloud. This maintains privacy, but the service is no longer strictly end-to-end encrypted.
Homomorphic encryption:
Homomorphic encryption allows computations, such as the image hashing used to detect CSAM, to be performed directly on encrypted data, without decrypting it first.
However, the technology requires further development: it is currently too slow for images and infeasible for videos.
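The key property can be shown with a toy example using textbook (unpadded) RSA, whose well-known multiplicative homomorphism lets a server combine two ciphertexts into a ciphertext of the product of the plaintexts, without ever seeing the plaintexts. The parameters below are the standard small textbook values, for illustration only; practical homomorphic schemes for CSAM detection are far more complex.

```python
# Toy demonstration of a homomorphic property via textbook RSA:
# multiplying ciphertexts yields a ciphertext of the product of the
# plaintexts, i.e. computation happens on data the server cannot read.
p, q = 61, 53
n = p * q        # public modulus (3233)
e = 17           # public exponent
d = 2753         # private exponent: e * d ≡ 1 mod (p-1)(q-1)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 12
combined = (enc(a) * enc(b)) % n  # computed without knowing a or b
assert dec(combined) == a * b     # decrypts to 84
```

Fully homomorphic schemes extend this idea to arbitrary computations, which is what would be needed for hashing, and is also why they remain so much slower than conventional scanning.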
If you'd like to learn more about topics like this, click here to sign up for INHOPE Insights and Events.