Google publishes data on their efforts to combat CSAM
Google, a valued partner of INHOPE since 2015, uses a combination of automated detection tools and specially trained reviewers to identify Child Sexual Abuse Material (CSAM) on their services. Once CSAM is discovered, they may respond in a number of ways:
- By reporting the content to the US hotline, the National Center for Missing & Exploited Children (NCMEC), which then assesses the content for legality and refers it to law enforcement and other hotlines in the INHOPE network so that it can be removed.
- By disabling the user’s account so that they can no longer share CSAM.
- By removing the URL containing CSAM from their search index so that it cannot be found via Google’s services.
- By creating hash values of any new CSAM they discover, so that automated detection tools can identify and remove it should it reappear online in the future (a simplified sketch of how hash matching works follows this list).
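To make that last point concrete, below is a minimal Python sketch of hash matching. It uses an exact cryptographic hash (SHA-256) for simplicity; in practice, services like Google's rely on perceptual hashing technologies (such as PhotoDNA or CSAI Match) that can also match re-encoded or slightly altered copies. The function names and the example hash list are purely illustrative assumptions, not Google's actual implementation.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical hash list of previously identified material. In reality
# this would be a large, securely managed database shared between
# platforms, NCMEC, and hotlines in the INHOPE network.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_material(path: str) -> bool:
    """Flag an uploaded file whose hash matches the known list."""
    return sha256_of_file(path) in known_hashes
```

This is why hashing newly discovered material matters: every hash added to the list allows automated tools to recognise and remove that content the moment it reappears online.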
Google has provided data relating to each of these responses in their recently released transparency report.
Zooming in on just one of the figures provided: Google reported 2,904,317 pieces of content to NCMEC in the second half of 2020, nearly double the 1,533,536 pieces reported in the first half of the year.
It is hard to comprehend numbers like these, but it is important to remember that they correspond to the sexual abuse of real children. While the total may include duplicates and false reports (pieces of content that are reported more than once, or that turn out not to be CSAM), each confirmed report made to NCMEC is an opportunity to stop the revictimization of a child.
INHOPE is proud to work with Google and the many other technology organisations that are taking the removal of CSAM from their services seriously.
Read about what another of INHOPE's partners, Microsoft, is doing to fight CSAM here.
If you'd like to read more articles like this, then click here to sign up for INHOPE Insights and Events.