Tech companies collaborate at INHOPE Summit 2020
“It started with a ‘James’ who made a report to Red Barnet.” Representatives from the National Center for Missing & Exploited Children (NCMEC) opened the session by recalling a past case in which a single report grew into a worldwide investigation that spanned continents and led to the arrest of 30 people.
Packed with presentations from industry experts and panel discussions, the third INHOPE Summit brought together like-minded individuals from around the world to focus on how we can support all the 'Jameses' of the world and achieve our goal of an internet free from Child Sexual Abuse Material (CSAM).
NCMEC contextualized the issue by providing insight into concerning trends in the reports of CSAM they deal with every single day. They note that continued victimization is rampant – some victims continue to appear in hundreds of thousands of files from abuse that occurred many years ago. During the COVID-19 pandemic, they’ve seen a spike in reports as certain content goes viral. Some online groups think they’re trying to help children by sharing illegal content, but they’re actually harming victims.
Microsoft's Chief Online Safety Officer spoke on the industry's perspective in this fight: “It is in industry’s best interest to protect their customers and the integrity of their services by reporting illegal content and shutting down offenders’ accounts.”
The challenges faced by industry in tackling CSAM were also brought to light. On the tech and engineering side, testing new technology is difficult because the data needed for testing is itself contraband, given the nature of the content to be detected and removed. There's also the challenge of walking the line between privacy and effective online child protection.
Google recognises there’s no one way to fight CSAM online, so they use multiple layers of protection. Google goes about this by: removing known CSAM hashes, protecting results for risky queries, detecting never-before-seen CSAM, and deterring predatory associations.
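The first of these layers, matching uploads against hashes of known illegal material, can be sketched in broad strokes. The snippet below is purely illustrative and assumes a simple exact-hash blocklist; the `known_hashes` set and function names are hypothetical, and production systems (such as Microsoft's PhotoDNA) instead use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known files, of the kind
# a clearinghouse might distribute to platforms. (This digest is simply
# the SHA-256 of the bytes b"test", used here as a stand-in.)
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_content(data: bytes) -> bool:
    """Exact-match lookup of an uploaded file against the blocklist."""
    return file_digest(data) in known_hashes
```

This only illustrates the lookup step: an exact cryptographic hash changes completely if an image is cropped or re-encoded, which is precisely why the industry relies on perceptual hashing for this layer.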
Crisp and Trend Micro offered advice to companies looking to join the fight. Trend Micro recommends that companies consider where they fall in the ecosystem and what their capabilities or role can be; for those with budget to spare, Trend Micro believes supporting your local hotline or INHOPE goes a long way. Crisp believes companies shouldn't try to rebuild existing technology: the gaps today are in collaboration and data, so work to fill those gaps instead of reinventing the wheel, and leverage the knowledge and information we already have. Crisp's message to employees: "If your organisation says it will join the fight against CSAM and child abuse, hold them to account. You may not stay at the organisation forever, but you can absolutely leave a legacy".
These were some of the key takeaways from discussions during the Summit:
- We need to train data sets with an eye towards the global nature of this crime.
- The importance of child participation cannot be overstated.
- There would be no CSAM online without viewers, no supply without demand. We have to also focus on reaching out to viewers and perpetrators and offering them education, therapy, and treatment.
- And lastly, we can't allow people to paint this topic as ‘too taboo’ or ‘unsavory’ - we need to be comfortable getting uncomfortable in order to properly fight CSAM.
Over 160 people joined this multi-stakeholder forum that opened the discussion on a taboo topic, each bringing knowledge from their specific field and locality. We would love to see every one of those attendees become an even more active contributor in this fight. We cannot win this fight alone.
This event simply marks an opportunity to have an open forum discussion. You don't have to wait until the next summit in 2021 to join this discussion – you can reach out to get involved at any point. If you're interested in getting more involved, leave us your contact details here to be notified about upcoming events.
Photo by INHOPE