
A Recap of the INHOPE Summit 2025

On 14 and 15 October, INHOPE held its eighth annual Summit in Washington, D.C., bringing together technology companies, law enforcement, hotlines, and civil society under the theme Connected Crime, Collective Action. With participants joining both in person and virtually, the Summit explored collaborative strategies to combat online child sexual abuse and exploitation across jurisdictions, sectors, and technologies.

A Welcome from Google

Emily Cashman Kirstein, Child Safety Policy Lead at Google, welcomed attendees, highlighting the recent work of Google’s child safety team. Over the past year, the team partnered with the UK’s Royal College of Paediatrics and Child Health to finalise a comprehensive review of its pubertal status guidelines. These updated, clinical, and evidence-based standards aim to strengthen Google’s child sexual abuse material (CSAM) review process and set industry-wide benchmarks. By the end of this year, the guidelines will be added to Google’s Child Safety Toolkit – a resource offering eligible partners a machine learning classifier and video hashing capabilities to help streamline child sexual exploitation and abuse (CSEA) review queues.

Kirstein stressed that collaboration with partners like INHOPE remains at the heart of Google’s approach to child online safety. “We cannot and should not do it alone,” she said, underlining the importance of events like the Summit for exchanging expertise and aligning efforts across sectors and jurisdictions.

Opening Remarks: A Call for Collective Action

Samantha Woolfe, INHOPE’s newly appointed Executive Director, opened the Summit with striking data showing the power of coordinated infrastructure: in countries with a hotline, the average CSAM takedown time is 1.4 days, compared to 40 days in countries without a national hotline. “This contrast tells a powerful story,” she stated. “It’s what happens when coordination, trust, and shared standards exist. It shows that impact comes from infrastructure, and the INHOPE network, now 26 years old, is an example of this.”

INHOPE now represents 57 hotlines worldwide and has trained 221 analysts. CSAM spreads across platforms and jurisdictions, and swift removal not only protects victims from re-victimisation but also reduces the circulation of material used for extortion, blackmail, and grooming. For the two days ahead, Woolfe urged participants to “think boldly, create, take action, and act together.”

An Interactional Approach to Understanding CSEAM

Professor Nuria Lorenzo-Dus of Swansea University and Universitat Politècnica de València delivered a keynote reframing how participants should approach detection and prevention. “A picture is worth a thousand words, but words are also worth thousands of pictures,” she stated. CSAM is not simply content to be detected and removed; it is often the product of deliberate linguistic manipulation and systematic abuse. Context – including how material is requested, incited, and evaluated – is crucial for understanding these crimes.

Her research presented multilingual data on how child sexual exploitation and abuse material (CSEAM) is discussed in the context of online grooming, harmful sexual exchanges between children, and online offender communities. Integrating quantitative and qualitative language analyses reveals nuanced insights about how groomers manipulate children into sending CSEAM and how offender communities operate.

Data shows that offender communities incite material through established propaganda techniques such as repetition, loaded language, and name-calling. Members often justify sharing CSEAM through self-defined moral codes, creating a false sense of legitimacy. Over half of content evaluations in both offender and harmful child-to-child exchanges dehumanise and objectify children, and smaller percentages focus on violence or flattery. Lorenzo-Dus emphasised that offenders use a shared lexicon to evaluate children, which strengthens community bonds and fuels sexual gratification.

Understanding these linguistic patterns is essential because words drive behaviour, normalise abuse, and maintain offender networks. Integrating these insights into broader detection, prevention, and education strategies is key to creating safer online spaces.

The Rise of CSEAM & Its Intersection with Other Digital Crimes

Building on this foundation, a panel chaired by Leonardo Real, Chief Compliance Officer at Tether, brought together Elliot Chandler, CFO at Revontulet, Staca Shehan, Vice President of the Analytical Services Division at NCMEC, and Steven Grocki, Chief at the U.S. Department of Justice, to explore how offender motivations and digital environments are evolving.

Panellists identified a shift from sexual gratification alone to motivations linked to profit, ideology, or extremism. Some international groups, such as “764,” illustrate the connection between child abuse and hate-based violence. Commercial exploitation is resurging through digital payment systems, and technologies such as AI and encryption increase offenders’ capabilities.

Key takeaways included:

  • Recruitment occurs on public social media; extortion and payment are moved to encrypted apps and peer-to-peer systems.
  • Small, repeated transactions often fall below traditional AML (Anti-Money Laundering) thresholds.
  • Gift cards, gaming currencies and cryptocurrency are common laundering methods.
  • Financial monitoring could be enhanced by AI pattern analysis to flag aggregate suspicious flows (a minimal sketch follows this list).
  • Education of children, parents, and educators is critical to early recognition and reporting of grooming and sextortion.
  • Juvenile-on-juvenile cases require careful handling to identify manipulation behind seemingly “peer” exchanges.
  • Cross-industry collaboration is vital for intelligence sharing.
  • Transaction-monitoring frameworks used in counterterrorism or fraud can be repurposed for CSAM detection.
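
The sub-threshold pattern above lends itself to a brief illustration. The following Python sketch (not drawn from any panellist’s tooling) aggregates low-value payments per sender–recipient pair over a rolling window and flags pairs whose cumulative volume looks suspicious even though every individual payment stays below a per-transaction reporting limit. The field names, limits, and window are illustrative assumptions only.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical limits: each payment sits below the per-transaction reporting
# threshold, but the pair's aggregate over a 30-day window does not.
PER_TX_LIMIT = 1_000
AGGREGATE_LIMIT = 3_000
WINDOW = timedelta(days=30)

def flag_aggregate_flows(transactions):
    """transactions: iterable of (timestamp, sender, recipient, amount)."""
    by_pair = defaultdict(list)
    for ts, sender, recipient, amount in transactions:
        if amount < PER_TX_LIMIT:          # only the "invisible" small payments
            by_pair[(sender, recipient)].append((ts, amount))

    flagged = []
    for pair, payments in by_pair.items():
        payments.sort()                    # chronological order
        for i, (window_start, _) in enumerate(payments):
            window_total = sum(a for ts, a in payments[i:]
                               if ts - window_start <= WINDOW)
            if window_total >= AGGREGATE_LIMIT:
                flagged.append((pair, window_total))
                break
    return flagged

# Toy data: ten 400-unit payments in a month never trip the per-transaction
# limit, but their aggregate does.
t0 = datetime(2025, 10, 1)
txs = [(t0 + timedelta(days=i), "acct_A", "acct_B", 400) for i in range(10)]
print(flag_aggregate_flows(txs))           # [(('acct_A', 'acct_B'), 4000)]
```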


Industry-Wide Collaborative Initiatives for a Safer Online Future

The Universal Classification Schema
Abby Roberts, Project Manager at INHOPE, introduced the Universal Classification Schema, a global framework designed to address the challenge of inconsistent classification of CSEAM across borders and legal systems. Developed in 2022 in cooperation with a multi-sectoral working group, it provides a shared language for professionals and supports certified tools and system integrations, streamlining workflows and enabling faster, more effective collaboration across jurisdictions.

“The goal of the Schema isn’t to change your workflow, it’s to help map how your systems relate to a universal standard,” Roberts explained. By standardising how material is categorised, the Schema allows law enforcement, hotlines, financial institutions, and tech companies to share intelligence efficiently, even when national definitions of illegal content differ. Since its launch, the Schema has received 500 access requests, trained 65 professionals, and is currently used by over 30 stakeholders worldwide.

Project Lantern
Sean Litton, President and CEO of the Tech Coalition, shared the latest progress of Project Lantern, highlighting how the initiative has advanced in connecting signals of online child sexual exploitation and abuse across companies.

“Many attempts span multiple platforms. Lantern helps us connect these dots.” Companies share signals such as hashes, URLs, or usernames linked to violations, enabling others to identify similar activity. In 2024, Lantern processed 296,336 signals, resulting in 7,048 CSAM items removed, 102,082 accounts actioned, and more than 12,000 URLs blocked. The Financial Sector Pilot with Block Inc., Mega, Meta, PayPal, Snap, and Western Union focused on disrupting financially motivated CSAM by enabling privacy-preserving data sharing between technology and financial companies. “We reduced the economic incentives behind CSAM,” Litton said, noting that legal and human rights assessments ensured compliance and privacy protection.
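
The signal-sharing model can be pictured with a short, purely illustrative sketch. The Python example below shows how a participating platform might check shared indicators (hashes, URLs, usernames) against its own data; the record format, field names, and matching logic are assumptions for illustration, not Lantern’s actual interface.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SharedSignal:
    # Hypothetical record: the kind of indicator and its value, e.g.
    # a content hash, a URL, or a username tied to a violation.
    kind: str      # "hash" | "url" | "username"
    value: str
    source: str    # platform that contributed the signal

def match_signals(shared, local_hashes, local_urls, local_usernames):
    """Return shared signals that also appear in this platform's own data,
    so they can be routed to a trust and safety review queue."""
    local = {
        "hash": local_hashes,
        "url": local_urls,
        "username": local_usernames,
    }
    return [s for s in shared if s.value in local.get(s.kind, set())]

# Toy example with made-up values.
incoming = [
    SharedSignal("hash", "a3f9c1", "platform_x"),
    SharedSignal("username", "offender_handle_42", "platform_y"),
]
hits = match_signals(incoming, local_hashes={"a3f9c1"},
                     local_urls=set(), local_usernames=set())
print([(s.kind, s.source) for s in hits])   # [('hash', 'platform_x')]
```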

Leveraging AI & Other Technologies to Detect Offenders & Criminal Networks

A panel chaired by Sean Litton, President and CEO of the Tech Coalition, and featuring Juliet Shen, Head of Product at Roost, Tyler Hand, Chief Compliance Officer at Block Inc., Jun Young Ro, Director of Trust and Safety Operations and Policy at VRChat, Frances McAuley, Director of Product at Resolver, and Brian Herrick, Vice President for Global Relationships and Strategic Partnerships at Thorn, explored the dual nature of AI – both as a threat and as a tool.

Offender networks now operate with organised efficiency, testing platform defences and exploiting gaps, with AI playing a crucial role by enabling offenders to flood platforms with synthetic content. On the other hand, AI tools allow safety teams to connect signals and manage workloads more efficiently. Traditional hash-based detection does not work well against new material, making AI classifiers essential for identifying unknown CSAM.
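
The gap between hash matching and classification can be sketched in a few lines. In the hypothetical Python example below, known material is caught by an exact lookup against a hash list, while previously unseen material is only surfaced when a classifier score crosses a review threshold. The scoring function is a dummy stand-in for a trained model, and real systems typically rely on perceptual hashes (such as PhotoDNA) rather than the cryptographic hash used here for simplicity.

```python
import hashlib

KNOWN_HASHES = {"<hash of previously verified material>"}  # placeholder entries
REVIEW_THRESHOLD = 0.8

def classifier_score(data: bytes) -> float:
    # Stand-in for a trained model; it returns a dummy value here so the
    # routing logic below stays runnable.
    return 0.0

def triage(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        return "match: known material"          # exact hash hit
    if classifier_score(data) >= REVIEW_THRESHOLD:
        return "flag: possible new material"    # unknown content, model-flagged
    return "no action"

print(triage(b"example bytes"))   # "no action" with the dummy scorer
```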

Panellists emphasised the importance of prevention-by-design approaches and standardised, actionable intelligence for reducing incidents and improving global safety infrastructure coverage:

  • Detection strategies should focus on actor behaviour and adjacent signals rather than payload inspection alone.
  • Law enforcement data should be operationalised to reduce investigative resources spent on synthetic content.
  • Smaller platforms must be equipped with open-source tools, localised interfaces, and playbooks for standardised reporting.
  • Industry frameworks should be evaluated for precision, timeliness, and actionability, and iterated when signals fall short.


Following the Money: Confronting the Economics of Online Exploitation

Chaired by Mick Moran, CEO of the Irish Internet Hotline, our next panel brought together Gavin Sather, Blockchain Investigator at Binance, Sally Frank, Anti-Human Exploitation Program Lead at Block Inc., and Carolina Christofoletti, CSAM Threat Intelligence Lead at TRM Labs, to discuss the re-emergence of commercial CSAM markets.

Criminals are finding new ways to profit from CSAM by creating online markets that include both genuine illegal trade and scams. Companies like Block Inc. monitor peer-to-peer transactions using keyword tracking to identify potential contact offences or grooming activity. However, without concrete evidence, reports often lack sufficient grounds for law enforcement follow-up, Frank explained. Non-delivery scams, where small payments are made for content that is never delivered, make detection even more difficult. Because these transactions are typically low in value, they often go unnoticed. Sather noted that the intent to buy still constitutes a crime, even when CSAM is not delivered, and that cryptocurrency exchanges can provide valuable data for investigations. Strengthening intelligence-sharing between financial and threat analysis companies is therefore crucial.

Key financial indicators include:

  • Low-value payments paired with contextual anomalies.
  • Minors receiving money from multiple adults.
  • Unexplained cross-border transfers.

Complete and contextual reports help law enforcement interpret data effectively. Educating investigators on financial and blockchain analysis, combined with multi-signal detection linking financial, contextual, and behavioural data, is critical to tackling these crimes.
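
As a rough, hypothetical illustration of multi-signal detection, the Python sketch below combines the indicator types listed above into a single review priority; the indicator names, weights, and threshold are assumptions for illustration, not a recommended rule set.

```python
# Hypothetical multi-signal scoring: each indicator type contributes a weight,
# and accounts crossing the threshold are queued for human review.
WEIGHTS = {
    "low_value_payment_with_context_anomaly": 2,
    "minor_receiving_from_multiple_adults": 3,
    "unexplained_cross_border_transfer": 1,
}
REVIEW_THRESHOLD = 4

def review_priority(signals: set[str]) -> int:
    return sum(WEIGHTS.get(s, 0) for s in signals)

def needs_review(signals: set[str]) -> bool:
    # No single indicator is conclusive; combinations push a case over the line.
    return review_priority(signals) >= REVIEW_THRESHOLD

print(needs_review({"unexplained_cross_border_transfer"}))          # False
print(needs_review({"minor_receiving_from_multiple_adults",
                    "low_value_payment_with_context_anomaly"}))      # True
```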

Hotline Collaboration

In a panel moderated by Abby Roberts, Carolina Piñeros, Executive Director at RedPapaz, Martyna Różycka, Hotline Manager at NASK, and Kathryn Rifenbark, Director of the Exploited Children Division at NCMEC, examined how hotlines function as the foundation of many national child protection efforts. Ensuring greater consistency and equality among hotlines across the network is essential, and INHOPE supports this through training, knowledge-sharing, and peer-to-peer exchanges, enabling hotlines worldwide to access similar resources and expertise.

In some jurisdictions, however, legitimising hotline operations remains a significant challenge. Frameworks like the EU’s Digital Services Act represent a major step forward, promoting accountability across platforms and formally recognising hotlines as key players in the child online safety ecosystem. While the framework currently applies only within the EU, panellists agreed it could inspire similar models globally, helping to establish greater transparency and guide the development of international standards.

The panellists called for harmonised regulations worldwide, emphasising that all children deserve the same level of protection. They highlighted the importance of using data trends and survivor testimonies strategically to sustain political and public support. Rifenbark concluded, “Collective action is key. We must work together to forecast what is coming and react proactively to stay ahead of offenders.”

Mapping the Regulatory Landscape

The final panel, led by Mike Tunks, Online Safety Principal at Ofcom, explored the evolving regulatory frameworks shaping online safety worldwide. Joined by Filipe Batiwale, Commissioner at the Fijian Online Safety Commission, Arda Gerkens, President of the Dutch Authority on Terrorist Content and Child Abuse Material Online (ATKM), and Patrick Goodliffe, Assistant Director at Coimisiún na Meán, the discussion highlighted how global regulators are aligning on shared principles of transparency, proportionality, and fundamental rights.

The panel noted variations in regulatory powers, as some authorities can issue direct removal orders while others oversee systemic risk mitigation. Cooperation within the Global Online Safety Regulator Network (GOSRN), founded by eSafety, Ofcom, and Coimisiún na Meán, was seen as vital for coherence and information exchange. The discussion also tackled the balance between safety and freedom of expression, with regulators agreeing that digital rights and child protection are “co-dependent pillars” of a safe online ecosystem.

Key Takeaways

The INHOPE Summit 2025 highlighted that combating online child sexual exploitation requires recognising these crimes as part of an interconnected network. Throughout the discussions, the need to move from awareness to action became clear. Speakers underscored the importance of breaking down barriers between technology platforms, financial services, law enforcement, and civil society to build the partnerships needed to address these complex, cross-sector challenges. Real progress depends on genuine collaboration across sectors that have historically worked independently.

On day two we moved into breakout sessions and workshops exploring how to strengthen cooperation in practice. Insights from these sessions showed that hands-on collaboration is key to shaping effective collective action for the future. A full event recap including insights from day two will be made available to in-person attendees only.

As INHOPE President Robbert Hoving concluded, "Alone you go faster, but together you go further." The Summit showed that the community is ready to move from dialogue to action. With strong partnerships already in place and a shared vision for the future, the next step is to turn collaboration into measurable progress that strengthens child protection worldwide.

Thank You

INHOPE would like to thank all speakers, moderators, partners, and attendees who contributed their time, expertise, and insight to the INHOPE Summit 2025. Your continued collaboration and commitment to child protection make it possible to drive meaningful global change.
