INHOPE Summit 2024 Recap
On October 8 and 9, INHOPE held its seventh annual INHOPE Summit in New York, sponsored by Microsoft.
The theme, 'Knowledge is Power, Reporting is Action', aimed to facilitate discussions around the value of public reporting and the importance of widespread educational initiatives. With 80 in-person attendees and 120+ virtual participants, we explored the impact of public service announcements and cross-sector strategies for effective communication and awareness-raising.
Welcome & Opening
Jenni Katzman, Senior Director of Government Affairs at Microsoft, kicked off the Summit with a warm welcome from our host, emphasising the importance of global partnerships in addressing online child sexual abuse material (CSAM). Jenni highlighted the need for cross-sector collaboration, stressing that tackling CSAM requires coordinated efforts between governments, tech companies, and civil society. Microsoft's PhotoDNA technology, a key tool in detecting and removing CSAM, was showcased as an example of how innovative solutions can drive impactful change when shared widely across industries. As we enter the age of generative AI, the company's commitment to safety-by-design and the need for modernised legislation underscored the urgency of adapting to emerging threats.
Samantha Woolfe, INHOPE’s Head of Partnerships and Network Expansion, followed with a call to prioritise proactive and sustainable solutions. She stressed that reporting should become second nature and advocated for national and global educational initiatives that lead to measurable changes in online behaviour and child protection.
A First-Hand Account
Rhiannon-Faye McDonald, Head of Advocacy at the Marie Collins Foundation, delivered a powerful testimony of her experience with technology-assisted child sexual abuse at the age of 13. What began as an innocent online conversation quickly escalated into blackmail, coercion, and assault. The impact of her images being taken and shared online continues to linger to this day. “That contact abuse had an end day. It happened one morning for a few hours, and then it stopped. These images live on. We can't uncreate them.”
Behind every statistic is a real child with lasting emotional scars, Rhiannon stressed: around 74% of victims report experiencing ongoing shame, and 54% of people mistakenly think victims are willing participants in creating ‘self-generated’ CSAM. In many cases, the child is coerced at every step of the process – down to the poses they are told to take in the images. This misconception can add to the blame victims feel, with 50% believing they are responsible for their abuse. As a result, one-third of victims hesitate to share their stories and half even deny the existence of the images. Rhiannon urged everyone to understand that these images are never ‘just images,’ and called for compassion and understanding in our approaches to tackling technology-assisted abuse.
Keynote Address
Mike Masnick, CEO and Founder of Techdirt and CEO of the Copia Institute, discussed how Public Service Announcements (PSAs) can bridge the knowledge gap between stakeholders when addressing complex issues like CSAM. He positioned PSAs as essential tools for transparency and information-sharing across the ecosystem. According to Mike, clear communication and understanding of legal requirements are critical for collaboration, as the flow of information between stakeholders underpins the fight against CSAM.
The Public's Understanding of CSAM
A dynamic panel discussion, moderated by Emily Cashman Kirstein, Child Safety Public Policy Lead at Google, brought together experts to explore how public awareness can combat CSAM. The panel, featuring Robbert Hoving, President of Offlimits, John Pizzuro, CEO of Raven, and Yi-Ling Chen from ECPAT Taiwan, discussed key themes, including:
- Peers & Trusted Adults: Many young people prefer to confide in peers rather than adults, highlighting the importance of peer-led education initiatives that empower youth to support each other.
- Impact of COVID-19: The pandemic increased children's screen time, leaving them more vulnerable to online risks. The rapid shift to digital platforms caught many parents and schools off guard, causing worrying gaps in online safety education.
- Generative AI: AI-generated CSAM complicates detection, removal, and investigation, underscoring the need for legal updates.
All panellists agreed that raising awareness and educating the public is not just about delivering information, but requires shifting perceptions and tackling common misconceptions about CSAM. Speakers highlighted the importance of tailoring messages to specific audiences, such as utilising multimedia educational campaigns that engage young people. Additionally, the panel advocated for robust partnerships between platforms, law enforcement and hotlines to exchange expertise and develop strong, united and effective approaches.
The Persuasive Effects of PSAs
Chris Cascio, Associate Professor at the University of Wisconsin-Madison, touched on how these educational campaigns can be effective in practice, presenting valuable research on the persuasive effects of public service announcements. He noted that, as a father, he had realised how many parents are unaware of online risks, and stressed the need for public education on these issues. Cascio’s research revealed three important findings: first, messages that include self-affirmation can positively change behaviour; second, warning labels can reduce the likelihood of people sharing risky content; and lastly, adolescents who are sensitive to social feedback are more vulnerable to online risks. These neuroscience-based insights show how public campaigns can be designed to trigger the right brain responses, driving meaningful behavioural change.
Government-Backed Action Against CSAM
Kristen Best, Principal Director at the U.S. Department of Homeland Security (DHS), and Jacqueline Beauchere, Global Head of Platform Safety at Snap Inc., showcased their collaborative initiatives to combat CSAM, underscoring the value of public-private partnerships. Kristen outlined DHS's new mission, which prioritises the protection of victims and combating exploitation, resulting in the creation of a national strategy for 2025 in cooperation with the Department of Justice. A significant component of these efforts includes the formation of the Australia-U.S. Joint Council, which focuses on multidisciplinary responses to child sexual exploitation, particularly in Southeast Asia.
Jacqueline discussed Snap’s active role in these efforts, highlighting their partnership in the Know2Protect campaign—the first U.S. federal programme dedicated to preventing online child exploitation. The campaign incorporates various forms of education and awareness-raising, including targeting teens through Snapchat. Through this partnership, the campaign leverages interactive tools like a fun quiz-based Snapchat Lens to teach young users about online safety.
In total, Know2Protect has achieved 250 million impressions, engaging over 180,000 people through educational presentations and leading to 75 victim disclosures. Snap has also contributed to spreading the campaign’s message through donated paid media and continuous research with young users to better inform these efforts.
How to Educate Users on Online Safety
In an insightful interview, Rhiannon-Faye McDonald and Jun Young Ro, Trust and Safety and Compliance Leader and former Vice President of Global Affairs at ZEPETO, discussed how social networks can better educate users on online safety.
Jun highlighted the challenges and opportunities in delivering these messages, emphasising innovative methods like gamifying safety education, as seen in ZEPETO's Soteria. Both speakers stressed the importance of balancing responsibility with empowerment, ensuring that users don’t feel burdened or blamed for online risks. Education was a key focus, with a strong call for mandatory online safety training in schools, similar to sex education, to help children navigate the digital world confidently. Key takeaways included:
- Communication Approaches: Avoid fear-based messaging, as it can discourage open communication and lead young people to hide issues.
- Tailored Messaging: Cultural and legal differences can complicate companies' efforts to address online safety threats. Messaging must be tailored to specific cultural contexts to remain effective.
- Timing Matters: Young people are often more open to safety messaging during times of crisis.
- Knowledge-sharing and Cooperation: Industry-wide collaboration and open-source tools are vital for providing smaller social platforms with resources to maintain consistent safety standards globally.
Tools for Self-Protection
Although preventive education is vital for online safety, many children still find themselves facing crises without a trusted adult to turn to, leaving them isolated and vulnerable. To bridge this gap, INHOPE founding members, the Internet Watch Foundation (IWF) and the National Center for Missing & Exploited Children (NCMEC), have developed essential tools that empower young people to regain control in situations where intimate images have been shared online without their consent.
Chris Hughes, Hotline Director at the IWF, talked about the development of Report Remove, a tool created in collaboration with Childline, aimed at enabling young victims to report non-consensually shared images. Originally a URL reporting service, it has been expanded to allow image uploads, leading to a significant increase in the number of victims seeking help. Recognising that mandatory age verification was a barrier for many underprivileged children, the IWF made the ID requirement optional. As a result, 87% of users now upload images without verification, enabling more victims to report harmful content.
Lauren Coffren, Executive Director of the Exploited Children Division at NCMEC, presented Take It Down, a service that helps people get images of themselves, taken when they were underage, removed from the internet. Take It Down functions entirely anonymously by detecting CSAM through a hash-based system. Together, these initiatives highlight the critical role of technology in providing innovative solutions for people to take back control of their private images.
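To illustrate the principle behind hash-based detection, the short Python sketch below compares a file's hash against a set of known hashes. This is a minimal illustration under stated assumptions, not NCMEC's actual implementation: production systems typically rely on perceptual hashes (such as PhotoDNA) that survive resizing and re-compression rather than the exact cryptographic hash used here, and the hash value shown is a placeholder.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of previously reported images.
# In practice such lists are maintained by organisations like NCMEC;
# the value below is a placeholder, not a real entry.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Compute the SHA-256 hash of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_image(path: Path) -> bool:
    """Return True if the file's hash appears in the known-hash set."""
    return file_hash(path) in KNOWN_HASHES
```

Because only hashes are exchanged and compared, a matching service never needs to receive or retain the image itself, which is what makes anonymous reporting of this kind possible.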
Balancing Education & Intervention
Sean Litton, President and CEO of the Tech Coalition, facilitated the last session, an open Q&A featuring Henry Adams, who leads Trust and Safety Partnerships and Strategy at Resolver, a Kroll Business; Amy Ulucay, Head of Minor Safety and Exploitative Content at Discord; and Ebony Tucker, Exploitation and Abuse Partnerships Lead at TikTok.
The speakers explored the balance between educating users and intervening in harmful online interactions, agreeing that both processes should work in tandem. A platform adhering to a set of safety principles should educate users while intervening in harmful activities. Tech companies are uniquely positioned to provide real-time interventions, especially since harmful content is often generated on their platforms, but doing this effectively requires solid cross-sector partnerships. Sean touched upon the value of initiatives like the Tech Coalition's Lantern programme, which allow for signal sharing across tech companies and financial institutions to prevent and address harmful behaviour online.
Reflections
We concluded the hybrid day of the INHOPE Summit 2024 with closing remarks from Samantha Woolfe. She reflected on the impactful discussions and presentations throughout the day, highlighting the need for a thoughtful approach to these complex issues. “We need to be mindful of what we are saying, but we do need to talk.” Only through strong multi-stakeholder cooperation can we create victim-centred messaging that effectively resonates with audiences around the world.
The INHOPE Summit 2024 created an invaluable platform for stakeholders to unite in addressing the challenges surrounding online child safety. Conversations emphasised the need for proactive and sustainable solutions, encouraging collaboration across sectors to raise awareness and mobilise the public. Online safety is a shared responsibility, requiring the active involvement of all stakeholders in exchanging expertise, sharing data and developing comprehensive strategies. By fostering these partnerships, we can create a safer digital environment for children worldwide.
Secure your in-person spot for the INHOPE Summit 2025 by pre-registering here.
Thank you to our Speakers & Partners
INHOPE's vital work and events wouldn’t be possible without the generous support of our annual funding partners. If your company is committed to combating child sexual exploitation and abuse online, consider becoming an INHOPE partner for 2025. Together, we can make a difference! Explore our Partnership Pack.
We'd like to express our gratitude to all of our amazing speakers who dedicated their time and resources to support the INHOPE Summit 2024. A big thank you to organisations from day one: Microsoft, the Marie Collins Foundation, the Copia Institute, Google, Offlimits, Raven, ECPAT Taiwan, University of Wisconsin-Madison, Snap Inc., Know2Protect, ZEPETO, Internet Watch Foundation (IWF), the National Center for Missing & Exploited Children (NCMEC), Tech Coalition, Resolver, a Kroll Business, Discord, TikTok, and day two: TRM Labs, Tether, Crystal Intelligence, Binance, Scotiabank, Chainalysis, OSCE, and Ofcom.
Want to get involved in child online safety? Explore our Partnership Pack.