
Understanding the work of Digital First Responders Part 2

Day two of the INHOPE Summit was about proactive conversations and collaborations with law enforcement, key tech players and the INHOPE network.

Removing CSAM nationally: how hotlines ensure the removal of content

We kicked off by addressing how hotlines ensure the removal of child sexual abuse material (CSAM) nationally with Peter-Paul Urlaub from eco Germany. A national context is fundamental because culture, language, and the application of legislation are country-specific. In some cases, hotlines have mandates and are named in national law, which makes the role of a hotline transparent.

“We do this work because it has a direct impact and makes the game difficult for offenders,” stated Peter-Paul. In Germany, hosting providers are liable once they are made aware of reported content; the law works in favour of self-removal. Over the last three years, the average removal time was between 2.5 and 2.7 days, including weekends and holidays.

How law enforcement prioritise cases of CSAM

This topic was covered by Cathal Delaney from EUROPOL, who asked: how does law enforcement prioritise cases of CSAM?

One of the purposes of ICCAM is to ensure CSAM ends up in law enforcement databases for investigation. Within law enforcement agencies (LEA), digital first responders fill many roles, from undercover operations to victim identification, all working on the front line. For LEA, the priority is always the safety of the victim. EUROPOL uses intelligence gathered with partners to assess material against its database of over 61 million unique images and videos of CSAM and decides how to proceed. Prioritisation is essential: it allows law enforcement to act swiftly and aid a child in danger based on the information gathered.

Bridging the gap and improving information flow

Michelle DeLaune chaired our panel discussions, starting with Alexandra Evans from TikTok and David Miles from Facebook. We asked them about their approaches to preventing the appearance of CSAM on their platforms.

David shared that Facebook works on community standards and abuse prevention. “We work to develop a taxonomy. We implemented pop-ups during search from offender rehabilitation organisations, and we have a safety alert about the harm of sharing.”

Alexandra stated that TikTok has a zero-tolerance approach towards all forms of CSAM and exploitation. “We utilise the Safety by Design concept and are an open platform with an aggressive moderation strategy. Transparency is incredibly important for understanding the nature of the threat.” TikTok has been releasing transparency reports since 2019, expanding on the information provided.

Michelle posed the question, “What does your organisation do to safeguard the well-being of the Trust & Safety teams?” David mentioned that Facebook has Trust & Safety teams located in 120 countries and partners with outside companies to ensure different language expertise. “We helped partners manage the changes during work from home and COVID; everyone has access to trained counsellors on site.”

A dive into data and reporting trends

We dove into the rise of youth-generated sexually explicit content with Rebecca Sternburg from NCMEC and Chris Hughes from the IWF.

Rebecca stated that NCMEC has seen a ramp-up in blackmail and sextortion cases since 2020. Chris pointed out that the IWF saw a 77% increase in self-generated content from 2019 to 2020. The vast majority of this kind of abuse takes place in victims’ homes, and girls aged 11 to 13 are particularly targeted.

Chris reported on the steps the IWF is taking to tackle youth-generated sexually explicit content via its partnership with the NSPCC on Report Remove, a tool that allows children to report a nude image or video of themselves that has appeared online. This data influences capacity-building and expansion plans. “NCMEC works with international LEA to work out which agencies are best placed to receive CyberTipline reports and investigate them.”

Exploring the impact that tackling CSAM has on an organisation

Thereafter, we spoke with Lynette Owens from Trend Micro and Nicky Jackson Colaco from Roblox. Nicky shared the unexpected lessons learned in tackling CSAM and advised up-and-coming online consumer platforms on what to look out for. Roblox is a platform for under-13s, so there is no expectation of public-square freedom; significant protections are therefore already built in.

“What we are seeing is that users hop between platforms; there isn’t a good, safe, and secure way to share consumer data between platforms. An additional problem is that kids are getting devices at a younger age. The proliferation of new CSAM is going to be a real issue,” Nicky remarked. Imagine a world without PhotoDNA: that is still the reality for live streaming and voice platforms, where detection does not currently exist. Privacy concerns make this even more complex.

Lynette explained that content review and removal work is difficult to automate completely. “What recently comes to mind is that the organisations tackling CSAM are isolated from the other harms that are discussed in the safety world. There is a need to break down the walls and not look at CSAM in a myopic way.”

We closed the day with a word from our Executive Director, Denton Howard, on the future of the fight against CSAM. “CSAM is a global issue. By working collectively, we can be part of the global response. We need to work toward more alignment in technical standards, legislation, and other anti-CSAM measures. Our pathway to the future is based on protocols, standards, and training; network expansion; and being a voice against CSAM.”

Read about Understanding the work of Digital First Responders Part 1.

This Summit was an opportunity for open forum discussion, but you don't have to wait until the next INHOPE Summit to join in. If you'd like to get more involved, leave us your details here to receive information on upcoming events and activities.

01.10.2021 - by INHOPE