Events & Campaigns
Webinar Recap: A case study of industry and hotline collaboration
Stakeholder collaboration is a critical aspect of successfully combating child sexual abuse material (CSAM) online. INHOPE welcomed Vincent Courson, Trust & Safety Partnership Manager at Google, and Yann Lescop, Legal Advisor for Trust and Safety at Point de Contact, for an internal webinar showcasing their successful hotline/industry partnership.
Vincent Courson, who has over ten years of experience in T&S policy enforcement, comms, and partnership work, outlined how the Google Child Safety Toolkit can help hotlines process reports more effectively. Yann Lescop elaborated on how the toolkit benefits Point de Contact in practice.
What is the Google Child Safety Toolkit?
The Google Child Safety Toolkit is a set of two tools (CSAI Match Tool & Content Safety API) made available for free by Google and YouTube to qualifying partners, helping them strengthen their internal content moderation in detecting and removing CSAM online. The tools are designed to support already established analyst and content moderation processes by helping analysts prioritise incoming reports, improving the efficiency of review queues.
What is included in the Toolkit?
CSAI Match Tool
- Hash matching software used for identifying and matching known abuse videos.
- Provides a yes-or-no answer as to whether the content has been previously identified.
- The tool is designed for scalability, enabling large volumes of videos to be sent, and providing results quickly.
Content Safety API
- A machine learning classifier that provides partners with the likelihood that a piece of content (image or video) is CSAM.
- It is then up to the partner to decide which content to review manually, to determine if actions need to be taken.
- Large volumes of images or videos can be sent through this tool.
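Conceptually, a partner might use such a likelihood score to triage its review queue so that the highest-risk items are reviewed first. The sketch below illustrates this in Python; the `classify` function is a hypothetical stand-in for a classifier call and does not represent the actual Content Safety API or its interface.

```python
import heapq

def classify(report: str) -> float:
    """Hypothetical stand-in returning a likelihood (0.0-1.0) that content is CSAM.

    A real integration would call the partner's classifier here; these
    fixed scores exist only to make the example runnable.
    """
    scores = {"report_a": 0.15, "report_b": 0.92, "report_c": 0.55}
    return scores.get(report, 0.0)

def build_review_queue(reports):
    """Order incoming reports so the highest-likelihood items are reviewed first."""
    # heapq implements a min-heap, so negate the score to get descending order.
    heap = [(-classify(r), r) for r in reports]
    heapq.heapify(heap)
    return [report for _, report in sorted(heap)]

queue = build_review_queue(["report_a", "report_b", "report_c"])
# The highest-likelihood report ("report_b") lands at the front of the queue.
```

As the recap notes, the score only orders the queue; the decision to act on any item still rests with a human analyst.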
Point de Contact and the Google Child Safety Toolkit
Yann Lescop outlined how Point de Contact has benefited from the Safety Toolkit by integrating the Content Safety API into their daily analyst work. The tool is embedded in every window the analysts work in and notifies them whenever submitted content has been assessed, displaying the result on a scale from one to five with a white-to-red colour gradient. This helps analysts identify the content most likely to be CSAM and prioritise their reviews accordingly.
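The visual prioritisation described above could be sketched as a simple mapping from a one-to-five assessment score onto a white-to-red gradient. This is a hypothetical illustration, not Point de Contact's actual implementation:

```python
def score_to_colour(score: int) -> str:
    """Map a 1-5 assessment score onto a white-to-red gradient (hex RGB).

    Score 1 renders as white (#FFFFFF), score 5 as pure red (#FF0000);
    intermediate scores interpolate linearly between the two.
    """
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    # The red channel stays at 255; green and blue fall from 255 to 0.
    fraction = (score - 1) / 4
    gb = round(255 * (1 - fraction))
    return f"#FF{gb:02X}{gb:02X}"

# score_to_colour(1) -> "#FFFFFF" (white), score_to_colour(5) -> "#FF0000" (red)
```

A gradient like this lets an analyst gauge a report's likely severity at a glance before opening it.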
The tool not only facilitates more efficient content processing but also improves working conditions for analysts by limiting their exposure to harmful material. With its help, analysts can prepare themselves mentally before reviewing the most disturbing material, or avoid it entirely on days they prefer not to be exposed to this type of content. Both speakers emphasised, however, that the toolkit cannot replace analyst work: human content review remains critical, with the tool acting as support to alleviate some of the hotlines' workload.
Is your hotline interested in using the Google Child Safety Toolkit? Click here to learn more.