
Tech Tools for Digital First Responders

At INHOPE, we hope that one day there will be a hotline in every country. To achieve this, we need dedicated and resilient professionals around the globe willing to spend several hours a day staring at Child Sexual Abuse Material (CSAM) and other illegal content.

In part three of our series on well-being for Digital First Responders, we discuss the technology available to reduce how much traumatic content Digital First Responders are exposed to, and for how long, and to minimise the psychological impact of that exposure.

Find part one and part two here.

Reducing Exposure

Lots of technology exists to reduce the number of times an individual analyst, or different analysts, must view the same piece of content. Image hashing technology, for example, associates a unique hash value, or “fingerprint”, with an image. By adding this hash value to a hash list and deploying that list on their platforms, technology companies can detect and remove previously identified CSAM without it needing to be processed by an analyst again.

Learn more about image hashing here.
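
To illustrate the principle, here is a minimal Python sketch of a hash-list look-up. It uses an ordinary cryptographic hash (SHA-256), which only matches exact copies of a file; production systems rely on perceptual hashes such as PhotoDNA that also match slightly altered versions of an image. The hash list and function names below are illustrative, not taken from any specific product.

import hashlib
from pathlib import Path

# Hash list of previously identified material. In practice such lists are
# maintained by hotlines and shared with technology companies; this
# placeholder set is empty.
known_hashes: set[str] = set()

def fingerprint(path: Path) -> str:
    """Return a hash value ("fingerprint") for the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def already_identified(path: Path) -> bool:
    """True if this exact file is on the hash list, so it can be removed
    without an analyst having to view it again."""
    return fingerprint(path) in known_hashes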

INHOPE’s ICCAM platform uses hash technology so that once an analyst at any hotline in the network has identified an image or video as CSAM and entered its hash value into ICCAM, the content does not need to be processed by anyone in the network again.

Learn more about ICCAM here.
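
As a rough sketch of that workflow in Python (with illustrative names only; nothing here is taken from ICCAM itself), content would only be queued for review if its hash has not already been classified by a hotline in the network:

classified_hashes: set[str] = set()   # hashes already assessed by a hotline in the network
analyst_queue: list[str] = []         # content awaiting review

def route_report(content_hash: str) -> None:
    """Queue content for review only if no hotline has classified it before."""
    if content_hash in classified_hashes:
        return  # already identified: nobody in the network needs to see it again
    analyst_queue.append(content_hash)

def record_classification(content_hash: str) -> None:
    """After an analyst classifies the content, remember its hash so it is
    never re-processed anywhere in the network."""
    classified_hashes.add(content_hash)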

Reducing Harm

Before analysing content
Categorising and organising new content enables analysts and managers to anticipate and control what they are likely to encounter, when, and by whom. One way of achieving this is to include a “content type” field in reporting forms, such as those used by many INHOPE member hotlines. This can be built upon with Artificial Intelligence and machine learning tools that indicate the probability that particular features, such as nudity, are present in an image or video.
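
As an illustration of how such signals might be combined, the Python sketch below orders a report queue using a hypothetical content-type field and classifier score; the field names and the model are assumptions made for the example, not a description of any hotline's actual system.

from dataclasses import dataclass

@dataclass
class Report:
    report_id: str
    content_type: str          # e.g. value of a "content type" field on the reporting form
    nudity_probability: float  # e.g. score from a machine learning classifier, 0.0 to 1.0

def triage(reports: list[Report]) -> list[Report]:
    """Group likely-severe material together so exposure can be planned
    rather than encountered unexpectedly."""
    return sorted(reports, key=lambda r: r.nudity_probability, reverse=True)

queue = triage([
    Report("r1", "image", 0.92),
    Report("r2", "url", 0.10),
    Report("r3", "video", 0.55),
])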

While analysing content
Technology can also make the content itself less impactful, for example through greyscaling, selective blurring, or viewing material without sound. Each of these techniques can help create psychological distance between the analyst and the material they are reviewing. These tools should be personalised according to what works for each individual.
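
As an example of what greyscaling and blurring can look like in practice, here is a small Python sketch using the Pillow imaging library; the file path and blur radius are placeholders, and in practice each effect would be configurable per analyst.

from PIL import Image, ImageFilter

def soften(path: str, blur_radius: float = 6.0) -> Image.Image:
    """Return a greyscaled, blurred copy of an image for first-pass review."""
    img = Image.open(path).convert("L")  # "L" = single-channel greyscale
    # A full Gaussian blur is applied here; selective blurring would instead
    # blur only specific regions of the image.
    return img.filter(ImageFilter.GaussianBlur(blur_radius))

# Example (placeholder path, blur strength tuned per analyst):
# preview = soften("content_to_review.jpg", blur_radius=8.0)
# preview.show()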

After analysing content
Lots of technology exists to help analysts destress after intense experiences such as analysing CSAM. Apps can support the development of positive habits, such as meditating, exercising or remembering to take regular breaks. As another example, playing Tetris immediately after analysing content can prevent the brain from replaying the event and from continuing its stress-induced response.

To hear more on this, sign up for the INHOPE Summit 2021, which will be looking at Digital First Responders.

Primary Source: Presentation given by Vincent Courson, Trust & Safety specialist at Google, during our 2021 Hotline Training Meeting

20.08.2021 - by INHOPE