Education: Our greatest weapon against the spread of Child Sexual Abuse Material (CSAM)
Research conducted by Facebook found that 75% of people who shared Child Sexual Abuse Material (CSAM) on its apps during July–August 2020 and January 2021 did so for non-malicious reasons, such as outrage or poor humour.
What can we learn from this?
Education on how to recognise CSAM, and on what to do if you come across it, is desperately needed. But this finding also gives us a valuable insight: all of us have the power to make a difference. By educating ourselves, our friends, family and colleagues, we have the opportunity to reduce the amount of CSAM circulating on Facebook, and perhaps other social networks too, by as much as three quarters.
Recognizing CSAM
Some of the most horrific forms of CSAM include imagery or videos which show children engaged in, or depicted as being engaged in, explicit sexual activity. But it is important to remember that this is not the only kind of CSAM. To prevent any kind of CSAM from being shared online, we need everyone to be able to recognise all of its forms.
What is considered CSAM differs from country to country. In many countries, images and videos of children who have been instructed to pose in sexualised ways are considered CSAM. This may be easier to recognise when the child is naked, but it might also include images and videos where the children are fully clothed.
Images and videos which are focused on children’s sexual organs are also illegal in many countries, regardless of how innocent the setting might be.
As another example, artificially generated material depicting children, or verbal depictions of children, that fits the descriptions above is also considered CSAM in some countries.
Make sure you know how to recognise CSAM in your country by visiting your national hotline’s website.
What to do if you come across CSAM online
If you ever come across something you suspect might be CSAM, you should always report it to your national hotline. Many platforms also offer the option to flag content as inappropriate or illegal. By doing these things, you ensure the material is checked by a professional who can then work to have it removed.
Never share, tag, or repost material you suspect might be CSAM. Whether the image has been turned into a meme, or you want to share it to show your disapproval, there is a real person behind that image who is being revictimized every time it is shared. Whatever your intentions are, the only way to respond when you come across CSAM online is to report it to your hotline, and flag it to the platform.
By educating ourselves and those around us on how to recognise CSAM and what to do when we come across it, we have the power, together, to reduce the trauma of children whose abuse is shared online.
Read more about the impact you can have by reporting CSAM here.
If you'd like to read more articles like this, click here to sign up for INHOPE Insights and Events.