
What is self-generated CSAM?

Self-generated Child Sexual Abuse Material (CSAM) is sexually explicit content created by and featuring children below the age of eighteen. These images can be taken and shared intentionally by minors, but in many cases they are the result of online grooming or sextortion.

Young people are spending more and more time online. Amplified by the pandemic, using the internet for educational, entertainment or social purposes has become the new normal. This has created a reality in which children spend an increasing amount of unsupervised time online, which in turn has facilitated a rise in cases of self-generated CSAM.

Young people's views on self-generated content

According to a 2021 survey by ECPAT Sweden, most children regard sharing nude images as a natural part of their sexual exploration. The report also revealed that while most children claim to be aware of the risks, they nonetheless often participate in the creation and distribution of intimate images. In some cases, this behaviour can be explained by the normalisation of sharing nudes and the assumed trust between the two parties exchanging the content. In other instances, it can be attributed to peer pressure, the desire to fit in, or coercion and grooming.

Distribution of self-generated CSAM

Sexting: This describes the intentional exchange of sexually explicit messages or images. Sharing nudes is by far the most common form of self-generated content among young people. However, regardless of whether the material is exchanged consensually, the creation, possession and distribution of sexually explicit images of children below the age of eighteen remain a punishable offence.

Non-consensual sharing of consensual images: While sexting is the consensual exchange of sexual content between two parties, this type of distribution refers to cases in which content is consensually exchanged but then passed on to third parties without consent. There are many reasons why this is prevalent among young people. Among boys, sharing intimate images they have received is sometimes considered a form of bonding. Distributing such content can also have strictly malicious intent, such as public shaming or revenge porn. Regardless of the intention, sharing content that was intended for only one person is not just a violation of trust but a violation of the law.

Sextortion: Sexual extortion, also referred to as sextortion, is a form of blackmail in which the perpetrator demands favours, sexual or otherwise, under the threat of distributing intimate and sexually explicit material. Perpetrators of sextortion can get hold of the content in many different ways. Regardless of how the material is acquired, the fear of the content being shared among the victim's network can cause extreme feelings of shame, anxiety and helplessness, which in turn perpetuates the cycle of abuse.

Prevention through Education

How can we approach this issue when we know that taking and sharing intimate images is normalised among young people?

Openly discussing the risks of sharing intimate images with young people is one of the first steps in approaching this issue. The evolving digital environment requires us to adapt our educational systems to prepare young people to safely navigate digital spaces. Children must be made aware that creating self-generated CSAM is not only illegal but can have devastating consequences for their safety and mental health. Guardians, parents and teachers need to be included in these conversations so that not only the children but also the support system around them is aware of the potential threats.

Help & Resources

Childline and the Internet Watch Foundation's Report Remove tool allows young people to report a nude image or video shared online so that it can be removed. This tool provides a child-centred approach to image removal and can easily be used online.

Similarly, the National Center for Missing and Exploited Children (NCMEC) supports young people in getting sexually explicit material removed online. The content can be directly reported to their CyberTipline.


If you’ve come across sexual content of yourself or another minor, contact your local hotline to have it removed.

Learn more about INHOPE and our mission.
