
What is self-generated CSAM?

Self-generated content, or self-generated child sexual abuse material (self-generated CSAM), refers to sexually explicit images or videos that feature minors under the age of 18 and are shared online. It is an emerging trend with varying methods of production, such as consensual sexting. In some cases, children intentionally produce these images, selfies, or videos and send them to friends on social media.

Types of Self-generated CSAM

Sexting: This is the sending of intimate, sexually suggestive, or explicit messages or photographs, typically via mobile phone. Sharing images such as nudes is by far the most common form of self-generated content amongst young people. According to Thorn’s Self-generated CSAM report, kids do not view this as fundamentally bad, and 40% of teens agreed that it is normal for people their age to share nudes.

Additionally, age is key here: this content is still considered illegal even when it is shared between minors. Be mindful that a person does not need to be over the age of 18 to commit what is considered an offence.

Sextortion: Predators demand sexual favours, money, or other benefits from the minor under the threat of sharing their self-generated content. The challenge here is that this content can be identified and circulated by offenders, both online and offline, for harmful purposes.

Children who share self-generated sexual images run the risk of that content being shared and circulated both offline and online. It can also be used as a basis for sexual extortion, and minors may not fully grasp the consequences.

The scope of the issue

What we need to be cognizant of is that self-generated content does not occur because of the actions of one person. Addressing the issue starts with education: many young people do not know what CSAM is, do not realise that taking a sexual photo of oneself still counts as CSAM, and do not understand that once a child’s content is online, it may stay there forever.

The IWF’s annual report indicates that a third of child sexual abuse material on the internet is now self-generated. Most of this content is created when the victim is at home or in their room, using a mobile device, a desktop or laptop, or a webcam. Typically, the minor may not be aware that they are being photographed or recorded while using the computer. While this content can be voluntarily self-produced by young people, offenders are often dictating the act.

A UNICEF report states, “when using the term ‘self-generated’, it is important to be aware of the risk of implicitly or inadvertently placing the blame on the child who has produced the image against his/her will.”

For investigators and digital first responders, it can be challenging to analyse these images in isolation without knowing how they were produced, especially since the subject of the illegal material may not be identifiable. The IWF reports that the amount of self-generated child sexual abuse material has skyrocketed since the start of 2021: 64,278 reports have been confirmed to contain self-generated material in which children have been tricked, groomed, or coerced into abusing themselves on camera.

How can I help combat self-generated CSAM?

What action should we take to stop the spread of these harmful and illegal images? Start having open discussions with children at an early age about being active in an online environment. Monitor and supervise their activity on digital devices. Ensure that they have proper security settings on social media sites to protect their privacy and keep them out of danger.

Report Remove, a tool from Childline and the Internet Watch Foundation, allows young people to report a nude image or video shared online so that it can be removed. The tool provides a child-centred approach to image removal that can easily be done online.

If you’ve come across sexual content of yourself or another minor, contact your local hotline to have it removed.

