What is CSEM?
Child Sexual Exploitation Material (CSEM) refers to sexualised online content depicting minors that is exploitative in nature but does not fall within national legal classifications of child sexual abuse material (CSAM). It can also include otherwise legal images that form part of a series containing CSAM; these are treated as exploitation material because of their investigative relevance and the exploitative context in which they were generated.
Any sexual material depicting children created through abuse or exploitation must be removed from the internet. But while European legislation clearly criminalises CSAM, the situation for CSEM is more complicated. Divergence in legislation and terminology across countries can slow, and even hinder, the removal of these materials. To protect children online, we must expand our global understanding of CSEM and recognise that it constitutes an offence. Everyone working in the child protection space must be aware of the signs of exploitative behaviour, and government actors must work to address exploitation appropriately in national legislation.
What can exploitative behaviour look like?
Grooming: Befriending a minor and developing a relationship with them by gaining their trust and then slowly lowering their inhibitions towards sexual activity. Online grooming can include anything from “friendly” conversations and excessive interest in the child’s personal life to asking sexual questions, sending or requesting intimate images, and sending gifts or money.
Sexting: The exchange of sexual messages, photos or videos. It can go hand in hand with grooming, or appear to be consensual between two minors.
Sextortion: Coercing a victim into producing and sending sexually explicit images or videos through threats, gifts or manipulation. The extorted material is then often used to pressure the victim into complying with further sexual demands.
NCII: Non-consensual intimate image (NCII) abuse refers to any scenario in which intimate content is produced, published or reproduced without consent.
Live-streamed exploitation: Child sexual abuse and exploitation broadcast live to an audience through online streaming services. Offenders can often pay for, direct and view the abuse from their own homes.
INHOPE hotline analysts around the world are experts in detecting and removing CSAM, but the nature of CSEM can make it more challenging to identify and investigate. Exploitative behaviours often occur across a variety of channels, including audio, text and video content. So even in instances where clearly illegal material was generated, the behaviours leading up to the abuse, such as contextual cues or grooming, are more difficult to track.
Furthermore, across many countries in Europe and beyond, legislation does not properly define or address CSEM. In many cases, exploitation is either conflated with CSAM or not criminalised at all. To identify, process, investigate and prosecute cases of child sexual exploitation, we must work towards a harmonised definition and implementation of laws addressing child sexual abuse and exploitation material.
Learn more about the current landscape of CSAM and CSEM, their legal differences and how we must adjust our global responses to effectively tackle these crimes, in our event recap of the European Virtual Forum (EVF). The EVF aimed to identify network priorities in collectively tackling this emerging subject and expanding our global collaboration to achieve the best impact in fighting it.