Child sexual abuse online is a growing phenomenon. The British National Society for the Prevention of Cruelty to Children (NSPCC) reported a 70% increase in the number of ‘sexual communication with a child’ offences over the last three years. Hotlines for missing children have reported concerns that grooming is an increasing factor in disappearances, particularly in cases of children who run away. In 2020, the second most common reason for children to contact child helplines was violence, of which 3.2% concerned online sexual abuse. Worryingly, the COVID-19 pandemic has exacerbated the problem: the crisis pushed 1.2 billion children out of schools, leading to children spending many more hours online. As Europol emphasised in its 2021 Internet Organised Crime Threat Assessment (IOCTA) report, ‘during lockdowns, children spend an even larger part of their day online, which has led to a steep increase in online grooming, particularly on social media and online gaming platforms’. There is little research on grooming as a specific phenomenon in relation to children going missing. The research that does exist dates from 2014 and is based on a small sample, but in it all victims experienced online grooming that led to sexual abuse both online and offline.
CESAGRAM aims to enhance understanding of the grooming process, and more particularly of how it is facilitated by technology and how it can lead to child sexual abuse and to children going missing, in order to, on the basis of this improved understanding: (a) prevent, insofar as possible, grooming for sexual abuse that may lead to a child going missing, and (b) ensure that victims of grooming are identified and receive appropriate support before, during and after a disappearance. The project will mainly target young people aged 11-14 who are at risk of grooming, or who are already victims of grooming and at risk of going missing, as well as carers and teachers, frontline professionals (social workers, NGOs, law enforcement) and policy makers. It will support the implementation of existing legislation on child sexual abuse, the work of the European Commission’s Child Sexual Abuse Prevention Network (CSAPN) and the future European Centre against Child Sexual Abuse, as well as the relevant strategies developed by the European Commission (the Child Rights Strategy, the EU Child Sexual Abuse Strategy, and the European Strategy for a Better Internet for Children).
MKLab is responsible for the design, development and evaluation of a set of AI tools that will facilitate the prevention and detection of grooming content online, leading the Work Package on Early identification and prevention of online grooming, with particular focus on the tasks related to (a) system requirements and architecture, (b) monitoring of online spaces for grooming-related content, (c) linguistic analysis for the detection of grooming activities, and (d) AI-based risk assessment for decision support and early-warning generation.
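To make the risk-assessment and early-warning task (d) concrete, the sketch below shows one simple way such a component could be structured: upstream linguistic analysis emits counts of conversation-level signals, which are combined into a weighted score and mapped onto coarse risk tiers that could drive an alert for human review. All signal names, weights and thresholds here are hypothetical illustrations, not outputs or design decisions of the CESAGRAM project; a real system would learn these from data and operate under strict legal and ethical safeguards.

```python
# Illustrative sketch only. Signal names, weights and thresholds are
# hypothetical assumptions for exposition, not CESAGRAM project results.
from dataclasses import dataclass


@dataclass
class ConversationSignals:
    """Hypothetical per-conversation signal counts emitted by upstream
    monitoring and linguistic-analysis components."""
    requests_for_secrecy: int = 0       # e.g. "don't tell your parents"
    requests_to_move_platform: int = 0  # e.g. pushing towards private chat
    personal_info_probes: int = 0       # asking for address, school, etc.
    sexualised_language: int = 0        # flagged sexual content


# Assumed weights; in practice these would be learned from labelled data.
WEIGHTS = {
    "requests_for_secrecy": 2.0,
    "requests_to_move_platform": 1.5,
    "personal_info_probes": 1.0,
    "sexualised_language": 3.0,
}


def risk_score(signals: ConversationSignals) -> float:
    """Combine signal counts into a single weighted score."""
    return sum(WEIGHTS[name] * getattr(signals, name) for name in WEIGHTS)


def risk_level(score: float) -> str:
    """Map a score onto coarse tiers that could drive an early warning."""
    if score >= 6.0:
        return "high"    # would trigger an alert for human review
    if score >= 3.0:
        return "medium"
    return "low"


if __name__ == "__main__":
    s = ConversationSignals(requests_for_secrecy=1,
                            personal_info_probes=1,
                            sexualised_language=1)
    print(risk_level(risk_score(s)))  # weighted score 6.0 -> "high"
```

The key design point the sketch illustrates is the separation of concerns: detection components produce interpretable signals, while the decision-support layer aggregates them and escalates only above a threshold, keeping a human in the loop for any high-risk warning.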