In the past two years, reporting of child sexual exploitation and abuse online has reached its highest levels, with the US National Center for Missing and Exploited Children (NCMEC) processing 60,000 reports of online child sexual abuse every day, according to a recent report.
Another report published in 2020 stated that India has seen a 95% increase in internet searches for child sexual abuse materials during the coronavirus disease (Covid-19) pandemic.
The findings of the 2021 Global Threat Assessment report by the WeProtect Global Alliance show that the scale of online child sexual exploitation and abuse is increasing so rapidly that a step change is urgently required in the global response to create safe online environments for children.
The 2021 Global Threat Assessment report details the scale and scope of the threat of child sexual exploitation online and aims to encourage action on the issue to reduce the risk to children and prevent abuse before it happens.
As part of the report, Economist Impact conducted a global study of the childhood experiences of more than 5,000 young adults aged 18–20 across 54 countries. More than one in three respondents (34%) had been asked during their childhood to do something sexually explicit online that they were uncomfortable with.
The findings in the report revealed that the scale and complexity of child sexual exploitation and abuse is increasing and is outstripping the global capacity to respond.
The report also surveyed technology companies, finding that most use tools to detect child sexual abuse material (87% use image ‘hash-matching’), but only 37% currently use tools to detect online grooming.
The Covid-19 pandemic is undeniably one contributory factor behind the spike in reported incidents. The rise in child ‘self-generated’ sexual material is another trend challenging the existing response, with the Internet Watch Foundation observing a 77% increase in such material from 2019 to 2020.
“The internet has become central to children’s lives across the world, even more so as a result of the Covid-19 pandemic. Over the past two years, we have observed an increase in the scale and complexity of child sexual abuse online,” said Iain Drennan, executive director of WeProtect Global Alliance, urging an enhanced global response to create a safer digital world for children.
While a strong law enforcement and judicial response is essential, a truly sustainable strategy must include active prevention of abuse. There is a need to ensure the creation of safe online environments where children can thrive.
“There is enough evidence to infer that lockdowns and disruptions caused by Covid-19 have significantly contributed to the spike in online child sexual exploitation and abuse. The easy access to abusive content involving children is deeply concerning,” said Tarang Khurana, chair of the Confederation of Indian Industry’s Young Indians Project Masoom.
“The universality of the problem requires collaborative action, and we all have a shared responsibility to end this. Along with strong law enforcement, it is imperative to create empowered individuals, societies, and communities, who come together to uproot this systemic menace,” Khurana added.
The Economist Impact survey also demonstrated that girls, and respondents who identified as transgender or non-binary, LGBQ+ and disabled, were more likely to experience online sexual harm during childhood.
It also showed that respondents who identified as racial or ethnic minorities were less likely to seek help.
The report revealed that 57% of female and 48% of male respondents reported at least one online sexual harm. Among respondents who identified as transgender or non-binary, 59% experienced an online sexual harm, compared with 47% of cisgender respondents; 65% of respondents who identified as LGBQ+ experienced an online sexual harm, compared with 46% of non-LGBQ+ respondents.
It also showed 57% of disabled respondents experienced an online sexual harm, compared to 48% of non-disabled respondents. It further showed that 39% of racial or ethnic minority respondents would delete or block a person sending them sexually explicit content, compared to 51% of non-minority respondents.
The report also showed that 17% of racial or ethnic minority respondents spoke to a trusted adult or peer about the content, compared to 24% of non-minority respondents. The questionnaire asked respondents about their exposure to online sexual harms and their risk factors during childhood.