
Unseen Threats: Online Child Sexual Abuse and Opportunities for Preventing It

If your organization is serving consumers in person in your facilities or through off-site programs, online sexual abuse may seem distant from your day-to-day operations and safety protocols. Unfortunately, as online and in-person relationships become increasingly intertwined, organizations have a growing responsibility to be aware of and guard against online abuse.

What is Online Child Sexual Abuse?

While there is no single official definition of online child sexual abuse (online CSA), also called technology-facilitated CSA, researchers, law enforcement, and nongovernmental organizations (NGOs) use these terms to describe several categories of interactions that occur through online channels such as social media, gaming platforms, texting, and other messaging or image-sharing services1:

  • Online grooming (for the purpose of engaging in additional online CSA, furthering in-person grooming efforts, or leading to in-person encounters and abuse)
  • Non-consensual sexting (sharing sexual images without consent)
  • Creation or sharing of sexual images or videos, often referred to as CSAM (child sexual abuse material)
  • Sextortion (threatening to disseminate images to extort money or additional images)
  • Sexual solicitation (for creation or sharing of CSAM or for in-person or online sexual acts)
  • Online commercial sexual exploitation (exchanging things of value for sexual images or online sexual interactions)

In a recent keynote address for abuse prevention practitioners, researchers, and advocates, Simon Bailey, CBE, QPM, DL, said succinctly: “There’s never been, to our shame, a better time to abuse.”2 To date, Child Rescue Coalition has identified 72.5 million unique IP addresses worldwide that have shared or downloaded CSAM, and reports of online abuse are increasing every year. According to the National Center for Missing and Exploited Children (NCMEC), reports of online enticement increased by more than 300% from 2021 to 2023.

New Platforms, Similar Challenges

Online CSA may look and feel different than conventional CSA due to the perceived distance and anonymity created by an online vs. in-person encounter. However, online CSA requires the same three key factors you protect against when designing your abuse prevention practices: access, privacy, and control. Online CSA offenses are also associated with many of the same victimization risk factors as in-person sexual abuse, “including parental maltreatment, bullying, other forms of victimization as well as female gender and sexual minority identity.”3

And while stereotypes persist about anonymous online predators, there is evidence that so-called stranger danger may not be the most prevalent concern. Recent studies provide “evidence that a significant amount [of] online CSA may be perpetrated by an individual who was known to the child offline.”4

The most recent findings from the National Juvenile Online Victimization Survey (NJOV) indicate that the majority of harmers (62%) were not strangers. A 2023 study investigating the role of technology in the perpetration of in-person abuse found that 23.5% of those who experienced online solicitation for sexual images or in-person sexual activity knew the perpetrator. That number grew to an alarming 79.5% among those who reported experiencing online grooming – nearly eight out of ten online grooming victims were groomed by someone they knew. And these perpetrators are not all adults.

Online Abuse and Youth-to-Youth Interactions

An estimated 30-50% of all child sexual abuse cases are perpetrated by youth under 18. A 2022 study of online CSA in the US found nearly identical results when looking specifically at abuse in the online space: in online abuse cases where the age of the perpetrator was known, between 32% and 52% were under 18.4

Additional research highlights the complexity of this problem. A 2023 study reported that among respondents who had experienced online CSA:

  • 88% of the abusive sexual imagery produced was made by youth.
  • 73% was created by the victims themselves.5

But this statistic can be misleading – victims are not willingly participating in their own abuse. While these images are sometimes created under duress, they are often shared voluntarily in the context of a consensual intimate relationship and then misused or distributed without consent.

An Alarming New Twist – AI & Deep Fakes

As we shared in our 2024 Praesidium Report, the increasing sophistication of artificial intelligence (AI) is compounding this already thorny issue. Generative AI tools can now create CSAM that is often indistinguishable from authentic imagery.

In their 2023 report How AI is being abused to create child sexual abuse imagery, the Internet Watch Foundation (IWF) indicates significant potential for the rapid growth of AI-generated CSAM. Text-to-image technology allows individuals to type a description of what they want to see into an online generator; AI software then creates the corresponding image or images. According to IWF, “technology is fast and accurate… many images can be generated at once – you are only limited by the speed of your computer.”

The analysis also provides “reasonable evidence that AI CSAM has increased the potential for re-victimization of known child sexual abuse victims” by creating new images from existing CSAM images of victimized youth, which can result in additional exploitation, bullying, extortion, and harassment.

Online Child Sexual Abuse & Your Organization – What Can You Do?

Your programs and spaces naturally allow relationships to form with and between consumers. You can’t control events happening outside of your program. Still, you can limit opportunities for inappropriate online contact between your employees, volunteers, and consumers by consistently enforcing appropriate policies and raising awareness. Click on the links embedded below for additional guidance.

  • Maintain vigilance around your existing abuse prevention and consumer protection policies, particularly related to youth-to-youth interactions, red flag behaviors, and grooming. Addressing online abuse can seem overwhelming, but keep in mind that it is often a precursor to or an extension of in-person interactions and relationships. Educate about online abuse as an expansion of your ongoing consumer protection efforts, responding to and shining a light on an increasingly interconnected environment.
  • Examine your programs to evaluate risks appropriately. Do you run programs that require or allow for online interaction? Youth development programs may offer options like Esports, virtual classrooms, podcasting, participating in TikTok challenges, etc. These can be engaging, educational, and valuable for relationship-building. However, they can also expose you and your consumers to risky online environments. Assess all facets of your programs to determine whether you’ve adequately protected your organization and your consumers.
  • Review, enforce, and routinely update your organization’s Social Media, Electronic Communication, and Outside Contact Policies. Because the online environment is constantly evolving, your policies and practices must keep pace with new platforms, technologies, and communication methods.
  • Create and enforce structures for employees’ and volunteers’ interactions with consumers online or via text. Prohibit private online communication to the extent possible. Use secure portals instead, where leadership can spot-check or monitor interactions. Text communications with minors should never be one-on-one; this limits the privacy required for grooming or other inappropriate contact.
  • Communicate your policies clearly and publicly to ensure consumers and staff know the expectations for appropriate outside and online contact.
  • Ensure you have easily accessible anonymous reporting mechanisms for employees, volunteers, and consumers.

Online CSA is a complex and pervasive risk, and mitigation may seem challenging for any organization to tackle alone. Praesidium can help you take action. Contact us for support on policy creation, risk assessment, guidance for implementation, or additional resources.


References

  1. Finkelhor, D., Turner, H., & Colburn, D. (2024). The prevalence of child sexual abuse with online sexual abuse added. Child Abuse & Neglect, 149, 106634. https://doi.org/10.1016/j.chiabu.2024.106634. Epub 2024 Jan 15. PMID: 38227986.
  2. Bailey, S. (2024, June 25-26). We Cannot Arrest Our Way Out of the Problem: The Importance of a Whole System Approach to Tackling Child Sexual Abuse [Keynote]. Envision 2024: The Future of Prevention, Moore Center for the Prevention of Child Sexual Abuse, Washington D.C., US. https://publichealth.jhu.edu/moore-center-for-the-prevention-of-child-sexual-abuse/what-we-do/annual-envision-conference
  3. Finkelhor, Turner, & Colburn (2024), p. 2.
  4. Jeglic, E. L., & Winters, G. M. (2023). The Role of Technology in the Perpetration of Childhood Sexual Abuse: The Importance of Considering Both In-Person and Online Interactions. Children, 10(8), 1306. https://doi.org/10.3390/children10081306 (emphasis added).
  5. Finkelhor, D., Turner, H., Colburn, D., Mitchell, K., & Mathews, B. (2023). Child sexual abuse images and youth produced images: The varieties of image-based sexual exploitation and abuse of children. Child Abuse & Neglect, 143, 106269.