The digital age
The rise of social media has transformed how we interact with our digital environment. The primary screen we engage with is no longer the television but the smartphone. We are no longer constrained to broadcast programming; we have access to virtually anything we want at the touch of a screen. These new digital frontiers have improved our quality of life through the information we can access, but they have also brought dangers. Everything we do online is recorded and fed into algorithms that continually compete for our attention, while social media networks show us unrealistic portrayals of ‘perfect’ lives. Although digital platforms allowed us to connect virtually with each other during COVID, misinformation around medical advice became rife. No demographic is more at risk than young people.
Adolescents, being in a period of rapid development, are particularly vulnerable to online dangers. Research suggests a link between ‘higher screen time’ (more than two hours a day) and symptoms of depression, poor sleep, lack of physical activity, and poorer educational outcomes in children. Parents often fail to address their children’s online autonomy, leaving children vulnerable to nefarious actors.
The rise of technology-facilitated sexual assault
Technology-facilitated sexual assault (TFSA) is a growing issue as children become digitally literate and begin interacting with one another online at a younger age. TFSA is the face-to-face sexual assault of a child or adolescent that occurs after connecting with the alleged offender online. Examples include blackmailing or bribing victims into offline sex, or organising an offline meeting with the intent to sexually assault the victim.
The increase in TFSA has coincided with the normalisation of online sexual interactions among young people. A 2020 United States study of 2,002 children aged 9–17 years found that 40% viewed the sharing of “nudes” as normal behaviour. One in three children reported having had an online sexual interaction, such as sending or receiving a nude photo, video, or sexual message, most commonly on platforms such as Snapchat, Instagram, and Facebook. These interactions were more prevalent in the LGBTQ+ community. As children are exposed to sexual content, they become both more likely to be victims of TFSA and more likely to perpetrate it.
A 14-year retrospective audit conducted by the Victorian Forensic Paediatric Medical Service (VFPMS) found that TFSA comprised a growing proportion of sexual assault cases, increasing from 4% of cases (2007–2013) to 14% (2014–2020). In 2019 and 2020, almost one in five reported cases of sexual assault in young people was technology-facilitated. Most victims were female, with an average age of 15. The majority of TFSAs occurred at the first face-to-face meeting, following a period of online communication; some involved less than 24 hours of chat, but most involved weeks. In one-third of TFSA cases the assault occurred at the alleged offender’s residence, and in another third it occurred in a public space (public parks, public toilets). Offenders ranged in age from similar to the victim to a maximum of 26 years older. The overwhelming majority of alleged assaults were medically high-risk (carrying the risk of sexually transmitted infections and unwanted pregnancy), involving penile penetration (vaginal, oral, and/or anal). No condom was used in at least three-quarters of cases.
The technology platforms associated with these TFSA incidents have changed over the years, likely reflecting their popularity. Facebook was the predominant platform initially. From 2014 onwards, dating apps began to feature, despite most being age-restricted (18 and over). From 2017 to 2020, most TFSA cases were associated with Snapchat, an app available to children 13 years and over that allows time-limited image sharing. Given that many sexual assaults go unreported, these data likely represent only the tip of the iceberg.
The ‘perfect storm’ theory
The findings of the VFPMS study led to the ‘perfect storm’ theory: the nature of the online environment makes TFSA a more likely occurrence.
Online environments can induce a disinhibition effect in children. They are more open than they might be in person, hastening the progression of relationships and making it harder to see the other person as a real-life stranger. Offenders take advantage of the relative anonymity the internet affords, building trust under false pretences and enabling deception. This creates the perfect conditions for children to come into contact with offenders. Underreporting masks the problem further: children who experience online sexual interactions are often embarrassed and afraid to report them. Because of these ‘reporting barriers’, the offender faces no consequences and is free to continue offending.
Unwanted sexual interactions have negative consequences for children. The combination of coercion, blackmail, self-blame, fear, and anxiety perpetuates the trauma experienced. This may lead to self-harm, suicidal ideation, and long-term issues including difficulty trusting others, distorted body image, and traumatic legal processes. Stronger safeguards are needed to minimise sexual harassment online, especially for children and adolescents.
Issues with current methods to counter the problem
Many methods exist to counter the growing problem of digital sexual harassment and abuse, but they are often not enough. Most children prefer using online safety tools (block, report, mute, or delete) to offline support systems (parents, peers, healthcare providers) when dealing with aggressors and unwanted contact. The United States study found that 83% of children used block, report, or mute in response to unwanted online sexual interactions, but only 37% told a parent, trusted adult, or peer about it. Although many more children choose to block than to report, a United Kingdom study found that only 17% of children reported unwanted sexual content to social media platforms. Children viewed blocking as a faster way to protect themselves; reporting takes more time and forces them to dwell on the interaction. There is also a widespread view that reporting offenders is futile: a third of children did not trust existing complaints processes. Even simple ramifications for offenders can take up to a week to come into effect, if platforms take any action at all, and in the meantime offenders remain free to prey on vulnerable children, reinforcing the perception that reporting has limited effect.
Embarrassment and shame are the primary barriers stopping children from engaging their offline support systems. Children fear anger and disappointment from their parents and attempt to hide their online interactions. Concerningly, some children do not view what they experienced as ‘rape’ because they willingly engaged with the offender. LGBTQ+ children are at heightened risk due to perceived societal disapproval.
Nevertheless, social media platforms themselves are the first line of protection. They need to provide robust support systems and information on staying safe from online sexual interactions. Platforms should improve their age verification and identity checks to prevent young children from creating accounts, and should give young users safety information on how and when to block and report sexual aggressors. Social media companies should make both blocking and reporting easy, accessible, and transparent, so that young people understand the process, how long it takes, and the final outcome of their report, and feel that reporting made a difference. While platforms should improve their processes, the responsibility also lies with the community to help our children.
How to educate and reframe stranger danger
Parents, schools, and healthcare providers need to establish communication patterns in which children see them as a viable offline support system for online sexual interactions. It is essential for parents to establish a climate of trust and have ongoing discussions with their children about the benefits and risks of online activities. Preventing online sexual abuse is a matter of awareness and response: knowing what children are doing online and how they are vulnerable, then helping them learn to respond when their wellbeing is threatened. It is important to explain parental expectations around online safety and to involve children in setting rules, ensuring understanding and instilling empowerment and autonomy. Parents need to stay non-judgmental, understand their children’s motivations, and make sure children know that abuse is never their fault. Primary and secondary schools also need to educate young people, reframing online stranger danger and providing strategies to combat it, such as meeting an online contact only in a public place and telling someone when and where the meeting will occur.
Healthcare providers should improve their knowledge of popular platforms such as TikTok, Snapchat, and Instagram; they can gain a better understanding by using and exploring the apps themselves. In addition to the routine adolescent HEEADSSS screen, healthcare providers need to normalise and become comfortable asking young people about the online platforms they use, their patterns of usage, and the online contacts they encounter. It is also critical that we understand the risks for vulnerable groups such as the LGBTQ+ community.
Together, we must all promote the understanding that even brief online interactions are still interactions with real-life strangers. Rather than focusing on quantifying and limiting children’s screen time, we should encourage healthy engagement with online networks. As we adapt to the ‘new normal’ of universal social media, we must equip children with strategies that allow them to engage and connect with others in a protected and safe manner.
Many thanks to Janine Rowse and Jo Tully for access to their amazing work on this topic.
Adams R. Three in four girls have been sent sexual images via apps, report finds [Internet]. Australia: The Guardian; 2021. Available from: https://www.theguardian.com/media/2021/dec/06/three-in-four-girls-have-been-sent-sexual-images-via-apps-report-finds.
Center for Humane Technology. Data sheet on minors and social media [Internet]. 2021. Available from: https://assets.website-files.com/5f0e1294f002b15080e1f2ff/60ca7f5b7bfa4d1a843e627c_CHT%20Research%20on%20Harm%20to%20Minors%20Fact%20Sheet%20-%20Google%20Docs.pdf.
Hamilton-Giachritsis C. Everyone deserves to be happy and safe: a mixed methods study exploring how online and offline child sexual abuse impact young people and how professionals respond to it. NSPCC; 2017.
Thorn research and Benenson Strategy Group. Responding to online threats: minors’ perspectives on disclosing, reporting and blocking [Internet]. United States: Thorn; 2021. Available from: https://info.thorn.org/hubfs/Research/Responding%20to%20Online%20Threats_2021-Full-Report.pdf.
Tully J, Rowse J, Bassed R. Dark new frontier: in the online world, child sexual abuse is taking hold [Internet]. Australia: Monash Lens; 2021. Available from: https://www.monash.edu/medicine/news/latest/2021-articles/dark-new-frontier-in-the-online-world-child-sexual-abuse-is-taking-hold.