On 21 September 2022, Bloomberg published a far-reaching report revealing how child predators systematically stalk children on Twitch, Amazon's live-streaming platform. According to an analysis of Twitch profile data, screenshots, and videos conducted by a researcher between October 2020 and August 2022, at least 1,976 adult users systematically found and followed young users who live-streamed on Twitch during that time. These adult users, each with more than 1,000 children on their follow list, followed a total of 27,916 children and manipulated them into performing everything from obscene dances to explicit sexual acts on live broadcast. The report also notes that in July 2022 alone, an average of 673 children per day were targeted by abusers on Twitch.
Recently, following this criticism, Twitch announced changes intended to improve the safety of young users. Declaring that harassers are not welcome on the platform and will never be tolerated, Twitch introduced new safeguards, including mandatory phone verification to make it harder for children to open accounts and strengthened technology for detecting and closing Twitch accounts belonging to people under the age of 13.
However, child abuse is far from being a Twitch-specific problem.
How deep do the roots of child abuse go on the Internet?
According to a study by the Internet Watch Foundation (IWF), the number of cases in which children between the ages of seven and ten were manipulated by adults into recording obscene acts of themselves online has increased by two-thirds in the last six months: the IWF received 12,000 such reports in the same period last year, compared with 20,000 in the first six months of 2022. The report further states that this disturbing global ‘trend’ has grown by 360% since the first half of 2020, when Covid-19 lockdowns were introduced.
UNICEF's official website reports that 80% of children in 25 countries feel at risk of online sexual abuse and exploitation.
The National Center for Missing & Exploited Children (NCMEC), to which social media networks are obliged to report child sexual abuse material (CSAM), states that it received nearly 70 million CSAM reports in 2019, and that a large proportion of this material circulates on social media networks, especially Facebook. NCMEC also notes a 15,000% increase in reported abuse cases over the last 15 years.
Meanwhile, a survey of 40,000 school-age children conducted by the National Society for the Prevention of Cruelty to Children (NSPCC) found that 25% of respondents had live-streamed with a stranger on a social media platform. Another survey, by the Canadian Centre for Child Protection, found that 67% of CSAM victims said the online dissemination of sexually explicit images of themselves affected them as much as the physical abuse itself, because the images were ‘permanent’.
A survey of 2,600 respondents found that 16% had experienced at least one form of sexual abuse before the age of 18, most often between the ages of 13 and 17. The forms of abuse included non-consensual sexting; ‘grooming’, defined as a friendship an adult or older person forms with a minor with the intention of a future emotional and/or sexual relationship; revenge porn; sexual harassment; and commercial sexual exploitation. According to the report, girls and children who do not conform to gender norms are the most likely to be targeted by online sexual harassers, though 1 in 13 boys also fall victim. Worse still, the report found that 62% of perpetrators were the victims' dating partners, friends, or acquaintances, and two-thirds of perpetrators were themselves children under the age of 18.
When it comes to child abuse and social media, another platform that has recently come to the forefront alongside Twitch is TikTok, owned by China-based ByteDance. TikTok's younger user base compared to other social media platforms makes it a convenient tool for abusers, and numerous investigations into the platform are under way, especially in the USA.
According to the Financial Times on 15 April 2022, the number of child abuse investigations opened by the US Department of Homeland Security against TikTok increased sevenfold between 2019 and 2021. NCMEC reports that TikTok reported approximately 155,000 videos last year, while TikTok states that 96% of content violating its safety rules covering non-adult users is removed before anyone sees it.
TikTok also recently came under fire for providing a large cache of uncensored images of sexually abused children as a reference guide to employees of the third-party companies that manage and moderate the platform's content: TikTok was using images of child abuse that had been removed from the platform to train its moderators. While it is not known how many people had access to these images, TikTok spokesperson Jamie Favazza said that its training materials have ‘strict access controls and do not include visual examples of CSAM’.
Another platform notorious for child abuse is Instagram. Instagram, which reported approximately 3.4 million pieces of CSAM to the NCMEC last year, was named the UK's ‘leading’ platform for child abuse in 2019 by the NSPCC, the UK's leading children's charity. The study, covering an 18-month period, found more than 5,000 recorded offences of ‘sexual contact with children’ on the platform during that time, and a 200% increase in recorded cases of Instagram being used to target and harass children. The report also noted that the targets were mostly children between the ages of 12 and 15, though in some cases victims were under the age of 11.
Describing the data as ‘overwhelming proof that keeping children safe cannot be left to social networks’, Peter Wanless, CEO of the NSPCC, said that we cannot wait for the next tragedy before technology companies act, and that it is ‘vital’ for Instagram to design the basic protections it offers young people more carefully.
Are any precautions being taken?
Of course, social media platforms do take certain measures to prevent online child harassment; the main reason they remain a target of criticism is that they persistently fall short, almost as if they were permitting children to be sexually harassed online. Although the rules of most platforms prohibit users under a certain age from creating an account, we all know this so-called security is easy to bypass: all you have to do is lie about your date of birth or use a parent's phone for verification. I am sure many of us who were children in the early years of Facebook's popularity took these shortcuts. At least I did, I admit it.
Recently, another tech giant besides Twitch announced that it would tighten safety measures for children: Meta. Meta will automatically apply stricter default privacy settings to child accounts on Facebook and will make changes to prevent ‘suspicious’ adults from messaging teens on Facebook and Instagram.
One of the toughest laws targeting online sexual harassment of children is preparing to come into force in the United Kingdom. The Online Safety Bill, expected to be groundbreaking in this regard, will introduce new rules forcing social media and other websites based on user-generated content to remove ‘harmful content’ involving children from their platforms. Under the bill, all platforms will be obliged to scan for and remove illegal content.
Who bears the greatest responsibility?
The number of online child abuse cases keeps rising even as the social media giants continuously take new measures, promise further ones, and declare that ‘abusive behaviour’ will never be tolerated on their platforms. This shows that the problem neither begins with social media platforms nor will it end with them.
At this point, the greatest duty falls on us: mothers, fathers, sisters, and brothers. For this reason, it is very important to make sure your child does not sign up for social media platforms you know they are too young for; if necessary, install monitoring software on their smartphone that shows how much time they spend in which apps; set rules about whom they may communicate with online and for how long; talk to your child about online safety, repeatedly if necessary; and, most importantly, if your child is sexually harassed online, never blame them, and seek help from an expert in the field.