Distribution of Child Sexual Abuse Material continues unchecked across WhatsApp & Telegram during the lockdown


India, October 9, 2020: According to a report by CyberPeace Foundation (CPF), an award-winning civil society organization and think tank of cybersecurity and policy experts, the distribution of Child Sexual Abuse Material (CSAM) continues unchecked on end-to-end encrypted chat platforms such as WhatsApp and Telegram, owing to the high level of anonymity and discretion these platforms offer. The study found over 100 instances of dissemination of CSAM in just 29 randomly selected adult pornography groups on WhatsApp, and 23 such instances across the 283 channels analysed on Telegram.

CyberPeace Foundation undertook this global study between June and July 2020 to assess how end-to-end encrypted platforms have emerged as a new mode of CSAM distribution. The objective of the research was to specifically study and analyse the technical, legal and policy frameworks that can help prevent the proliferation of CSAM on E2EE communication services. Most countries and commentators have taken an either-or approach to proposing solutions: either monitoring of content and invasion of user privacy, or no regulation and a free flow of objectionable content at alarming levels.

The key findings of the report are:

  • CyberPeace Foundation found that a substantial number of countries have chosen to regulate encryption-based services and implement a license-based regime, but there is little literature available on the efficacy of such regimes in checking the proliferation of CSAM over the internet. 
  • On WhatsApp, CyberPeace Foundation found the following:
    • Another round of investigation was conducted, drawing on findings and reports made to the platform in 2019. In early June 2020, it was found that 110 out of the 182 WhatsApp group links that had been shared with the platform in 2019 were still active and operating. Many of the groups that were directly reported to the platform were either adult pornography groups or had pornographic group icons. Almost all groups that had a clear CSAM group icon or description were removed, while groups with obscene pictures and names like “sister’s rape”, “only rape”, “virgin sister’s rape” etc. remained active. 
    • Out of the total 1299 groups for which data was collected, 215 groups were no longer active (judging by the old dates on which the invite links had been posted). 
    • Out of the remaining 1084 groups, 565 groups had obscene, pornographic and derogatory group names or group icons. 
    • Several group names were in languages and scripts other than English, including Hindi, Punjabi, Tamil, Spanish, and Sinhala. 
    • There were several groups for distributing rape videos (named as such) and several that solicited sex work. 
    • Out of the total sample size, only three (3) groups had clear CSAM in the group icon. 
    • 39 other groups had normal-seeming pictures of children with obscene group names/descriptions. 
  • On Telegram, CyberPeace Foundation found the following:
    • After observing the content shared on these channels for over a week, it became clear that the perpetrators had devised a standard modus operandi for distributing CSAM on Telegram, which enabled them to avoid detection, a problem exacerbated by the absence of an in-app reporting feature for individual accounts. This modus operandi included solicitation on adult pornography groups, using names or other indicators to hint at CSAM and an intention to distribute it, subject to payment. 
    • Initially, 350 publicly available links for Telegram channels were gathered. Out of these, nineteen (19) links no longer existed, and 48 channels had already been deactivated because they had been used to spread pornographic content. 
    • The remaining 283 channels had obscene and sexually explicit channel names, icons, and descriptions. Channel names were in English, Hindi, Tamil, Telugu, Spanish, and Russian. 
    • On these channels, the content could be easily classified into the following heads: 
      • Solicitation of sex work; 
      • Sexually explicit content involving adults, produced by a well-known production house; 
      • Self-recorded videos; 
      • Thumbnails of individuals appearing to be minors with links to other channels and users; and 
      • Fictional sexually explicit content. 

Talking about the report, Vineet Kumar, Founder and President, CyberPeace Foundation, said, “It’s extremely worrying that despite the claims by end-to-end encrypted platforms regarding safeguarding the cyber wellbeing of their users, the proliferation of CSAM continues unchecked and unhindered. While end-to-end encryption is an important security measure to have, it also allows malicious elements a cloak of invulnerability from law enforcement officials. As per our estimates, the distribution of CSAM has increased during the lockdown. We hope this report will draw attention to this critical problem and help lay out steps to curb it, such as implementing a national tip line to report CSAM content or creating a domestic hash register.”

To read the entire report, please visit: https://www.cyberpeace.org/publication/#1601317614298-ca2aa3e0-a2cd