In the shadows of the digital age lurks a sinister menace, a chilling phantom that relentlessly preys upon the most vulnerable among us - our children. The National Human Rights Commission (NHRC) has now proposed guidelines to confront the malevolent proliferation of Child Sexual Abuse Material (CSAM) on the internet, a battle that tests the very essence of our humanity.
As we delve into this pressing concern, we uncover the stark reality of 450,207 reported cases of CSAM in 2023 alone, a figure that should send shivers down the spine of every conscientious soul.
The Silent Terror of CSAM
CSAM's relentless rise across the globe is nothing short of staggering. In 2021, more than 1,500 instances of publishing, storing, and transmitting CSAM were formally registered. Disturbingly, this abhorrent trend shows no signs of abating: reports of CSAM surged from 163,633 in 2021 to 204,056 in 2022, and then to 450,207 in 2023. The urgency of the matter cannot be overstated, and a multifaceted approach is essential.
This deeply disturbing trend etches indelible scars on the psyches of innocent victims. The NHRC has emphasized that the implications are profound and far-reaching, disrupting the overall development of the child and leaving them traumatized and scarred for life.
A Four-Part Advisory to Combat the Darkness
The advisory proposes that each state establish a Specialised State Police Unit to identify and investigate CSAM-related offences and apprehend offenders, and that the central government establish a Specialised Central Police Unit to deal with CSAM-related matters.
“It (Specialized Central Police Unit) should consist of experts in identification and investigation of CSAM in order to focus on identifying and apprehending CSAM offenders both in the dark web and open web and developing a comprehensive and coordinated response of investigation and law enforcement agencies towards monitoring, detection, and investigation of CSAM,” the advisory read.
NHRC's four-part advisory doesn't merely highlight the issue but provides a blueprint for combating it effectively. One of the core recommendations pertains to terminology: amending the Protection of Children from Sexual Offences (POCSO) Act, 2012, to replace the outdated term 'child pornography' with 'Child Sexual Abuse Material (CSAM)'. The power of words is undeniable, and this change reflects the evolving nature of the problem.
Additionally, the advisory implores the government to redefine 'sexually explicit' in the Information Technology Act, 2000, ensuring the timely identification and removal of online CSAM. The need for harmonizing laws across jurisdictions in India for arrests and enhancing punishments is stressed. These legal reforms are crucial to strengthening the fight against CSAM.
1. Redefining Terminology
The NHRC recommends a paradigm shift in our approach. They suggest that the term 'child pornography' be consigned to history and replaced with the more apt 'Child Sexual Abuse Material (CSAM).' This alteration is not merely semantic but reflects the gravity of the issue.
2. Legal Changes and Harmonization
The Centre has been implored to harmonize laws across jurisdictions, ensuring that arrests are made promptly. It is imperative that the legislative framework matches the gravity of these offences, strengthening punishments to deter potential wrongdoers.
3. Detection and Investigation
The NHRC has recommended the establishment of Specialized State Police Units across all states for the detection and investigation of CSAM cases. Furthermore, the creation of a Specialized Central Police Unit at the national level, staffed with experts, is crucial for tracking down CSAM offenders, whether they hide in the dark web or the open web.
4. Data Collection and Technology
The NHRC encourages the formation of a national database of CSAM to gather data on trends, prevalence, and other parameters. This invaluable repository can guide future interventions. Additionally, technology, including hotspot mapping and predictive policing, must be employed to identify repeat offenders.
Sensitization, Awareness, and Victim Support
For a holistic approach, the advisory emphasizes sensitization at various levels. Prosecutors, judges, and police officials should undergo training on children's rights in the digital environment, recognizing the extent and manifestations of CSAM, and implementing child-friendly procedures during investigations. The involvement of schools, colleges, and other institutions in raising awareness about online child abuse is highly recommended.
Victims of CSAM need psycho-social care centres, a place where they can begin to heal. These centres are pivotal in offering support to those who have endured the horrors of CSAM.
Regulating the Internet: A Collective Responsibility
To root out CSAM from the internet, the NHRC has made recommendations for internet intermediaries, including social media platforms, OTT applications, and cloud service providers. The establishment of a CSAM-specific policy and the proactive use of technology, such as content moderation algorithms, are urged to detect and remove CSAM from their platforms.
Collaboration is key, and these platforms are encouraged to expedite the removal of CSAM content. Partnerships between the platforms and the government are vital for the real-time sharing of information concerning CSAM content on the internet.
The battle against CSAM is not the sole responsibility of the NHRC, the government, or law enforcement agencies. It is a collective responsibility, a societal duty to protect the most vulnerable amongst us - our children. We must heed the NHRC's advisory, for it is a roadmap to confront this chilling phantom and safeguard the innocence of our future.