
AI-generated child sexual abuse material and the POCSO Act

Introduction:

AI-generated child sexual abuse material is a subset of the larger discourse on ethics and AI. There is a disturbing and growing trend of AI being used to create such material. Though it may not involve real children, since the images are AI-generated, the existence of technology that permits such untethered and unregulated creation raises the question of where the line between the illegal and the merely immoral lies.

Recent case:

In a pioneering metaverse-related development earlier this week, British authorities began looking into an alleged gang rape within a virtual reality game. The Daily Mail UK reported that online strangers attacked the digital character, or avatar, of a 16-year-old girl, leaving her deeply distressed. Although the victim, who was wearing a headset, sustained no physical injuries because the assault occurred entirely in the digital realm, UK authorities are now urging the enactment of laws to address a surge in sexual offenses within this domain, asserting that law enforcement strategies must adapt to prevent individuals with malicious intent from exploiting children through new technologies.

Laws in India:

While UK authorities are urging their government to enact laws regulating this issue, it is worth examining whether India already has laws to safeguard children when their personal boundaries are shamelessly violated in the metaverse. Does our law punish a person for 'using a child' for pornographic purposes, or for storing pornography that 'involves a child', when that child may not actually exist?

The answer seems to lie in the definition section of the POCSO Act, where "child pornography" is defined to cover not only depictions of real children but also computer-generated images that merely appear to depict a child.

Section 2(1)(da), inserted by the 2019 Amendment to the POCSO Act, reads: "'child pornography' means any visual depiction of sexually explicit conduct involving a child which include photograph, video, digital or computer-generated image indistinguishable from an actual child, and image created, adapted, or modified, but appear to depict a child."

This comprehensive and admirably foresighted definition may help regulate AI-generated child sexual abuse material in our country, bridging the gap between immorality and illegality.

Conclusion:

The alarming emergence of AI-generated child sexual abuse material has become a focal point in the broader discourse on ethics and AI, underscoring the pressing need for legal frameworks to address such offenses.

The British authorities’ call for legislative action in response to the metaverse incident has prompted a critical examination of India’s legal landscape. While the UK grapples with adapting laws to combat the exploitation of children through new technologies, India’s existing legal framework, particularly the 2019 Amendment to the POCSO Act, presents a comprehensive definition of “child pornography.” This definition, encompassing visual depictions of sexually explicit conduct involving a child, whether real or computer-generated, showcases a forward-thinking approach that may serve as a model for addressing the intersection of immorality and illegality in the realm of AI-generated child sexual abuse material.
