Virtual Child Sexual Abuse Materials: A New Challenge in Online Child Protection

In the first half of 2024, analysts at the Internet Watch Foundation, the world's largest child sexual abuse material (CSAM) reporting service, recorded a 6% increase in reports involving AI-generated material compared with the previous 12 months. The vast majority (99%) of this material was found on publicly accessible networks, where anyone can reach it.

The expansion of artificial intelligence and the use of new technologies in committing child sexual abuse have gained momentum globally. AI-generated child sexual abuse imagery has advanced to the point where analysts now receive videos so realistic they are nearly indistinguishable from recordings of real abuse.

"One of the most disturbing materials we've analyzed since the launch of the reporting service on www.siguronline.md was a photo file of a 2-year-old child being sexually abused by an adult through direct sexual intercourse. This photo was AI-generated, but the content appeared so realistic that it was extremely difficult to determine whether it was a real or virtual image. Child abuse is abuse in any case, even if the image is created using technology. The most shocking part is realizing how perverse the imagination of the person who created such images can be and how easily they can be accessed online," says Victoria Gribineț, an analyst at www.siguronline.md.

Virtual child sexual abuse materials represent an alarming trend in child protection in the online environment for several reasons:

  • They tend to be more severe, depicting younger children engaged in sexual activities, including humiliating sexual acts, physical violence, bodily harm, or torture.
  • The technologies used to generate online abuse materials can be downloaded from the internet and used offline, making it difficult to trace perpetrators.
  • AI-generated abuse materials can re-victimize children, as many of the photos and videos are created from the identities of real children.

In Moldova, 79 links to websites or virtual child sexual abuse materials were reported between 2023 and 2024. Of these, 25% were AI-generated, while the rest consisted of comic or animated materials depicting sexual abuse. Together they comprised more than 10,000 AI-generated or anime-style images and videos.

Elena Botezatu, Executive Director of the International Center "La Strada," emphasizes that in 26 European Union countries, digital child sexual abuse materials generated by AI or other technologies are illegal. "This reflects a different perspective on how child pornography crimes are perceived in these countries. It does not matter whether real children are depicted or whether the materials are created with technology. What matters is that those who produce, distribute, or view such images have a sexual interest in children, which is illegal. The legal status of these materials remains a matter of contentious debate in Moldova, where child sexual abuse materials generated by technology are not yet criminalized. The lack of uniform legislation creates challenges in reporting and removing abuse materials from the internet, making the online space in our country conducive to the expansion of online abuse," says Elena Botezatu.

According to Iurie Roșca, head of the Cybercrime Center of the National Investigation Inspectorate (INI), Moldova's EU accession agenda for 2024-2027 on online child protection includes the adoption of a comprehensive set of EU directives and regulations aimed at strengthening the legal framework, infrastructure, capacities, and measures to prevent and combat online child sexual abuse and exploitation.

"At the same time, the authorities are working on a bill to prevent and combat the sexual exploitation and abuse of children, which will include a definition of child sexual abuse materials. The new regulations will cover realistic images of a child engaged in explicit sexual behavior or images of a child's sexual organs, primarily for sexual purposes. Although the new regulations in the draft law will attempt to cover such 'realistic' images created using specialized technologies, a major challenge is that not all of these images appear real. Some may be stylized, abstract, or close to realism, but without depicting a real child, which complicates the process of detection, classification, and the application of legal sanctions," says Iurie Roșca.

It is worth mentioning that on April 25, 2023, Moldova joined INHOPE, the International Association of Internet Hotlines, a global network that brings together 52 hotline services for reporting child sexual abuse materials worldwide. The national reporting mechanism for child sexual abuse materials, www.siguronline.md, is implemented by the International Center "La Strada" in partnership with the General Police Inspectorate of Moldova and was developed with the support of the U.S. Embassy in Chișinău. Since its launch, the SigurOnline Hotline platform has received 11,875 reports of child sexual abuse materials. In total, 22,622 illegal materials have been removed, 98% of which contained images (photos and videos) depicting girls. Nine out of ten children depicted were under the age of 10.

In 2015, the Council of Europe's Committee of Ministers designated November 18 as the European Day for the Protection of Children against Sexual Exploitation and Sexual Abuse. The day aims to underline that child sexual exploitation and abuse remain a tragic reality, and that parents, teachers, civil society, and authorities must take urgent measures to keep children safe and protected.