The number of child sexual abuse materials hosted online in the Republic of Moldova is on the rise.

The Number of Reports Concerning the Hosting of Child Sexual Abuse Material Online in the Republic of Moldova Continues to Rise

Throughout 2024, the SigurOnline Hotline, the national reporting service for child sexual abuse material (CSAM), recorded 11,894 reports, marking a nearly fivefold increase compared to 2023. Additionally, last year, 122 criminal cases were registered involving sexual offences against children committed through the use of information technologies, compared to 72 cases in 2023.

Iurie Roșca, Head of the Cybercrime Investigation Centre of the National Investigation Inspectorate under the General Police Inspectorate (IGP), believes that the Republic of Moldova remains an attractive destination for the hosting of illegal online content, including CSAM. This situation is facilitated by the existence of a well-developed and accessible digital infrastructure. According to Roșca, in 2024 Moldova ranked 15th globally in terms of the number of such materials hosted, with 19,593 files identified on servers located in the country.

Elena Botezatu, Executive Director of the International Center La Strada, noted that 2024 was a year of both challenges and progress. "There is an unseen reality online, a reality of abuse. The Internet remains a risky space for children, but also one where, through joint efforts, lives can be saved. Every image taken down is a step toward safety. Every report submitted is a refusal to normalize abuse. We need clearer legislation, ongoing engagement, and stronger interinstitutional collaboration to ensure that child protection in the digital space becomes a daily reality, not just an ideal," Botezatu emphasized.

Last year, the reporting service issued 10,949 notifications to hosting providers requesting the removal of CSAM. Following these notices, hosting providers blocked access to or removed 29,620 files containing abusive content. Over half of the removed materials consisted of entire web pages displaying anywhere from a few to several hundred media files (photos, screenshots, videos, etc.).

According to the 2024 annual report of the SigurOnline Hotline, 68% of all reported materials involved girls—accounting for 7,623 files. Another 33 materials involved boys, while 147 materials featured both boys and girls shown in sexual activities or in contexts indicative of sexual abuse.

This trend is consistent with 2023 data and highlights a persistent and concerning typology in online CSAM: girls remain disproportionately vulnerable to this form of exploitation.

In 2024, 7,336 reported files involved children aged 3 to 13, while 4 cases involved children under the age of 2. Moreover, in over 60% of reports, the children depicted were between 7 and 10 years old, making this the most frequently reported age group in CSAM.

The global expansion of artificial intelligence and its use in the commission of sexual abuse against children has reached a new dimension. AI-generated CSAM has advanced to the point where analysts at reporting services are now encountering highly realistic videos that are extremely difficult to distinguish from real footage.

The report notes that the Hotline received reports of 62 links containing virtual material, of which 87% consisted of comics and 13% of AI-generated images. These included over 10,000 images and videos created with artificial intelligence or anime-style systems. Under current national legislation, these materials are not classified as illegal and therefore cannot be removed from online platforms.

At the same time, 69% of reported materials were identified as self-generated content, a highly alarming trend that represents a significant risk to children. These self-produced images or videos often involve children who have been manipulated, coerced, or blackmailed into creating and sharing explicit content. This not only endangers their current safety and integrity but also increases the risk of further exploitation.

During the reporting period, one-third of reports concerned illegal content hosted on commercial websites. These platforms are used to generate revenue by selling access to CSAM through download fees or subscription-based models offering access to illegal collections.

The full report is available HERE, and a summary of the key data in figures can be accessed HERE.

It is worth noting that www.siguronline.md is an international service that operates according to the Hotline standards for reporting CSAM. The SigurOnline Hotline is accredited by INHOPE, the international network uniting 57 hotlines from 52 countries worldwide. The service is managed by the International Center La Strada in partnership with the General Police Inspectorate of the Republic of Moldova and private sector ICT companies. Its mission includes the identification, referral, notification, removal, and prevention of the redistribution of CSAM online.

In 2024, the Hotline’s operation was made possible through the support of UNICEF Moldova, the International Organization for Migration (IOM), and the Embassy of the United States in Chișinău. These partners have contributed to strengthening efforts to prevent and combat the sexual exploitation of children online, by enhancing institutional capacity, supporting professionals, and facilitating interinstitutional cooperation.