This Wednesday, the European Commission took a significant step in addressing the pressing issue of child sexual abuse by adopting a proposal to update the EU's criminal law rules. The proposed amendments target the sexual abuse and sexual exploitation of minors, aiming to criminalize artificial intelligence (AI)-generated imagery and deepfakes depicting child sexual abuse (CSA). The legislative move is part of a comprehensive package focused on preventing CSA, raising awareness of online risks, and making it easier for victims to report crimes.
Key Highlights of the Proposal:
- Criminalizing AI-Generated Imagery and Deepfakes: The legislation explicitly prohibits and penalizes the use of AI to produce material depicting child sexual abuse.
- Timely Removal of Content: Online platforms will be required to remove child sexual abuse material within one hour of being notified by law enforcement.
- New Offenses: The proposal introduces criminal offenses for the possession and exchange of “pedophile manuals” and for the live-streaming of child sexual abuse.
- Extended Reporting Period: The update aims to establish a longer reporting period for victims, allowing them more time to report sexual abuse and take legal action against perpetrators.
- Victim Rights: Victims will have the right to financial compensation to address the long-term harm caused by child sexual abuse.
- Mandatory Reporting: Professionals working closely with children will be mandated to report crimes, addressing a significant challenge in efforts to combat child sexual abuse.
Rationale for the Proposal:
The EU's current rules on this matter date back to 2011; rapid technological advances and children's increased online presence have made an update necessary.
The prevalence of online sexual offenses against children is a growing concern. According to the European Union, one in five children suffers some form of sexual violence, online or offline. The Commission states that the Internet has significantly aggravated the spread of child sexual abuse, enabling perpetrators to gather online and instantly share videos and images of serious sexual violence against children, often very young children.
The proposal aims to reinforce prosecution and prevention. It will establish a longer period during which victims will be able to report the sexual abuse they have suffered and take action against the perpetrator of the crime. The update will also grant victims the right to financial compensation to address the long-term harm caused by child sexual abuse.
The legislation will require online platforms to remove child sexual abuse material within one hour of being notified by law enforcement. Additionally, the use of AI to produce such material will be explicitly prohibited and punishable by law.
The proposal also seeks to criminalize the possession and exchange of “pedophile manuals” and create a new criminal offense for live-streaming child sexual abuse.
Reporting a crime will also become mandatory, at least for professionals who work closely with children, in order to address a major challenge in efforts to end child sexual abuse.
The European Commission has emphasized the importance of these measures in protecting children and preventing the spread of harmful content online. The final form of the proposals will be decided by the European Parliament and the Council of the European Union.
Global Impact:
Beyond the EU, the issue of child sexual abuse material (CSAM) has global ramifications. In the United States, a national survey revealed alarming statistics, putting the lifetime prevalence of online child sexual abuse at 15.6%. The Internet Watch Foundation reported a 35% increase in online material containing child sexual abuse between 2016 and 2017.
Reports of suspected child sexual abuse material online have also risen sharply, jumping by roughly 30% within a year on average. The US alone accounted for 30% of the global total of CSAM URLs at the end of 2021.
Experts emphasize the devastating impact of CSAM on survivors, with girls often experiencing suicidal thoughts after exposure to such material. The severity of abuse is reflected in the fact that 84% of the images contain explicit sexual activity.
It’s crucial to acknowledge that sexual abuse is one of the most under-reported crimes globally, with a general reluctance to discuss it. This reluctance means that the statistics likely only portray a fraction of the true extent of this epidemic.
In light of these alarming statistics, the rise of CSAM online is undeniably a global issue that demands urgent and comprehensive action. The European Union's proposed rules, which would criminalize AI-generated child sexual abuse material and strengthen the prosecution of child sexual abuse, are pivotal steps in addressing this pressing concern. If implemented effectively, such measures could play a crucial role in combating the proliferation of CSAM and protecting children from online harm.
Furthermore, recent legislative initiatives, such as the bill passed by California lawmakers, underscore the commitment to combating child sexual abuse material on social media. The bill imposes significant fines on social media companies that fail to promptly remove such material, emphasizing the importance of proactive measures and reporting mechanisms to identify and remove harmful content.