

Generative AI CSAM is CSAM

03-11-2024

Generative AI (GAI) has taken the world by storm. While it is undeniable that this innovative technology has benefits, the National Center for Missing & Exploited Children (NCMEC) is deeply concerned about the dangers that GAI poses to the global fight for child safety. 

In 2023, NCMEC’s CyberTipline received 4,700 reports related to Child Sexual Abuse Material (CSAM) or sexually exploitative content involving GAI technology (GAI CSAM). GAI CSAM portrays computer-generated children in graphic sexual acts and can be generated on demand by users of certain GAI platforms. GAI can also be used to create sexually explicit deepfake images and videos from an innocent photograph of a real child.


Real prompts used to create GAI CSAM.

As if the creation of this imagery wasn’t terrible enough, NCMEC has also received reports in which bad actors have used this illegal GAI content to extort a child or their family for financial gain. Furthermore, people who use the technology to create this material have defended themselves with arguments like, “At least I didn’t hurt a real child” and “It’s not actually a child...”

GAI CSAM is CSAM. The creation and circulation of GAI CSAM is harmful and illegal. Even images that do not depict a real child strain law enforcement resources and impede the identification of real child victims. For the children seen in deepfakes, and for their families, it is devastating. We must continue to support children who are victims of explicit imagery online regardless of how that imagery was created, ensure that laws adequately protect child victims, and implement regulations that require GAI platforms to build child safety by design into these tools. Protecting children from the harm of GAI CSAM also requires education and guidance from trusted adults. With this relatively new technology, we have an opportunity to guide young people so they learn to use GAI safely and understand the dangerous implications of misusing it to create sexually explicit or nude images of other minors.

It is essential that federal and state laws be updated to clarify that GAI CSAM is illegal and that children victimized by sexually exploitative and nude images created with GAI technology have civil remedies to protect themselves from further harm. Additionally, legislation and regulation are needed to ensure that GAI technology is not trained on child sexual exploitation content, is taught not to create such content, and that GAI platforms are required to detect, report, and remove attempts to create child sexual exploitation content and are held responsible when this content is created with their tools.

The ethical and legal conversations around the governance of GAI technology and its ability to generate CSAM are just beginning. We call on GAI technology creators, legislators, and child-serving professionals to come together and find a way to prioritize child safety as this innovative technology continues to evolve.

On March 12, John Shehan, NCMEC's Senior Vice President, Exploited Children Division & International Engagement, testified before the United States House Committee on Oversight and Accountability Subcommittee on Cybersecurity, Information Technology, and Government Innovation to discuss the trends that NCMEC is seeing. Read his full testimony, “Addressing Real Harm Done by Deepfakes,” here.