The US Department of Justice (DOJ) made headlines last week by arresting a Wisconsin man for allegedly creating and distributing AI-generated child sexual abuse material (CSAM). The landmark case appears to be the first federal prosecution of its kind involving purely AI-generated CSAM, and it could set a significant judicial precedent.
The DOJ aims to establish that such exploitative content is still illegal, even if no actual children were involved in its creation. “Put simply, CSAM generated by AI is still CSAM,” stated Deputy Attorney General Lisa Monaco in a press release.
The accused, 42-year-old software engineer Steven Anderegg of Holmen, Wisconsin, allegedly used a modified version of the open-source AI image generator Stable Diffusion to produce the images, then reportedly used them in an attempt to lure an underage boy into sexual situations. That aspect of the case is expected to be central at trial, where Anderegg faces four counts covering production, distribution, and possession of obscene visual depictions of minors, plus transfer of obscene material to a minor under 16.
According to the DOJ’s charges, Anderegg created images depicting “nude or partially clothed minors lasciviously displaying or touching their genitals or engaging in sexual intercourse with men.” He allegedly used specific text prompts, including negative prompts (instructions that tell the AI model what to avoid producing), to generate the explicit images.
While cloud-based image generators such as Midjourney and DALL-E 3 include safeguards against this kind of misuse, Anderegg allegedly used Stable Diffusion 1.5, a variant with far fewer restrictions. Stability AI told Ars Technica that this version was produced by Runway ML.
The DOJ also revealed that Anderegg communicated online with a 15-year-old boy and described how he used the AI model to create the images. He allegedly sent the teen direct messages on Instagram, including several AI-generated images of “minors lasciviously displaying their genitals.” Instagram reported the images to the National Center for Missing and Exploited Children (NCMEC), which alerted law enforcement.
Anderegg could face five to 70 years in prison if convicted on all four counts. He is in federal custody, awaiting a hearing scheduled for May 22.
The case challenges the assumption that CSAM is illegal only when real children are exploited in its creation. Even though AI-generated CSAM involves no real human subjects, it can still normalize and encourage the production and distribution of such material, potentially fueling further predatory behavior. The DOJ’s stance is meant to remove any ambiguity on that point as AI technology continues to evolve and become more accessible.
“Technology may change, but our commitment to protecting children will not,” Deputy AG Monaco emphasized. “The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material—no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children.”