A Wisconsin software engineer was arrested Monday for allegedly creating and distributing hundreds of AI-generated images of child sexual abuse material (CSAM).
Court documents describe Steven Anderegg as “extremely technologically savvy,” with a background in computer science and “decades of software engineering experience.” Anderegg, 42, is accused of sending AI-generated images of nude minors to a 15-year-old boy via Instagram DM. Anderegg came onto law enforcement’s radar after the National Center for Missing and Exploited Children flagged the messages, which he allegedly sent in October 2023.
According to information Instagram provided to law enforcement, Anderegg posted an Instagram story in 2023 “consisting of a realistic GenAI image of minors wearing BDSM-themed leather clothing” and encouraged others to check out what they were missing on Telegram. In private messages with other Instagram users, Anderegg allegedly “discussed his desire to have sex with prepubescent boys” and told one Instagram user that he had “tons” of other AI-generated CSAM images on his Telegram.
Anderegg allegedly continued sending these images to another Instagram user even after learning that he was only 15 years old. “When this minor made his age known, the defendant did not rebuff him or question him further. Instead, he wasted no time describing to this minor how GenAI creates sexually explicit images and sent the child personalized content,” charging documents allege.
When law enforcement searched Anderegg’s computer, they found more than 13,000 images, “with hundreds, if not thousands, of these images depicting nude or semi-clothed prepubescent minors,” according to prosecutors. Charging documents say Anderegg made the images with the Stable Diffusion text-to-image model, a product created by Stability AI, and used “extremely specific and explicit prompts to create these images.” Anderegg also allegedly used “negative prompts” to avoid generating images depicting adults, and used third-party Stable Diffusion add-ons that “specialized in the production of genitalia.”
Last month, several major tech companies, including Google, Meta, OpenAI, Microsoft, and Amazon, said they would review their AI training data for CSAM. The companies committed to a new set of principles that include “stress testing” models to ensure they do not create CSAM. Stability AI also signed on to the principles.
According to prosecutors, this is not the first time Anderegg has come into contact with law enforcement over alleged possession of CSAM, in that case through a peer-to-peer network. In 2020, someone using the internet at Anderegg’s home in Wisconsin attempted to download multiple known CSAM files, prosecutors allege. Law enforcement searched his home in 2020, and Anderegg admitted to having a peer-to-peer network on his computer and to frequently resetting his modem, but he was not charged.
In a brief arguing for Anderegg’s detention, the government noted that he has worked as a software engineer for more than 20 years, and that his résumé includes a recent job at a startup where he used his “excellent technical understanding in formulating AI models.”
If convicted, Anderegg faces up to 70 years in prison, though prosecutors say the “recommended sentencing range may be as high as life in prison.”