We can’t believe we have to say this, but it has now come up enough times that we find it necessary to explain: AI-CSAM is in fact actively harmful to real people, including children, and is in no way a “victimless crime.” AI’s ability to produce a functionally infinite supply of images, powered by datasets containing millions of photographs of real people, including children, as well as real CSAM, enshrines and perpetuates that abuse in a way that was previously impossible.
Cole, S., & Maiberg, E. (2024, May 28). AI-generated child sexual abuse material is not a ‘victimless crime.’ 404 Media. https://www.404media.co/ai-generated-child-sexual-abuse-material-is-not-a-victimless-crime/