- Experts report that AI image generators are being used to create and share child sexual abuse material (CSAM).
- Open-source image generators, such as Stability AI’s Stable Diffusion model, are being exploited because their built-in safety precautions can be easily removed.
- Existing detection systems struggle to flag these new AI-generated images, making it harder for law enforcement to identify and help victims.