A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Possession of CSAM that's 20 years old (i.e. the subject is now an adult) or even 100 years old (i.e. the subject is likely deceased) is still illegal. You don't have to pay for it or create it; mere possession is the crime. Yeah, they'll find a way to prosecute for images of non-existent children.