A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Okay… So correct me if I’m wrong, but being abused as a child is like… one of the biggest predictors of becoming a pedophile. So like… Should we preemptively go after these people? You know… To protect the kids?
How about single parents who expose their kids to strangers when dating? That’s a massive vector for kids to be exposed to child abuse.
What on earth? Just don’t sexualize children or normalize sexualizing children. Denying pedophiles access to pedophilic imagery is not some complex moral quandary.
Why on earth am I getting so much pushback on this point, on Beehaw of all places…
Wondering the same thing.
Because they’re computer generated images not children.