• 0 Posts
  • 85 Comments
Joined 2 years ago
Cake day: June 13th, 2023

  • I believe that images are important to investigation because they help identify the children being abused. When genuine material is mixed in with a bunch of AI pedophile stuff, it serves to obfuscate that avenue of investigation and hampers those efforts, which are 100% more important than anyone’s need to get off to pedophilic AI imagery.

    Online investigation in general has been a successful avenue in the recent past.

    If there were a chance of saving even one child, but it meant that no one could see AI images of sexualized children, then those would be completely acceptable terms to me.

    I would hold that there’s zero downside to outlawing the production of AI CSAM. There’s no indication that letting pedophiles indulge in “safe” forms of pedophilic activity stops them from abusing children. It’s not a form of speech or expression with any value. If we as a society are going to say we’re against the abuse of children, then that needs to include being against the cultivation and networking of abusive culture and people. I see no real slippery slope in this regard.




  • It is better than right-wing lunatics, but this will likely result in smaller progressive channels like Rational National, Majority Report, Humanist Report etc. being sidelined even more. And these are not conspiracy channels, they provide really important context that’s often left out by “authoritative” media like CNN and MSNBC.

    On its face it sounds like a good idea, but I just don’t trust Google to be the one to decide what’s authoritative and what needs to be weighted down. We’ve already learned that they have perverse incentives in that regard, given the way they’ve remorselessly promoted extreme right-wing content for years because it made them a quick buck. For them to have a sudden change of heart and start talking about getting the facts straight is a bit sus to me.