Teen boys use AI to make fake nudes of classmates, sparking police probe
Parents told the high school “believed” the deepfake nudes were deleted.

  • Fades@lemmy.world · 1 year ago

    So do yearbooks and any other kinds of photos that depict children, for that matter

    You can’t keep moving the goalposts. By your logic, young people should never date or take photos together, because it could enable pedophiles somewhere, somehow

    These are children whose brains are still developing. They are discovering themselves, and you want to label them pedophiles forever because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that’s the real threat)

    Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims

    What a pathetic, brain-dead stance you are defending

    • eatthecake@lemmy.world · 1 year ago

      Abuse and bullying of their classmates is just ‘discovering themselves’? Discovering that they’re psychopathic little misogynists, I guess. Their ‘spanking material’ was created in order to demean and humiliate their victims. There’s plenty of porn online and absolutely no need for them to do this. If you actually wanted to help the victims, you would not be trivialising and excusing this behaviour as ‘being horny about classmates’.

      • wildginger@lemmy.myserv.one · 1 year ago

        And an AI image with a face photoshopped over it isn’t a photo of a child.

        And a teen being sexually interested in other teens isn’t a pedophile.