‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • TrickDacy@lemmy.world
    1 year ago

    Am I violating privacy by picturing women naked?

Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat-out dumb.

I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or generating an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not appropriate to share.

    Can you actually stop clutching pearls for a moment to think this through a little better?

    • originalfrozenbanana@lemm.ee
      1 year ago

Sexualizing strangers isn’t a right afforded to you by society, and it isn’t moral. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by generating an image of them. That’s both a moral and a legal question, and both have an answer.

Your comment is a self-report.