In this video I discuss how generative AI technology has grown far past the government's ability to effectively control it, and how current legislative measures could lead to innocent people being jailed.

  • limitedduck@awful.systems
    1 year ago

Nobody is protecting digital children, and it’s almost always disingenuous when this argument is claimed to be made. The effort is to stop the normalization of the sexualization of children. Lolicon is exclusively about romancing or sexualizing children. Deluded adults who think what happens in lolicon material is OK are potential risks to real children. Allowing such a risk to children for the pleasure of these adults is absurd.

    • Vendetta9076@sh.itjust.works
      1 year ago

Fair enough. Imo lolicon is disgusting. And I’m not making an argument in bad faith; I just see how much general society fails at protecting children and would rather see any effort spent cracking down on lolicon used to help real children instead.

      • limitedduck@awful.systems
        1 year ago

I understand what you’re saying, but fighting against lolicon doesn’t necessarily take away from the fight against real CSAM. The reality is that serious, far-reaching, and ultimately human issues like the exploitation of children are complex, and addressing them requires effort on multiple fronts to be effective.

      • limitedduck@awful.systems
        1 year ago
1. The number of people warped by COD or lolicon is not 100%, but it’s certainly not 0%.
2. It sounds like you haven’t actually played COD, because the game is about WARFARE, not domestic terrorism. Maybe ask people who joined the US military how inspired they were by the game.