More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous hands-off stance on moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie later followed up with a statement similar to today’s, saying “we don’t like or condone bigotry in any form.”

  • xkforce@lemmy.world

If someone says it is raining outside, the newspaper’s job is to actually check whether it is raining outside, NOT to say it both is and is not raining and let their “readers decide.”

Should a newspaper not talk about something because some readers don’t agree with it?

    You are arguing that newspapers should discuss NAZI ideals as if they are as valid as any other. No one decent agrees with you.

    • gian @lemmy.grys.it

      You are arguing that newspapers should discuss NAZI ideals as if they are as valid as any other. No one decent agrees with you.

      Nope, I am arguing that if something is not illegal it is not up to the platform to censor it.

If those 200 authors asked a judge to order Substack to remove the posts, then fine.

If you decide that today it is good for a platform to censor something (and I agree that Nazis are not a nice thing to even consider discussing), then tomorrow you cannot protest when a platform removes something you consider good.

Like Meta removing all the pro-Palestinian posts/propaganda: is it acceptable that Meta gets to decide that, even if the content is not illegal?

Free speech is absolute, and it includes even what we hate.

      • xkforce@lemmy.world

        Free speech IS NOT freedom from social consequences. And one of those social consequences is that people are allowed to tell you to take a hike.

        • gian @lemmy.grys.it

And I agree. But it seems that you still don’t understand how dangerous it is to go after the platform instead of the authors of the messages.

But let’s suppose it is correct to go after the platform, and this time the offending content is removed. Fine, a good thing.

Next month, 174 authors ask to remove everything about the right to have an abortion because they are offended by it and think it is wrong (and in some places it is even illegal). What do you think should happen?

          • xkforce@lemmy.world

You are arguing as if people want the government to intervene, and that IS NOT what people are saying. People are allowed to not want an entire company to be a fascist bar.