- cross-posted to:
- technology@beehaw.org
More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”
The problem is that we basically know that's not how it works.
If you push them underground, the main result is fewer Nazis. Intentionally platforming them helps them maintain a facade of normalcy that makes it WAY easier to recruit people into their organizations and further radicalize them. Not to mention the simple amplification effect of having a platform at all.
The idea that underground Nazis would be a more distilled, pure, volatile form of Nazi SOUNDS theoretically sensible. But if that's your argument, the burden of proof is on you to demonstrate it actually happens. And even if it sometimes does, if there are only ten of them, it barely matters.
The simplest solution, to shut down the recruitment pipeline, is also the correct choice for a platform operator to make.
Thanks for putting it into words. I couldn’t quite put my finger on what specifically felt wrong about this reasoning but you’re on point.