I was recently thinking about how amazing it is that with this decentralized community we would have no censorship from big corporations, and then I asked myself: what about illegal content? The kind of content that really should not be shared? For example, what if someone creates a Lemmy instance and starts building a community around CP? Or human trafficking? How do we deal with it? I know that instances can choose which other instances they federate with, so if most popular instances blacklist that “illegal” instance, its content wouldn’t be easily visible — but it would still be in the Fediverse. Also, would all popular instances have to be quick to blacklist these “illegal” instances? Isn’t that a little too difficult? If we go the other way, where they create a whitelist instead, wouldn’t that harm small legitimate instances? Is there a plan to fight off illegal content?
If it’s on the open internet, you report them to law enforcement. But I’d bet that those intent on hosting those kinds of materials have already started their own instances on the dark web, and sadly there isn’t much we can do in that case. The only way to deal with them seems to be law enforcement’s approach of trapping those predators by posing as clients — but then again, that’s their job, not yours. What you can do is: (1) defederate, (2) warn other instance admins, and (3) report to the police.
As usual, it’s the same as email. There will need to be various sorts of spam filtering developed in order to keep the platform usable. In the meantime — if you see it, report it and delete it.
Suppose you open up your email and you see that you’ve received a piece of spam that contains CSAM (CP). You have not committed a crime — but you also mustn’t keep it. So you report the spam to your email provider, and you delete it from your mailbox. If you’re very diligent, maybe you also report it to NCMEC.
Suppose you run an email server. You’re aware of the existence of spam (alas!) and you do your best to block spam using various technologies ranging from DNSBLs to ML classifiers. If someone on the Internet sends spam containing CSAM to a user on your server, you didn’t send it; they did. The sender committed a crime. Your spam filters just didn’t catch that particular instance. So when your user reports it to you, you improve your spam filters. And you delete it.
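For concreteness, a DNSBL lookup works by reversing the IP’s octets and querying the result as a hostname under the blocklist’s zone: a listed IP resolves to an address (conventionally in 127.0.0.0/8), an unlisted one returns NXDOMAIN. A minimal sketch, assuming a plain A-record query; the zone name shown is Spamhaus’s public zone, used here purely as illustration:

```python
import socket

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL lookup name: reverse the IPv4 octets and
    append the blocklist zone, e.g. 1.2.3.4 -> 4.3.2.1.<zone>."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """An IP is listed if the DNSBL name resolves; a failed lookup
    (NXDOMAIN) means it is not on the blocklist."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False
```

A mail server would run `is_listed()` on the connecting client’s IP at SMTP time and reject or greylist listed senders before accepting any message body.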
Suppose you run an email server. Your spam filters might include a reputation score for other email servers. When your filters notice that a large fraction of the messages from a particular server are spam, they switch to automatically block all mail from that server. Then even if that server tries to send spam to your users, the offending messages never even hit your server’s disk.
Expect that as this platform matures, it will need many of the same sorts of spam-mitigation technology that email and other federated services have used.
I’m repeating “and you delete it” once again because that’s important. You mustn’t retain copies of illegal files, even as training data for your spam classifiers. The big email providers and social media companies go to a great deal of effort to keep data about CSAM files without keeping the actual files.
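In practice, “data about the files” means fingerprints. A minimal sketch of hash-based matching, assuming a hypothetical set of known-bad digests supplied by a clearinghouse; real systems use perceptual hashes such as PhotoDNA so that matches survive re-encoding, but the principle is the same — retain fingerprints, never files:

```python
import hashlib

# Hypothetical digest set from a clearinghouse. Only hex digests are
# stored here; the placeholder below is just SHA-256 of empty input.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_hex(data: bytes) -> str:
    """Compute the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_bad(data: bytes) -> bool:
    """Exact-match check against the stored digests. The file itself
    can be deleted immediately; only its digest was ever compared."""
    return sha256_hex(data) in KNOWN_BAD_HASHES
```

An upload handler would hash incoming media, check the digest, reject and report on a match, and in all cases avoid archiving the raw bytes of flagged content.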
Same with the Internet. There aren’t any openly operating human trafficking or child porn sites online. Most hosting companies won’t host them, most DDoS-protection companies won’t shield them, and if they try self-hosting, law enforcement in their jurisdiction may kick down their doors and raid them. If somehow none of that happens, countries will block them anyway, and failing that, communities can defederate from them.
Platforms like Discord are more of a risk: they’re much bigger, so moderation resources are spread thinner, and the content is private.
Legality is also highly subjective — it’s a big gray area for sure.
Yeah, I totally understand that. As a personal example: I’m against the civilian use of firearms, but if someone is using the Fediverse to sell them, who am I to say that’s illegal? It might be illegal where I live but legal where they live; we can’t really be judges on these kinds of topics. I used the term “illegal” because I couldn’t find a better word for the kind of content that (hopefully) 99.99% of people would absolutely NOT be okay with seeing on their homepage, like the two examples I gave. What is the plan for that?
Would they be willing to sell them to you, though? I keep getting local drug dealers trying to interact with me on Instagram. I don’t see how it would be worse on the Fediverse. Even some reddit subs have drug dealers advertising
Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.
You should focus on making sure that your instance’s communities stay within the legality of your country, as well as flag and deal with any illegal behavior by your users.
For anything remote you have two options:

- Block the whole instance (not recommended unless it’s clear that the whole instance is dedicated to something that’s illegal in your country, or it hosts something incredibly disturbing)
- Click “Remove” on the main page of a remote community. This will remove the remote community and make it inaccessible to local users, but keep you federated with their instance.
You cannot control what other servers do. There will be servers out there hosting illegal stuff. But that’s not something you or I need to fix — that’s where law enforcement needs to be involved. The only thing you can do is block. If it’s something serious like CP or human trafficking, grab any logs you might have from them, report to the authorities, purge the content from your database, and block the instance.
> Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.

True for Lemmy. False for kbin and Mastodon, which create local copies.
Dear /kbin admins and users:
> Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.

True for Lemmy, false for /kbin. Take an example meme post from lemmy.ml: the image has been fetched and is present in kbin.social’s database.