I’m considering migrating away from Lemmy.world on account of downtime and the recent blocking of piracy communities. However, I am quite fond of Lemmy.world’s other defederation choices, e.g. blocking instances over nazi nonsense and CSAM.
Is there an easy way of comparing these blocklists between instances to better choose where to migrate to?
First check the software version. If they’re at least fairly up to date, check the instances page. The blocks are in the bottom half.
Instances running lemmy-ui normally publish their list of federated and blocked instances: https://lemmy.ml/instances
There might be some scripts or websites out there comparing them.
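If you want to roll your own comparison, here’s a minimal sketch (not an official tool) that pulls the instance-level blocklists of two servers from the public /api/v3/federated_instances endpoint and diffs them. The response shape varies a bit between Lemmy versions (older ones return bare domain strings, newer ones return objects), so treat the parsing as an assumption and adjust for the instances you actually check; the two instance names are just examples.

```python
# Rough comparison of instance-level blocklists between two Lemmy servers.
# Uses the public /api/v3/federated_instances endpoint; the response shape
# differs across Lemmy versions, so both known formats are handled below.
import requests

def blocked_domains(instance: str) -> set[str]:
    """Return the set of domains that `instance` has defederated from."""
    resp = requests.get(f"https://{instance}/api/v3/federated_instances", timeout=30)
    resp.raise_for_status()
    blocked = resp.json()["federated_instances"]["blocked"]
    # Older versions return plain domain strings, newer ones return objects.
    return {b if isinstance(b, str) else b["domain"] for b in blocked}

if __name__ == "__main__":
    a, b = "lemmy.world", "lemmy.ml"  # example instances, swap in your own
    blocked_a, blocked_b = blocked_domains(a), blocked_domains(b)
    print("Blocked by both:", sorted(blocked_a & blocked_b))
    print(f"Only blocked by {a}:", sorted(blocked_a - blocked_b))
    print(f"Only blocked by {b}:", sorted(blocked_b - blocked_a))
```

Like the websites mentioned in the other replies, this only covers full instance-level defederation; community-level blocks (like the piracy ones) don’t show up in that endpoint.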
https://fba.ryona.agency/ is one website that can help: the top search box finds instances that have defederated from the one you enter, and the bottom one lists the instances that the one you enter has defederated from.
It only shows full instance-level defederation, not blocked communities, though.
💀
piracy, steamdeckpirates and some other one from dbzer0.
Here’s the relevant thread:
I don’t know of a way other than going through the instances and checking individually. However, most of the big instances I’ve looked at (and that I see regularly in usernames) block exactly this, so I think it’ll be fairly easy to find one that matches those criteria.
I don’t think anyone has created a resource that easily shows exactly what you are looking for - so you’re probably going to have to do your own research manually.
If you haven’t found it yet, this GitHub project tracks most known Lemmy instances and records some high-level metrics for them. Pay attention to the “BI” (Blocked Instances) column, which shows how many instances they are blocking, as well as the “BB” (Blocked By) column, which shows how many other instances block them. It’s worth pointing out that the “Users” column represents active monthly users, not total users.
Another question: is there really CSAM on Lemmy? I can’t believe that and haven’t seen it either. (If this means what I think it does, I’m glad. I’ve been in all sorts of corners of Lemmy, and so far everything seemed alright except for some nuts with radical political views.)
From what I’ve heard, at least. I know there’s unfortunately quite a lot of loli stuff based on the defederation lists I’ve seen. I’m happy to report that I’ve never had to see either here first-hand.
OP is claiming that the Lemmy.world defederation choices they agree with were driven by CSAM, which is unquestionably nonsense. Lemmy.world admins have made several in-depth posts explaining defederation decisions, and none of them had anything to do with CSAM. In some jurisdictions, it would likely be illegal to give such an explanation, as it would amount to creating a pointer to a source of CSAM that hasn’t yet been taken down. By and large, these things are reported directly to law enforcement and cleaned up quietly, without showing up in modlogs… and in many jurisdictions the law REQUIRES handling CSAM in precisely that fashion in order to prevent it from being archived before it’s taken down.
Is there a non-zero amount of CSAM in the Fediverse? Sadly yes. Once you achieve a certain scale, people do all the things… even the bad ones. This research paper (from Stanford, it’s reputable and doesn’t include or link to CSAM) discusses finding, in a sample of 320k Mastodon posts, over 100 verified samples of CSAM and something like 1k-3k likely adjacent posts (for example that use associated keywords). It’s pretty likely that somewhere on Lemmy there are a non-zero number of such posts, unfortunately. But moderators of all major instances are committed to taking appropriate steps to respond and prevent reoccurrence.
Additionally, blahaj.zone defederated from lemmynsfw over the adorableporn community. The lemmynsfw admins take reports of CSAM very seriously, and the blahaj admins stopped short of accusing them of hosting actual CSAM. But they claimed that models of verified age “looked too young” and that the community was courting pederasts. These claims were largely baseless, but there was a scuffle and some of the secondary and tertiary discussion threw around terms like CSAM loosely and incorrectly.
I think OP is probably hearing echoes of these kinds of discussions 3rd hand and just not paying attention to details. There’s certainly no well-known and widely federated CSAM communities, and all responsible admins would take immediate action if anything like that was found. CSAM doesn’t factor into public federation decisions, because sources of CSAM can’t be discussed publicly. Responding to it is part of moderation at scale though, and somewhere some lemmy admin has probably had to do so.
Idk. That ‘study’/article fails to recognize the consequences and ethics of the legal situation in Japan for example. And I think everything is a bit too vague to really claim to be scientific.
This topic sometimes makes me a bit angry/disappointed. On the one hand, I prefer my favorite places on the internet (e.g. the fediverse) not to be used for disgusting stuff and crime. On the other hand, politicians like Ursula von der Leyen, who now heads the European Commission, have been using exactly this subject for years (and in my eyes thus abusing the victims’ stories yet again) to advertise for 100% online surveillance, getting rid of end-to-end encryption, and storing massive amounts of data about everyone, just in case…
“Just think about the children…”
And this is just not the way to solve that issue. I don’t want to live in their 1984-society fantasies and there are better solutions around.
A second thing I find kind of alarming: the article mentions those automatic content-detection tools by Google and Microsoft. They are NOT available to the free world. I think if legislation really forces us to filter on upload… And it’s only big corporations that own the databases of CSAM… This is their way to easily get rid of the fediverse, and every platform built by and for the people.
I’m a bit disgusted. But this is why I’m interested in the subject.