this post was submitted on 25 Jul 2023
Fediverse
you are viewing a single comment's thread
Putting the blame on Microsoft or the IWF misses the point.
People were responsible for moderating what showed up on their forums or servers for years before these tools existed, and they have kept doing so since. Neither the tool nor its absence is responsible for child porn getting posted to Fediverse instances. If those shards won't take action against CSAM now, what good will the tool do? We can't run it here and have it delete content from someone else's box.
While those tools would make enforcement significantly easier, the fact that enforcement isn't happening consistently across instances isn't something we can point at Microsoft and call their fault.