this post was submitted on 09 Jul 2023
2215 points (97.5% liked)

Fediverse

The best part of the fediverse is that anyone can run their own server. The downside of this is that anyone can easily create hordes of fake accounts, as I will now demonstrate.

Fighting fake accounts is hard, and most implementations do not currently have an effective way of filtering them out. I'm sure the developers will step in if this becomes a bigger problem. Until then, remember that votes are just a number.

[–] sparr@lemmy.world 89 points 1 year ago (6 children)

Web of trust is the solution. Show me vote totals that only count people I trust, 90% of the people they trust, 81% of the people they trust, and so on. (The 0.9 multiplier should be configurable if possible!)
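
A minimal sketch of that weighting scheme in Python. The `trust_graph` structure, the function names, and the 0.9 decay default are all illustrative assumptions, not part of any existing fediverse API:

```python
from collections import deque

def compute_trust_weights(trust_graph, me, decay=0.9):
    """Breadth-first walk of who-trusts-whom, giving each account
    decay**distance: 1.0 for me, 0.9 for people I trust, 0.81 for
    people they trust, and so on."""
    weights = {me: 1.0}
    queue = deque([me])
    while queue:
        user = queue.popleft()
        for trusted in trust_graph.get(user, ()):
            if trusted not in weights:  # keep the shortest-path weight
                weights[trusted] = weights[user] * decay
                queue.append(trusted)
    return weights

def weighted_total(votes, weights):
    """Sum +1/-1 votes, counting only accounts reachable in my web of
    trust and scaling each vote by that account's weight."""
    return sum(v * weights.get(account, 0.0) for account, v in votes.items())

# I trust alice, alice trusts bob; a horde of sockpuppets upvotes too,
# but none of them are in my web of trust, so they count for nothing.
graph = {"me": {"alice"}, "alice": {"bob"}}
votes = {"alice": +1, "bob": +1, **{f"sock{i}": +1 for i in range(1000)}}
weights = compute_trust_weights(graph, "me", decay=0.9)
print(weighted_total(votes, weights))  # ~1.71 instead of 1002
```

Using the shortest trust path keeps the math simple; a fuller design would also have to decide how to combine multiple trust paths and whether to support explicit distrust.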

[–] interdimensionalmeme@lemmy.ml 11 points 1 year ago

Your client has to compute this from the raw data, not the server; otherwise it's just your server manipulating what you see and think.

[–] nekat_emanresu@lemmy.ml 4 points 1 year ago* (last edited 1 year ago)

Love that type of solution.

I've been thinking about having an admin vote on example posts to define the policy, then scoring users against those votes, treating the high scorers as user copies of the admin's spirit of moderation, and finally building systems that use those users for automoderation.

e.g. I vote yes, no, yes on three example posts. I then run a script that checks which of my users voted on all three, and the ones whose votes match mine to the threshold I define (say, 100% matching) get counted as "matching my spirit of moderation". If a spirit-of-moderation user downvotes or reports something, it can be auto-flagged into an admin console for me to review quickly instead of my sifting through user complaints. And if things get critically spicy, I can promote those users to emergency mods, or automate their reports so that if a spirit user and a random user both report a post, it gets auto-removed.
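
A rough sketch of that matching-and-escalation logic; all the names, the yes/no vote format, and the thresholds are made up to illustrate the idea above, not taken from any real moderation tool:

```python
# The admin's yes/no votes on example posts define the policy.
admin_votes = {"post1": "yes", "post2": "no", "post3": "yes"}

def spirit_users(user_votes, admin_votes):
    """user_votes maps username -> {post_id: vote}. Return users who
    voted on every example post and match the admin's votes exactly
    (the 100%-matching threshold from the comment above)."""
    return [
        user
        for user, votes in user_votes.items()
        if all(votes.get(post) == vote for post, vote in admin_votes.items())
    ]

def should_auto_remove(reporters, spirit):
    """Escalation rule: auto-remove when at least one spirit user and
    at least one ordinary user both report the same item."""
    return any(r in spirit for r in reporters) and any(
        r not in spirit for r in reporters
    )

user_votes = {
    "ana": {"post1": "yes", "post2": "no", "post3": "yes"},   # matches
    "bob": {"post1": "yes", "post2": "yes", "post3": "yes"},  # disagrees on post2
    "cam": {"post1": "yes", "post2": "no"},                   # skipped post3
}
spirit = set(spirit_users(user_votes, admin_votes))
print(spirit)                                       # {'ana'}
print(should_auto_remove({"ana", "dave"}, spirit))  # True
```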

[–] interdimensionalmeme@lemmy.ml 2 points 1 year ago

For each vote, read the voter's post content, vote history, and account age.

This should happen in the client and be easily controllable by the user, who should also be able to investigate why a particular post or comment was selected by the local content discovery algorithm. That way you can quickly find fraudulent accounts and block them.
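
For illustration, a crude client-side heuristic along those lines; every field name and threshold here is invented, and a real client would tune them against actual data:

```python
from datetime import datetime, timedelta, timezone

def suspicion_score(account):
    """Crude per-account heuristic: young accounts that never post and
    vote almost entirely one way look more like sockpuppets."""
    score = 0.0
    age = datetime.now(timezone.utc) - account["created"]
    if age.days < 7:
        score += 0.4  # brand-new account
    if account["post_count"] == 0:
        score += 0.3  # only votes, never posts
    total = account["upvotes"] + account["downvotes"]
    if total > 50 and max(account["upvotes"], account["downvotes"]) / total > 0.95:
        score += 0.3  # votes almost entirely in one direction
    return score

bot = {
    "created": datetime.now(timezone.utc) - timedelta(days=2),
    "post_count": 0,
    "upvotes": 300,
    "downvotes": 2,
}
print(suspicion_score(bot))  # ~1.0: a candidate to surface for blocking
```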

And these public, user-led moderation actions then go on to inform the content discovery algorithms of other users, until we have consensus-driven, user-led content discovery and moderation.

And just like that, we eliminate the need for the shadowy humans of the moderator priesthood to play human spam filter / human thought manipulator.

[–] rDrDr@lemmy.world 1 point 1 year ago

This was a great feature of Reddit Enhancement Suite.