Today I posted a picture of a stamp with an animal in it and they said the picture contained nudity and made me take it down, but I reported a photo of a guy with a fully visible swastika tattoo and they said that's fine.

I'd like to start a Lemmy community with photos of stuff that they refuse to remove called FacebookSaysItsFine.

[-] DJKJuicy@sh.itjust.works 23 points 1 year ago

I searched for a community named "FacebookSaysItsFine" and haven't yet seen anything.

Be the change you want to see in the world!

[-] BonesOfTheMoon@lemmy.world 10 points 1 year ago

I just worry people will post things like CSAM on it, but I think it would be very good.

[-] dustyData@lemmy.world 12 points 1 year ago

I would suggest the opposite. Perhaps a "Facebook doesn't allow this" community instead. Too much risk of attracting trolls and monsters.

That said, the FBI says that Facebook, Instagram, Twitter, et al. all contain a not-insignificant amount of CSAM at any given point in time. That fact just never gets reported by the press because the public sees them as normalized platforms. Only the fediverse gets that sort of negative attention in the press, because it's the disruptive outsider platform, even though, by both proportion and volume, almost all the other platforms have a worse problem with awful content that regularly flies under the radar because they are big corporations.

[-] ThePowerOfGeek@lemmy.world 8 points 1 year ago

It would definitely require some very active moderation and clearly-defined community rules. But it sounds like a great idea for a Lemmy community, if you have the time.

[-] thantik@lemmy.world 6 points 1 year ago

Cloudflare has free CSAM scanning tools available - they really just need to implement it.

[-] Rai@lemmy.dbzer0.com 1 point 1 year ago

When did “CP” become “CSAM”?

If you want to change the acronym, wouldn’t “CR” make more sense?

[-] Cracks_InTheWalls@sh.itjust.works 3 points 1 year ago* (last edited 1 year ago)

'cause porn is made with consenting adults. CSAM isn't porn. CR is typically what's depicted in CSAM (assuming that R stands for rape), but there are two (or more) separate though closely related crimes here. That, and SA (sexual assault) covers a wider breadth of activities, which is good if a person wants to quibble over the term rape when, regardless, something horrific happened to a kid and videos/images of said horrific thing are now getting shared among pedophiles.

Will note I've only seen CSAM used since I started using Lemmy, so I'm not really sure when people started using the term over CP. I'm personally for it - it more accurately describes what it is. And while I haven't seen the term in the wild, SAM to describe videos or images of non-consensual sex acts among adults is good too.

[-] BonesOfTheMoon@lemmy.world 4 points 1 year ago

I modded on Reddit before; I could manage this.

[-] csgraves@lemmy.world 6 points 1 year ago

I worry about this on fediverse stuff. I made the mistake of looking at the links from a person who commented on anti-trans legislation, and let me just say: yikes!

The link was to something trying to legitimize the identity of “map.”

NOPE.

I deleted my comments and blocked the sick bastard.

[-] BonesOfTheMoon@lemmy.world 7 points 1 year ago

I blocked some instance that was all porn and that seemed to improve my experience. I'm not against porn, I just don't care for it myself.

[-] Sylver@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

Try to stay apolitical and you won’t attract those trolls as early. Which I now realize may be difficult, considering many of the posts would be calling out Nazi scum…

this post was submitted on 07 Sep 2023
713 points (95.6% liked)
