this post was submitted on 08 Dec 2024
211 points (96.9% liked)
Technology
For YouTube, I was talking specifically about how long it took and how little action they took on the kids-doing-gymnastics videos, even when it became abundantly clear that the target market was pedophiles, and that the parents who kept posting these videos were, at the very least, complicit, if not explicitly pimping their children out.
(If you have not seen and/or read up on this, save yourself the misery and skip it: it's gross.)
It took them a VERY long time to take any meaningful action, even after it was clear that the audience the videos were being shown to was not people interested in gymnastics, and the content stayed up for literal years.
Like, I have done anti-CSAM work and have lots and lots of sympathy for the people doing it, because it's fucking awful, but if you've got videos of children - clothed or not - where the comment section is entirely creeps and perverts and you just kinda do nothing, I have shockingly limited sympathy.
Seriously - the comment section should have been used by the FBI to launch raids, because I 100% guarantee you every single person involved has piles and piles of CSAM sitting around, and they were ignored just because the videos themselves weren't explicit CSAM.
Just... gross, and poorly handled.