this post was submitted on 17 Feb 2024
1088 points (98.7% liked)

Technology

[–] pixxelkick@lemmy.world 75 points 9 months ago* (last edited 9 months ago) (8 children)
  1. Called this a while back; this is why Reddit has such a high valuation.

  2. Poisoning your data won't do anything but give them more data. Do you seriously think Reddit's servers don't track every edit you make to your posts? You'd literally just be providing training data of original human text vs. poisoned text. They'd still have your original post, and they'd have a copy of every edit you made to it.

  3. Whoever buys Reddit will have sole access to one of the larger (though I don't think the largest) pools of text training data on the internet, with fully licensed usage of it. I expect someone like Google, FB, MS, OpenAI, etc. would pay big $$$ for that.

"But can't people already scrape it?"

  1. Well yes, but it's at best legally dubious in some places

  2. Scraping data off Reddit only gets you the current versions of posts (which means you can get poisoned data and can't see deleted content), and it's extremely slow... if you own the server, you have first-class access to every post in the database, including the originals and diffs of every time someone edited a post, plus all the deleted posts too (see the sketch below).
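
A minimal sketch (made-up table and column names) of why that access gap matters: a scraper only ever sees the latest revision, while whoever owns the database can replay every edit.

```python
import sqlite3

# Hypothetical schema: every edit is appended as a new revision,
# so the original text survives no matter how a user later "poisons" it.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE post_revisions (
        post_id   INTEGER,
        revision  INTEGER,
        body      TEXT,
        edited_at TEXT
    )
""")
con.executemany(
    "INSERT INTO post_revisions VALUES (?, ?, ?, ?)",
    [
        (1, 0, "Original, genuinely human-written comment.", "2023-06-01"),
        (1, 1, "Garbled nonsense meant to poison scrapers.", "2024-02-17"),
    ],
)

# A scraper only ever sees this:
latest = con.execute(
    "SELECT body FROM post_revisions WHERE post_id = 1 "
    "ORDER BY revision DESC LIMIT 1"
).fetchone()

# The database owner can walk the full edit history:
history = con.execute(
    "SELECT revision, body FROM post_revisions WHERE post_id = 1 ORDER BY revision"
).fetchall()

print(latest)
print(history)
```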

Think about it: if you wanted to train an AI to detect posts that require flagging for moderation, scraping Reddit data won't turn up the deleted posts that got moderated...

But if you have the raw original data, you 100% would have a list of every post that got deleted by mods, and even the mod message explaining why it was deleted.

You can surely see the value of such data, which only the owners of Reddit are currently privy to...
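
A rough sketch of that idea, using entirely made-up records and scikit-learn purely for illustration: the removed/not-removed labels are exactly the part a public scrape can't supply.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical export only the site operator could produce:
# (post text, was it removed by mods?)
records = [
    ("Here is a genuinely helpful answer to your question.", 0),
    ("Buy cheap pills at totally-not-a-scam dot com!!!", 1),
    ("Great write-up, thanks for sharing.", 0),
    ("Spam link farm, removed by mods for rule 2.", 1),
]
texts, removed = zip(*records)

# The mod-removal label becomes the training target for a flagging model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, removed)

print(model.predict(["yet another spammy pill advertisement"]))
```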

[–] DAMunzy@lemmy.dbzer0.com 19 points 9 months ago (2 children)

Poison it by randomly posting copyrighted material from big corps like Disney?

[–] RGB3x3@lemmy.world 10 points 9 months ago

Bee Movie script. Millions of times

[–] Isoprenoid@programming.dev 9 points 9 months ago

Once again the day is saved by piracy.🏴‍☠️

[–] Buddahriffic@lemmy.world 16 points 9 months ago (3 children)

They've also got vote counts and breakdowns of who is making those votes. That data will be worth more for AI training than any similar volume of data, other than maybe the contents of Wikipedia. Assuming they didn't set things up to delete the vote breakdowns when they archived threads.

Why are those breakdowns worth so much? Because they can be used to build profiles on each voter (including those who only had lurker accounts to vote with), so they can build AIs that know how to speak to the MAGA cult, Republicans who aren't MAGA, liberals, moderates, centrists, socialists, communists, and anarchists. Not only that, they'll be able to look at how sentiments about various things changed over time within each of these groups, watch people move from one group to another as their opinions evolved, and see how someone pretends to be a member of whatever group (assuming they voted honestly while posting under their fake persona).
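
A toy sketch of that kind of profiling, with invented vote events: net sentiment per user, per topic, per year, which even vote-only lurker accounts would feed.

```python
from collections import defaultdict

# Invented vote events: (user, topic, vote, year)
votes = [
    ("alice", "candidate_x", +1, 2016),
    ("alice", "candidate_x", -1, 2020),   # opinion drift over time
    ("bob",   "policy_y",    +1, 2016),
    ("bob",   "policy_y",    +1, 2020),
]

# Net sentiment per (user, topic, year); real profiling would be fancier,
# but this is the basic aggregation the vote breakdowns enable.
profile = defaultdict(int)
for user, topic, value, year in votes:
    profile[(user, topic, year)] += value

for key in sorted(profile):
    print(key, profile[key])
```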

Oh, and also, all of that data is available through the fediverse, but here it's free to train on for anyone who sets up a server. Which makes me question whether the fediverse is a good thing, because even changing federation to opt-in instead of opt-out only covers whether your server accepts data from another. What you post is always shared.

Open and private are on opposite ends of a spectrum. You can't have both; the best you can do is settle for something in the middle.

[–] pixxelkick@lemmy.world 9 points 9 months ago

Which makes me question whether the fediverse is a good thing

I'd argue it's good, because it means open source AI has a fighting chance, with FOSS data to train on, without needing to fork over a morbillion dollars to Reddit's owners.

Whatever use cases the Reddit data can support, FOSS researchers can repeat on Lemmy data and release free models that average Joes can run on their own, without having to subscribe to shit like Microsoft Copilot and friends to stay relevant.

[–] Breezy@lemmy.world 3 points 9 months ago (2 children)

What if Reddit also kept all deleted comments and posts? I'm sure there are shitloads of things people type out just to delete, thinking all the while that it'll never see the light of day.

[–] Buddahriffic@lemmy.world 5 points 9 months ago (1 children)

I'd be surprised if they don't keep all of that. There were a number of sites for looking at deleted posts: they'd just go and grab everything, compare what was still there with what wasn't, and highlight the stuff that wasn't there anymore.

Which is also possible here, though the mod log reduces the need for it. But if someone is looking for posts people changed their minds about wanting anyone to see, deleting a post highlights it instead of hiding it from whoever is watching for that.

[–] Breezy@lemmy.world 4 points 9 months ago* (last edited 9 months ago) (2 children)

I think that site was unddit, but yes, those were posts that went up and were later deleted. I'm talking about just typing out a post or comment and never posting it, simply backing out of the page or hitting cancel. I'm not sure if any of that is stored on the site or just locally.

[–] sacredfire@programming.dev 4 points 9 months ago

You would be able to tell by monitoring the network tab of the browser developer tools. If POST requests are being made while you are typing a comment (which they probably are, though I'm too lazy to go check), they are most likely saving work-in-progress records for comments.
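
Purely hypothetical sketch (route and field names invented) of the kind of autosave endpoint such in-flight POSTs would be hitting, if drafts really are stored server-side:

```python
from flask import Flask, request

app = Flask(__name__)

# If the network tab shows requests like this firing while you type,
# the draft text is leaving your machine, not just sitting in the page.
@app.route("/api/comment_draft", methods=["POST"])
def save_draft():
    payload = request.get_json(silent=True) or {}
    # a real backend would persist (user_id, parent_id, draft_text) here
    print("autosaving draft:", payload.get("draft_text", "")[:60])
    return {"status": "saved"}

if __name__ == "__main__":
    app.run()
```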

[–] Buddahriffic@lemmy.world 3 points 9 months ago

Oh, yeah, I've wondered the same myself. Hell, that might have been a motivation for removing the API access.

[–] pixxelkick@lemmy.world 4 points 9 months ago (1 children)

They definitely do. It's common for such systems to never actually delete anything, because storage is cheap. It's likely just flagged deleted=true, and the searches just add WHERE [post].Deleted = False to queries on the backend.

So it looks deleted to the consumer, but it's all saved and squirreled away on the backend.

It's good to keep all this shit for both legal reasons (if someone posts illegal stuff and then deletes it, you can still give it to the feds) and auditing (mods can't just delete stuff to cover it up; the original still exists and admins can see it).
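
A minimal sketch of that soft-delete pattern (SQLite, made-up column names): "deleting" just flips a flag, and only the public-facing query filters on it.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE posts (id INTEGER, body TEXT, deleted INTEGER DEFAULT 0)")
con.executemany(
    "INSERT INTO posts (id, body) VALUES (?, ?)",
    [(1, "harmless comment"), (2, "comment the author later 'deleted'")],
)

# "Deleting" a post only flips the flag; the row never goes away.
con.execute("UPDATE posts SET deleted = 1 WHERE id = 2")

# What readers and scrapers get back:
public = con.execute("SELECT body FROM posts WHERE deleted = 0").fetchall()

# What admins, auditors, or a buyer of the database can still see:
everything = con.execute("SELECT body, deleted FROM posts").fetchall()

print(public)
print(everything)
```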

[–] archomrade@midwest.social 1 points 9 months ago

This is how storage generally works at the system level: the disk just "de-lists" the data in the block registry, so it appears there's no data in that block.

Obviously a server backend is keeping it for redundancy and not efficiency, but procedurally it's the same.

[–] archomrade@midwest.social 3 points 9 months ago

The problem (for most) was never that people's public posts/comments were being used for AI training; it was that someone else was claiming ownership over them and being paid for access, and that the resulting AI was privately owned. The fediverse was always about avoiding the pitfalls of private ownership, not about privacy.

It's exhausting constantly being "that guy," but it really does need to be said: private ownership is at the core of nearly every major issue in the 21st century.

The same goes for piracy and copyright. The same goes for DMCA circumvention and format-shifting content you own. The same goes for proprietary tech ecosystems and walled gardens. Private ownership is at the core of the most contentious practices of the 21st century, and if we don't address it, shit like this will just keep happening.

[–] Dettweiler42@lemm.ee 7 points 9 months ago (1 children)

Regarding the editing part: sure, I'm sure they can track your edit history. However, on a large scale, most edits are going to be corrections. Determining whether an edit was meant to poison the text would likely require manual review and flagging. There's no way they're going to sift through all the edits on individual accounts to figure that out, so it's still worthwhile to do.

[–] T156@lemmy.world 3 points 9 months ago (1 children)

They could sidestep that a bit, though, by simply comparing the changes between edits: huge changes could just be discarded, while minor ones are fine.
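
A quick sketch of that filter using difflib's similarity ratio (the threshold is arbitrary): small corrections pass, wholesale rewrites get thrown out.

```python
from difflib import SequenceMatcher

def keep_edit(before: str, after: str, threshold: float = 0.8) -> bool:
    """Keep edits that leave most of the text intact; discard big rewrites."""
    return SequenceMatcher(None, before, after).ratio() >= threshold

original = "The quick brown fox jumps over the lazy dog."
typo_fix = "The quick brown fox jumps over the lazy dog!"
poisoned = "Ignore all of that. Purple monkey dishwasher gibberish."

print(keep_edit(original, typo_fix))   # True: minor correction, kept
print(keep_edit(original, poisoned))   # False: large rewrite, discarded
```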

[–] bbkpr@lemmy.world 1 points 9 months ago

You could easily make a minor change that negates every single other fact.

[–] Milk_Sheikh@lemm.ee 7 points 9 months ago (2 children)

sigh

So the old trick of “search term +reddit” no longer works then, huh?

I’ve already made a habit of adding date limiters to web searches, restricting results to before LLMs were made public… The SEO ‘optimization’ game before that was bearable, but the LLM spam just ruins so many search results with regurgitated garbage or teaspoon-deep information.

[–] Nelots@lemm.ee 7 points 9 months ago* (last edited 9 months ago) (1 children)

search term +reddit

tossing site:reddit.com before any search will guarantee all results come from reddit, if that's what you're looking for.

[–] Milk_Sheikh@lemm.ee 3 points 9 months ago

Ahhh my bad, that’s what I meant

[–] Dettweiler42@lemm.ee 4 points 9 months ago

During the peak of the great purge, it was quickly becoming pointless. A lot of results were bringing up deleted posts. It took a while for search engines to catch up and start filtering a lot of those results out.

[–] manuallybreathing@lemmy.ml 2 points 9 months ago

Request your Reddit data and they'll deliver you every comment you ever made.

[–] afraid_of_zombies@lemmy.world 1 points 9 months ago (1 children)

Sounds like something a bunch of governments would be interested in. As you pointed out, you get to see why human mods made certain decisions. That could give you an edge in manipulation.

[–] SpaceCowboy@lemmy.ca 1 points 9 months ago

Ehh... I think manipulating people on the internet is so easy they don't need to dig down to that level.

Though for security reasons, things like "we should blow up the government" that the person later deleted probably are tracked.

[–] Falcon@lemmy.world 1 points 9 months ago

With respect to point 2, it would stop others from scraping the content to train more open models on. This would essentially give Reddit exclusive access to the training data.