this post was submitted on 23 Jan 2025
1132 points (97.2% liked)

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, changing their location each time to a random city in the US.

Below is the number of shorts after which alt-right content was first recommended (a quick tally of these numbers is sketched after the list). Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)
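
For reference, here's a minimal Python sketch that just tabulates the numbers above, treating the San Francisco run as cut off at 250 shorts with nothing observed. The script and its variable names are my own illustration, not something from the video:

```python
# Benaminute's reported results: shorts watched until the first
# alt-right recommendation appeared, per VPN location.
# None means no alt-right short appeared before the run was stopped.

CAP = 250  # the San Francisco run was stopped here

results = {
    "Houston": 88,
    "Chicago": 98,
    "Atlanta": 109,
    "NYC": 247,
    "San Francisco": None,  # never observed within the cap
}

observed = [n for n in results.values() if n is not None]
print(f"Runs with alt-right content: {len(observed)} of {len(results)}")
print(f"Earliest: {min(observed)} shorts, latest: {max(observed)} shorts")
print(f"Mean (observed runs only): {sum(observed) / len(observed):.1f} shorts")
```

So across the four runs where it appeared at all, it took on average about 135 shorts, with a huge spread between cities.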

There was, however, a certain pattern to these recommendations. First, non-political shorts were recommended. After that came AI Jesus shorts (with either AI Jesus talking to you, or an AI narrator reciting Bible verses). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started appearing. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content at the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared in which an AI Gru (the main character from Despicable Me) told you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and thus rank higher in the algorithm. They say the algorithm isn't necessarily left-wing or right-wing, but that alt-right creators have better understood how to capture and grow an audience. A toy sketch of that idea follows.
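
To make the hypothesis concrete, here's a toy model of purely engagement-weighted ranking. This is my own simplification for illustration, not YouTube's actual algorithm, and all the numbers and field names are made up: if a ranker scores videos only by predicted engagement, content that reliably provokes strong reactions floats to the top regardless of its politics.

```python
# Toy model of engagement-weighted ranking. The claim isn't that the
# ranker prefers any ideology, but that purely engagement-driven
# scoring rewards whoever provokes the strongest reactions.

from dataclasses import dataclass

@dataclass
class Short:
    title: str
    watch_fraction: float  # how much of the short viewers watch, 0..1
    reaction_rate: float   # likes/comments/shares per view, 0..1

def engagement_score(s: Short) -> float:
    # Hypothetical scoring: completion matters, but strong reactions
    # (positive OR negative) are weighted much more heavily.
    return 0.5 * s.watch_fraction + 2.0 * s.reaction_rate

feed = [
    Short("calm explainer", watch_fraction=0.60, reaction_rate=0.02),
    Short("cute animal clip", watch_fraction=0.80, reaction_rate=0.03),
    Short("outrage-bait rant", watch_fraction=0.55, reaction_rate=0.15),
]

for s in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):.2f}  {s.title}")
```

Under this kind of scoring the rant ranks first despite being watched the least, which is exactly the dynamic the video's hypothesis describes.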

50 comments
[–] Subtracty@lemmy.world 9 points 1 week ago

With Milo (miniminuteman) in the thumbnail, I thought the video was going to insinuate that his content was part of the alt-right stuff. Was confused and terrified. Happily, that was not the case.

[–] LandedGentry@lemmy.zip 9 points 1 week ago

This is basically the central thesis of The Social Dilemma.

[–] TankovayaDiviziya@lemmy.world 9 points 1 week ago (1 children)

Yeah, I've gotten more right-wing video recommendations on YouTube, even though I have my history turned off. And even when my history was on, I typically watched left-wing videos.

[–] Corkyskog@sh.itjust.works 5 points 1 week ago

The minute I sign out of my account, YouTube tries to radicalize me.

[–] Hope@lemmy.world 8 points 1 week ago* (last edited 1 week ago) (1 children)

Just scrolling through shorts on a given day, I'm usually recommended at least one short by someone infamously hostile to the LGBTQIA+ community. I get that it could be from people with my interests hate-watching, but I don't want to be shown any of it. (Nearly all of my YouTube subscriptions are to LGBTQIA+ creators. I'm not subscribed to anyone who has ever even mentioned support for right leaning policies on anything.)

[–] Cracks_InTheWalls@sh.itjust.works 7 points 1 week ago* (last edited 1 week ago) (1 children)

Real talk: I've been using YouTube without an account and with some ad blocking stuff installed. Based on what I'm seeing, I'm pretty sure the algorithm's datapoint for me is "He was born with a penis and is ok with that."

When I lose my better judgement and start scrolling shorts like an idiot, it's fight videos (IRL, movie scenes, UFC, and boxing), auditing, Charlie Kirk and right-wing influencers, and the occasional clip from Shoresy on the basis of "He might be Canadian too, idk".

It's noticeably weird, and I've brought it up with my kid, who uses an account, is not what YouTube believes me to be, and whose shorts feed is very different.

We do both get that guy who opens Pokemon cards with a catchy jingle, though.

I check friends' Snapchat stories from time to time, and Snapchat suggests public stories on the same page. I think Snapchat has the same sort of singular data point on me, i.e. "this account is likely a straight man", because most of what they show me is sports clips, women influencers in revealing clothing, and right-wing influencers talking about culture-war stuff. I never view any of that sort of content, but it still shows up any time I try to check my friends' stories. I guess I view public stories so infrequently that they just give me a default generic-man feed.

[–] neon_nova@lemmy.dbzer0.com 5 points 1 week ago

If they're using a VPN to switch their location, could it be that people in the South are using the same IP to access phub and right-wing crap?

[–] MITM0@lemmy.world 4 points 1 week ago (1 children)

So... in the US then?

[–] TheGrandNagus@lemmy.world 9 points 1 week ago (1 children)

It's 100% not just the US where the algorithm favours this stuff.

[–] UraniumBlazer@lemm.ee 4 points 1 week ago

Agreed 100%. Whenever I'm in India, I get Hindu nationalist content A LOT. I briefly attempted the dislike/don't recommend thing, but nope! I was getting absolutely spammed with stuff like this regardless. I just disabled shorts after that.

[–] Imhotep@lemmy.world 4 points 1 week ago (1 children)

fresh YouTube account

change their location to a random city in the US

Yeah, but you're still bound to IP addresses. I was under the impression YouTube used those for profiling.

[–] otterpop@lemmy.world 9 points 1 week ago (2 children)

Probably used a VPN, I'd imagine, but I haven't watched the video.

[–] ObsidianNebula@sh.itjust.works 3 points 1 week ago (1 children)

You are correct. He used a VPN for several US locations in the video. He then compared the content shown in different regions of the US to see whether everyone sees the same thing or whether it differs radically depending on where you are.

[–] Imhotep@lemmy.world 2 points 1 week ago

Most likely, yes.

My point is, if YouTube customizes feeds based on IP addresses, then the YouTube accounts used aren't really "fresh"; there's already some data attached to the profiles upon their creation.
