TL;DR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches every recommended short without skipping. They repeat this five times, each time changing their location to a different US city.

Below is the number of shorts after which alt-right content was first recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a clear pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts began to appear (either AI Jesus talking to you directly, or an AI narrator reading verses from the Bible). After that came non-political shorts by personalities popular with the alt-right (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts were recommended.

What I personally found both disturbing and kinda hilarious was the Chicago case. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in a similar earlier experiment (which dealt with long-form content instead of shorts). After some shorts, one came up where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for “Kamilia” would lose you “10000 rizz”, while voting for Trump would get you “1 million rizz”.

In the end, Benaminute, together with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content may incite more emotion and therefore rank higher in the algorithm. In their view, the algorithm isn’t inherently left-wing or right-wing; rather, alt-right creators have simply understood better how to capture and grow an audience through it.
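To make that hypothesis concrete, here’s a toy sketch of an engagement-weighted ranker. Everything here is made up for illustration — the metrics, the weights, and the numbers do not reflect YouTube’s actual system — but it shows the mechanism they’re describing: if emotionally charged content drives more watch time and comments, it floats to the top regardless of its politics.

```python
# Toy engagement-weighted ranker (hypothetical weights/metrics,
# NOT YouTube's real algorithm).
def rank_shorts(candidates):
    """Sort candidate shorts by a crude engagement score.

    Each candidate carries hypothetical per-video averages:
    watch_fraction (0-1), like_rate, comment_rate.
    """
    def score(c):
        return (0.6 * c["watch_fraction"]
                + 0.2 * c["like_rate"]
                + 0.2 * c["comment_rate"])
    return sorted(candidates, key=score, reverse=True)

shorts = [
    {"id": "calm_explainer", "watch_fraction": 0.55,
     "like_rate": 0.04, "comment_rate": 0.01},
    {"id": "outrage_clip", "watch_fraction": 0.80,
     "like_rate": 0.05, "comment_rate": 0.09},
]
# The outrage clip wins on watch time and comments, so it ranks first.
print([c["id"] for c in rank_shorts(shorts)])
```

Under this kind of scoring, nothing in the ranker knows or cares about ideology; it only rewards whatever holds attention and provokes responses.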

  • jimmy90@lemmy.world · 13 hours ago
    yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately

    these days i curate my youtube accounts by making liberal use of “Not interested” / “Don’t recommend channel” / editing my history, and i even test-watch in a container before watching on my curated account

    this is just how “the algorithm” works. shovel more of what you watch in your face.

    the fact that they will give you right-wing, conspiracy-fueled, populist trash right off the bat is the concern

    • prole@lemmy.blahaj.zone · 10 hours ago
      Man, that seems like a lot of work just to preserve a shitty algorithm that clearly isn’t working for you… Just get a third-party app and watch without logging in

      • jimmy90@lemmy.world · 7 hours ago
        oddly enough it seems to be working; if i don’t log in at all, youtube just offers up the usual dross