Why YouTube’s algorithms push extreme content on every possible subject


Zeynep Tufekci was researching Trump videos on YouTube back in 2016 when she noticed something funny: YouTube began recommending and autoplaying increasingly extreme right-wing stuff, like white-supremacist Holocaust-denial videos.

So she did an interesting experiment: She set up another YouTube account and began watching videos of the main Democratic presidential contenders, Hillary Clinton and Bernie Sanders. The result? As Tufekci writes in the New York Times:

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is an incredibly interesting and subtle point: that the problem with YouTube’s recommender algorithms might be that they overdistill your preferences. Since they’re aiming for “engagement” (a word I am beginning to loathe with an unsettling level of emotion), these algorithms are constantly trying to create an epic sense of drama and newness. At the tail ends of this bimodal attentional landscape, only the Xtreme can survive. And of course, this precisely leverages our novelty-seeking psychology, which really does snap to attention when we’re presented with intense stuff.
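To make that dynamic concrete, here’s a toy sketch in Python. To be clear, this is not YouTube’s actual system: the “intensity” scale, the engagement model, and the baseline-drift rule are all invented for illustration. It just shows how a greedy engagement maximizer, paired with a viewer whose tastes adapt to what they watch, ratchets its way to the most extreme thing in the catalog:

```python
# Toy model of an engagement-maximizing recommender (illustrative only --
# the scoring function and "intensity" scale are made up for this sketch,
# not drawn from YouTube's actual system).

# A catalog of videos, each with an "intensity" score from 0 (mild) to 1 (extreme).
CATALOG = [i / 100 for i in range(101)]

def predicted_engagement(video_intensity, user_baseline):
    """Assumed engagement model: viewers respond to novelty, so content
    slightly *above* their current baseline scores highest; content milder
    than the baseline is boring, and content far above it is off-putting."""
    gap = video_intensity - user_baseline
    if gap < 0:
        return 1.0 + gap            # milder than usual: boring
    return 1.0 + gap - gap * gap    # peaks when gap is about 0.5

def recommend(user_baseline):
    """Greedy engagement maximizer: pick whatever the model scores highest."""
    return max(CATALOG, key=lambda v: predicted_engagement(v, user_baseline))

baseline = 0.1  # the viewer starts with mild, mainstream fare
for step in range(8):
    video = recommend(baseline)
    # Watching shifts the viewer's baseline toward what they just watched.
    baseline = 0.5 * baseline + 0.5 * video
    print(f"step {step}: recommended intensity {video:.2f}, new baseline {baseline:.2f}")
```

Run it and the recommended intensity climbs from 0.60 to 1.00 in a few steps and stays pinned there: each recommendation nudges the viewer’s baseline up, and the greedy argmax chases it. That’s the “never hard core enough” escalation in miniature, at least under this sketch’s assumptions.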

So it’s not that YouTube radicalizes politics specifically. It radicalizes everything, and politics just gets swept along in the slurry of zomg.

Read the rest of Tufekci’s piece; she’s one of the best critics of our algorithmicized world, and this one really nails it.

(CC-licensed image above via Pixabay)


