22 Comments
Apr 24, 2022 · Liked by Noah Smith

As someone who was deep into YouTube in the mid-2010s, I can say there was a huge wave of anti-SJW, post-Gamergate content at the time. It was definitely a thing, and it exposed me and many other young people to a sort of proto-alt-right viewpoint. But it only lasted a few years. Between algorithm changes and hype-cycle changes, it hasn't been that way for a long time now. That stuff is still out there, but as the research you posted suggests, it doesn't have the same viral potential it did in the past.


I think people pretty dramatically underrate how good the YouTube recommendation algorithm is. You'd have to watch a lot of radical content relative to other content for a day or so to even START getting recommendations, and even then you might get one video per refresh. The simple reality is that in order to get a bunch of radical videos recommended you have to 1) seek out those videos deliberately or 2) basically never use YouTube and wander into them.

The idea that you'll get recommended radical videos simply for watching innocuous videos on channels that also host radical videos is similarly wild. The YouTube algorithm is really good! When I watch compilations of Seinfeld clips, it knows EXACTLY what I am looking for: not only does it not recommend other content a clip-making channel may make, it doesn't even recommend Seinfeld content that isn't clips of the show! And the algorithm gets better the more you use YouTube. I would say at any given time that 90% of the videos in my recommendations are things I would enjoy? Maybe more?

Apr 24, 2022 · Liked by Noah Smith

This seemed believable around 2016 (and it's one reason I never considered working at YouTube, despite their recruiters emailing me and calling me at work (not a humblebrag)), but I've heard the rabbit-hole dynamic still works for other subcultures on there: "vtuber rabbit hole" is a pretty common phrase.

And my own recommendations are extremely topical after only a little feedback to the algorithm; my front page is half rabbits half anime avatars. But it doesn't try to recommend me extreme polarizing rabbit content or anything.

In other news, apparently there's a left-NIMBY anime: https://www.animenewsnetwork.com/review/muteking/the-dancing-hero/12/.184624

Apr 24, 2022 · Liked by Noah Smith

There might be survivorship bias here. In the early 2000s, I'd guess more computer users were progressive left than conservative right. But then in the 2010s, smartphones and computers became accessible to almost everyone. So maybe it's not that social networks made more far-right/conservative people, but that far-right/conservative people got their own smartphones and social network accounts. I'm not sure if there's any research on this.

At least we know there were many far-right people (Nazi Germany, etc.) before WWII, when there was no internet. TV and radio were the media blamed during that period.


You might not be able to prove that YouTube turns normal people into Nazis, but I don’t think one can deny that the mass proliferation of smartphones and the internet made left/right polarization worse. Extremist parties and candidates grew in popularity in basically every democratic country after 2010, and even some fairly undemocratic ones. I don’t see how you could explain that phenomenon, other than that those candidates were no longer suppressed and dismissed by mass media.


You assumed that Kate Starbird was using media stories as her basis. She doesn't say that. She says "family stories and self-reports," which she could be collecting directly in her own research. You've shown your own bias toward your priors as well.


Anecdotal, but I still feel like it heavily promotes Jordan P****son to me even though I clicked "not interested" on a string of his videos one day. I also watched like 5 Jesus Christ Superstar clips one day and my ads suddenly turned into crazy stuff. At least I no longer get those Ben Shapiro DESTROYS Liberal Student or whatever.

Apr 24, 2022 · edited Apr 24, 2022

Every Twitter user can now get exposure to a breadth of specialist expertise that most 1980s professional reporters would have killed for. Also, more death threats in a week than most reporters back then got in a decade.

Media is very, very different now. That really ought to be changing things. But how?

Are all the media changes buried and confounded by the rest of our modern weirdness? Has the real power of YouTube and Twitter been hidden by the even greater force of Trump and the pandemic?

I don't believe Twitter, Facebook, YouTube, etc. did nothing. I expect they've changed a lot. Yet I can't convince myself of any confident account of what that change has been.


I lost my best friend to the YouTube rabbit hole. I literally said "you've gone too far down the YouTube rabbit hole," and we haven't spoken in a year.


YouTube is constantly changing its algorithm, not just in 2019, and the changes are not generally announced. Lots of changes happened around 2016-2017, too. I think the best you can say is that we can't really know whether YouTube used to radicalize people.

I also think your mental model of radicalization means YouTube can never radicalize people: it assumes that if YouTube were the cause, anyone who sees enough videos would become a terrorist. So if 100 people watch videos and 2 become radicalized, that "proves" YouTube isn't the cause. Instead, a small number of people are vulnerable to radicalization, and exposure is what puts them over the edge. Those 2 wouldn't have been radicalized without the exposure. This is a very difficult area in which to make strong statements.

As for censorship, that's not what YouTube and other social media need. What we need is for them to stop pushing harmful content to get clicks. It's like the difference between having Mein Kampf on a library shelf and having librarians hand out copies of Mein Kampf as you enter the library, but only if you're a white male. The first is fine; the world has survived with that model for a long time. The second is new, and is a problem.


I remember the D&D moral panic. Also the pedophilia stuff that was linked via satanism (perennial winner, pedophilia panic. Conservative conmen know their memes!).

I really fucking wanted to gun down some conservatives back then! And we were only getting the aftershock in France, not the full blast. Thank God for our tight gun regulations... :)


I have to view with heavy skepticism the assertion that YouTube did not promote questionable content. I personally received recommendations for it regularly around 2016, although I admit it is a lot less common nowadays.

I would regularly get recommendations to watch Jordan Peterson, or Ben Shapiro, or PragerU videos, despite my repeated insistence that I am not interested in this content. And also, whenever I watched any "meme culture" sort of video, there would also be progressively edgier and edgier shit in the recommendations, bordering on no longer even being jokes.

I don't think we "lost" a generation of men due to this effect, and I'm not even sure how many people take YouTube videos recommended to them as a serious source of information, but it definitely seemed to exist within the teenage male "meme culture" demographic, at least amongst myself and the people I hung out with.

Admittedly, I have not yet read through your linked studies, so all I have to share is anecdotal evidence. But since you *also* shared anecdotal evidence, I figure I might as well throw my story into the pot.
