So Serrato started digging, looking for information on every Chemnitz-related video published on YouTube this year. What he found, according to a New York Times report, is that the platform’s recommendation system consistently directed people toward extremist videos on the riots — then on to far-right videos on other subjects. “Users searching for news on Chemnitz would be sent down a rabbit hole of misinformation and hate. And as interest in Chemnitz grew, it appears, YouTube funnelled many Germans to extremist pages, whose view-counts skyrocketed.”
Nobody who knows anything about YouTube will be surprised. Time and again, researchers have found that when videos with political or ideological content are uploaded to the platform, YouTube's "recommender" algorithm steers viewers toward more extreme content once they have watched the first one. And since most people probably leave autoplay switched on by default, watching YouTube videos often leads them on to extremist material.
So, how many iterations of the recommendation engine get you from a Jezza speech to an insistence that we've got to starve 8 million Ukrainian kulaks? Is it more or fewer via Seumas Milne or Andrew Murray?