“New Study Suggests That YouTube’s Recommendation Algorithm Isn’t the Tool of Radicalization Many People Believe (At Least Not Anymore)”

“Basically, YouTube is pushing people towards mainstream media sources. Whether or not you think that’s a good thing is up to you. But at the very least, it doesn’t appear to default to extremism as many people note. Of course, that doesn’t mean that it’s that way for everyone. Indeed, there are some people criticizing this study because it only studies non-logged in user recommendations. Nor does it mean that it wasn’t like that in the past. This study was done recently, and it’s been said that YouTube has been trying to adjust its algorithms quite a bit over the past few years in response to some of these criticisms.”

Read the full article here: https://www.techdirt.com/articles/20191228/22492643646/new-study-suggests-that-youtubes-recommendation-algorithm-isnt-tool-radicalization-many-people-believe-least-not-any-more.shtml

The article also raises an interesting point about our ability to think for ourselves. I wonder how many people watching YouTube’s recommended videos (even if those recommendations did, or still do, tend towards radicalization) would actually find themselves heavily persuaded by the content?


Comments

2 thoughts on “New Study Suggests That YouTube’s Recommendation Algorithm Isn’t the Tool of Radicalization Many People Believe (At Least Not Anymore)”

    1. Fiona McLean says:

      This is a good point. I noticed in the article I referenced above that the research was only concerned with non-logged-in users. I read your blog post – a concise summary of the issues surrounding the research – and would tend to agree with your conclusion. I do think their conclusions were hyperbolic, but they nonetheless offer an interesting frame of reference in relation to new users.
