Linked – Free Speech Is Not the Same As Free Reach
This is the real problem with social media algorithms, not outright censorship:
To see how this algorithm amplification works, simply look to RT, or Russia Today, a Russian state-owned propaganda outlet that’s also among the most popular YouTube presences. RT has amassed more than 6 billion views across 22 channels, more than MSNBC and Fox News combined. According to YouTube chief product officer Neal Mohan, 70 percent of views on YouTube are from recommendations—so the site’s algorithms are largely responsible for amplifying RT’s propaganda hundreds of millions of times.
How? Most RT viewers don’t set out in search of Russian propaganda. The videos that rack up the views are RT’s clickbait-y, gateway content: videos of towering tsunamis, meteors striking buildings, shark attacks, amusement park accidents, some that are years old but have comments from within an hour ago. This disaster porn is highly engaging; the videos have been viewed tens of millions of times and are likely watched until the end. As a result, YouTube’s algorithm likely believes other RT content is worth suggesting to the viewers of that content—and so, quickly, an American YouTube user looking for news finds themselves watching Russia’s take on Hillary Clinton, immigration, and current events. These videos are served up in autoplay playlists alongside content from legitimate news organizations, giving RT itself increased legitimacy by association.
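To make the loop described in that excerpt concrete, here’s a rough sketch of how an engagement-only recommender ends up doing this. The channel names, videos, numbers, and scoring are all my own invention, not YouTube’s actual system; the point is just that this kind of ranking only ever looks at who watched what, and for how long.

```python
# A deliberately simplified, hypothetical engagement-based recommender.
# It knows nothing about content or credibility -- only who watched what,
# and how much of it. (My assumptions, not YouTube's real ranking system.)

from collections import defaultdict

# Watch log: (user, channel, video, fraction_watched)
watch_log = [
    ("alice", "RT", "tsunami_footage", 0.95),      # disaster clickbait, watched to the end
    ("alice", "MSNBC", "election_report", 0.30),   # clicked away early
    ("bob",   "RT", "shark_attack", 0.90),
]

def channel_affinity(log):
    """Score each (user, channel) pair purely by watch completion."""
    scores = defaultdict(float)
    for user, channel, _video, fraction in log:
        scores[(user, channel)] += fraction
    return scores

def recommend(user, log, catalog, top_n=3):
    """Suggest unseen videos from the channels this user 'engaged' with most."""
    scores = channel_affinity(log)
    seen = {video for u, _c, video, _f in log if u == user}
    ranked = sorted(
        (v for v in catalog if v["video"] not in seen),
        key=lambda v: scores.get((user, v["channel"]), 0.0),
        reverse=True,
    )
    return ranked[:top_n]

catalog = [
    {"channel": "RT", "video": "take_on_clinton"},        # political content, same channel
    {"channel": "RT", "video": "immigration_special"},
    {"channel": "MSNBC", "video": "policy_deep_dive"},
]

# Because Alice finished the tsunami video, RT's political videos outrank
# everything else -- the algorithm never asks what the videos are about.
print(recommend("alice", watch_log, catalog))
```

Nothing in that sketch ever considers what the video says, only that a video from the same source held someone’s attention before.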
I’ve witnessed something similar myself. Once upon a time, when a Facebook connection would share a false story that I had already seen debunked on Snopes, I would helpfully try to post the link in the comments. I mean, if you’re going to start a political argument on social media, at least start it with some context, right? But the algorithm started treating my comments on those posts as proof that I wanted to see more political content, which I most definitely do not.
So I had to simply ignore total falsehoods and judge people silently in order for Facebook to learn that I do not actually want a newsfeed full of politics and copy-paste scams. That’s a real shortcoming of the algorithm: it isn’t advanced enough to understand context, so it just keeps serving me more content from whatever sources I’ve interacted with.
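If I had to guess at the logic that tripped me up, it looks something like this: every interaction counts as a positive signal for a topic, and what the interaction actually said never enters into it. Again, the weights and names below are made up for illustration, not anyone’s real code.

```python
# Toy model of the context-blind feedback loop (entirely my own guess at the
# logic, with invented weights -- not Facebook's actual system).

ENGAGEMENT_WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}

interest_in_topic = {"politics": 0.0, "cooking": 0.0}

def record_interaction(action, topic):
    """Any interaction boosts the topic -- even a debunking comment."""
    interest_in_topic[topic] += ENGAGEMENT_WEIGHTS[action]

# I comment a Snopes link on three false political posts...
for _ in range(3):
    record_interaction("comment", "politics")

# ...and occasionally like a recipe.
record_interaction("like", "cooking")

# The feed now thinks politics is what I want most.
print(max(interest_in_topic, key=interest_in_topic.get))  # -> "politics"
```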
Keep that in mind the next time you see what YouTube and others are “suggesting” for you. Is it suggesting something because the subject matter is something you actually care about, or because you’ve clicked on one too many clickbait articles put out by propaganda machines?
https://www.wired.com/story/free-speech-is-not-the-same-as-free-reach/