
Why We Can’t Expect Facebook to Fact Check Our Feeds

I’ve held this opinion for a while now: no matter what Facebook does, it won’t be able to deal with the scope of fake news, harassment, and the like that rolls through its platform every day. (The same goes for any other social media company.) Anything they do simply won’t scale.

Let’s take fact-checking, for example:

What do you think of when you hear that Facebook is partnering with third-party fact checkers to validate the truth of news stories shared on the platform? I can imagine massive rooms filled with people checking the details of news links as they appear on the site, and tagging things as fake news as they go along. Or maybe an algorithm that does some initial flagging of things to send to the massive team of fact-checkers.

Well, this article makes clear that’s not at all what really happens.

Considering nearly a third of Australians get their news from Facebook, these fact checkers play an outsized role in determining what gets seen, and in what context, for the country’s 17 million users.

So just how many people are working on figuring out what’s real or not on Facebook in Australia? Seven.

Between them, they’ve completed 220 fact checks since April 2019 — about one check every one and a half days on average.

I don’t need to tell you that 17 million users share a hell of a lot more than 220 posts in one day, let alone 9 months.

Think it’s better in the US? Not really.

American news outlet The Hill reported in January 2020 that the US has six fact checking partners, with 26 full-time staff checking a total of 200 posts per month. Each partner has said it plans on expanding in the lead-up to the 2020 US presidential election. Facebook reported 190 million active users in the US and Canada at the end of 2019.

Facebook is fighting a losing battle here. There aren’t enough fact-checkers in existence for them to hire to make even a dent in the amount of crap shared on their platform. Verifying the facts of every article or post shared on the network can never scale to match the sheer volume of what users share, much of it targeted at specific groups and people.
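To put that in perspective, here’s a rough back-of-envelope sketch using the US figures quoted above. The posts-per-user-per-day rate is my own deliberately low guess, not a reported number:

```python
# Back-of-envelope scale comparison using the figures quoted above.
# The checks-per-month and user counts come from the reporting;
# the posts-per-user-per-day rate is a conservative assumption.

checks_per_month = 200          # total across the six US fact-checking partners
us_canada_users = 190_000_000   # Facebook's reported active users, end of 2019
posts_per_user_per_day = 1      # assumption: just one shared item per user per day

posts_per_month = us_canada_users * posts_per_user_per_day * 30
fraction_checked = checks_per_month / posts_per_month

print(f"Posts per month (rough): {posts_per_month:,}")
print(f"Fraction fact-checked:   {fraction_checked:.10f}")
# Roughly 0.0000000351 -- about 1 in 28.5 million posts gets checked.
```

Even with that absurdly generous assumption of one shared post per user per day, the fact-checkers touch about one in every 28 million posts.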

There’s just no way. With that many users, it’s not feasible for any company to moderate what gets shared in any sort of effective way.

So how do we respond?

One, stop using social media as your sole news source.

Two, be the change you want to see on social media. Stop reacting to, sharing, and “liking” things that fit your worldview but have no actual basis in reality. Pause and think about what you’re sharing. Go and actually read the link, make sure the story matches the headline, and check that it isn’t coming from an obvious fake-news or satire site. Think about why you’re seeing something, who is directing it at you, and whether you trust them or not.

Just stop and think instead of acting out of emotion, especially if that emotion is outrage. The trolls spreading this stuff target that emotion more than anything, because they know it’s our weakness. When outraged, our ability to think critically goes out the window. That’s how mobs form, online and off.

Mobs rarely keep all their facts straight.

Don’t be part of a mob. Don’t expect social media companies to fact check for you. Take some responsibility for your own little part of the internet.
