
Why We Can’t Expect Facebook to Fact Check Our Feeds

I’ve held this opinion for a while: no matter what Facebook does (or any other social media company, for that matter), there’s no way it’s going to be able to deal with the scope of fake news, harassment, etc. that rolls through its platform every day. Anything it does simply won’t scale.

Let’s take fact-checking, for example:

What do you think of when you hear that Facebook is partnering with third-party fact checkers to validate the truth of news stories shared on the platform? I can imagine massive rooms filled with people checking the details of news links as they appear on the site, and tagging things as fake news as they go along. Or maybe an algorithm that does some initial flagging of things to send to the massive team of fact-checkers.

Well, this article makes clear that is not at all what really happens.

Considering nearly a third of Australians get their news from Facebook, these fact checkers play an outsized role in determining what gets seen, and in what context, for the country’s 17 million users.

So just how many people are working on figuring out what’s real or not on Facebook in Australia? Seven.

Between them, they’ve completed 220 fact checks since April 2019 — about one check every one and a half days on average.

I don’t need to tell you that 17 million users share a hell of a lot more than 220 posts in one day, let alone 9 months.

Think it’s better in the US? Not really.

American news outlet The Hill reported in January 2020 that the US has six fact checking partners, with 26 full-time staff checking a total of 200 posts per month. Each partner has said it plans on expanding in the lead-up to the 2020 US presidential election. Facebook reported 190 million active users in the US and Canada at the end of 2019.

Facebook is fighting a losing battle here. There aren’t enough fact-checkers out there for it to hire to even make a dent in how much crap gets shared on its platform. Simply trying to verify the facts of every article or post shared on the network can never scale to match the vast amount of material being posted by users, and being targeted at specific groups and individuals.

There’s just no way. With that many users, it’s not feasible for any company to moderate what gets shared in any sort of effective way.

So how do we respond?

One, stop using social media as your sole news source.

Two, be the change you want to see on social media. Stop reacting to, sharing, and “liking” things that fit your worldview but have no actual basis in reality. Pause and think about what you’re sharing. Go and actually read the link to make sure the story matches the headline, and isn’t coming from an obvious fake news or satire site. Think about why you’re seeing something, who is directing it at you, and whether or not you trust them.

Just stop and think instead of acting out of emotion, especially if that emotion is outrage. Trolls and purveyors of fake news target that emotion more than any other, because they know it’s our weakness. When we’re outraged, our ability to think critically goes out the window. That’s how mobs form, online and off.

Mobs rarely keep all their facts straight.

Don’t be part of a mob. Don’t expect social media companies to fact check for you. Take some responsibility for your own little part of the internet.

