#Fail

Another Reason to Doubt Facebook will get Fake News Right – They Can’t Get Illicit Images Right

This story is appalling on many levels.

It began with a BBC investigation in 2016, which found that paedophiles were using secret groups on Facebook to post and swap sexually suggestive images of children. Facebook promised improvements to its moderation policy, and said it was employing “thousands” of moderators to check the content 24/7.

Checking this claim a year later, the BBC used Facebook’s own abuse-reporting system to report about 100 images, only to find that just 18 were removed as a result. When the BBC approached the network about the findings, things took a decidedly Kafkaesque turn.

According to the BBC, Facebook’s director of policy Simon Milner agreed to be interviewed, but only on condition that the journalist provided examples of the material that had not been removed by moderators. When the BBC forwarded screengrabs of the images, Facebook’s legal team went straight to the police.

In a response to the Guardian, the social network did not confess to an error of judgment but insisted that the law demands a referral to the police when such images are “shared”. A spokesman said: “It is against the law for anyone to distribute images of child exploitation.”

First off, yes, the journalists should have known better than to “share” images like this; technically, that is illegal. But I can’t help wondering whether Facebook, by demanding examples, is guilty of soliciting that sharing. Any attorneys want to chime in on this?

But more upsetting is that, despite thousands of moderators, Facebook seems incapable of removing even reported images. This is a problem because Facebook wants to have its cake and eat it too. When challenged with this sort of failure, it falls back on the claim that it can’t monitor user-generated content, that it is only the “platform” and not the one creating the content. Yet at the same time it wants to use its algorithm and its new “disputed news” tagging system to make editorial decisions about the content being shared by its users. (Never mind that the company is also delving into content creation of its own…)

Simply put, you can’t hide behind the platform excuse while also promoting yourself as “the place” to get your news, as curated by Facebook’s staff and algorithms. Either you act as an editor and fix your failures at that job, or you act like a platform and let people post and share whatever they want.

So, Facebook, how about you do something about potentially illegal content being shared on the site before you worry about being the editor of the world’s news, ok?
