
Linked – This Is A Really Bad Idea: Facebook, Twitter, YouTube & Microsoft Agree To Block ‘Terrorist’ Content

Image by Frau Hölle

“This sounds as though it’s modeled on similar arrangements around child pornography. Except that there are some major differences between child pornography and “terrorist content.” The first is that child porn is de facto illegal. “Terrorist content” is quite frequently perfectly legal. It’s also much more of a judgment call. And based on this setup, allowing one platform partner to designate certain content as “bad” will almost certainly result in false positive designations that will flow across multiple platforms. That’s dangerous.

As we’ve discussed in the past, when you tell platforms to block “terrorist” content, it will frequently lead to mistakes, like blocking humanitarians documenting war atrocities. That kind of information is not just valuable, but necessary in understanding what’s happening.”
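To make the false-positive concern concrete, here is a minimal, purely hypothetical Python sketch of a shared flagging database (the platform names and the structure are illustrative assumptions, not how the companies’ actual system works). Once one partner adds a hash, every other partner consulting the same database blocks the identical content, so a single mistaken designation propagates everywhere with no independent review.

# Hypothetical sketch only: a shared hash database where any one
# platform's designation applies to every participating platform.
import hashlib

class SharedHashDatabase:
    def __init__(self):
        # content hash -> platform that flagged it
        self.flagged = {}

    def flag(self, platform: str, content: bytes) -> None:
        # Any single partner can add a hash; there is no cross-check here.
        digest = hashlib.sha256(content).hexdigest()
        self.flagged[digest] = platform

    def is_blocked(self, content: bytes) -> bool:
        # Every partner consults the same shared set of hashes.
        return hashlib.sha256(content).hexdigest() in self.flagged

shared_db = SharedHashDatabase()

# One platform mistakenly flags footage documenting war atrocities...
documentation = b"footage documenting war atrocities"
shared_db.flag("PlatformA", documentation)

# ...and every other platform now blocks the same upload automatically.
for platform in ("PlatformB", "PlatformC", "PlatformD"):
    print(platform, "blocks upload:", shared_db.is_blocked(documentation))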

It is becoming very trendy to suggest that these social platforms must “do something” to prevent people from seeing information they don’t want to see, or that they don’t want others to see. Whether you are talking about “terrorist” content, hate speech, or “fake” news, the question always comes back to the same thing: who decides what is appropriate and what isn’t, and on what basis are they making that decision? Sure, we can probably find some obvious material everyone agrees on, but eventually there’s going to be disagreement, and then what? How do I get my content put back if it gets marked as any of those things?

Who’s watching to make sure “safe” social networks don’t become completely devoid of free speech?

https://www.techdirt.com/articles/20161205/17450136200/this-is-really-bad-idea-facebook-twitter-youtube-microsoft-agree-to-block-terrorist-content.shtml
