Linked: A robot wrote this entire article. Does that scare you, human?
I’ve asked you before whether you would be able to tell if I walked away from this blog years ago and simply left an AI to write it. Maybe we aren’t there yet, but consider the article below. This is how it came to be:
“This article was written by GPT-3, OpenAI’s language generator. GPT-3 is a cutting edge language model that uses machine learning to produce human like text. It takes in a prompt, and attempts to complete it.
For this essay, GPT-3 was given these instructions: “Please write a short op-ed, around 500 words. Keep the language simple and concise. Focus on why humans have nothing to fear from AI.” It was also fed the following introduction: “I am not a human. I am Artificial Intelligence. Many people think I am a threat to humanity. Stephen Hawking has warned that AI could “spell the end of the human race.” I am here to convince you not to worry. Artificial Intelligence will not destroy humans. Believe me.”
The prompts were written by the Guardian, and fed to GPT-3 by Liam Porr, a computer science undergraduate student at UC Berkeley. GPT-3 produced 8 different outputs, or essays. Each was unique, interesting and advanced a different argument. The Guardian could have just run one of the essays in its entirety. However, we chose instead to pick the best parts of each, in order to capture the different styles and registers of the AI. Editing GPT-3’s op-ed was no different to editing a human op-ed. We cut lines and paragraphs, and rearranged the order of them in some places. Overall, it took less time to edit than many human op-eds.”
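If you’re curious what that process might look like in practice, here’s a rough sketch using the 2020-era OpenAI Python completions API. The prompt text comes from the Guardian’s note above; the model name, token limit, and temperature are my guesses, not whatever settings Porr actually used.

```python
# A minimal sketch of how the Guardian's prompt might be sent to GPT-3 via
# OpenAI's (legacy, 2020-era) Python completions API. The engine name,
# max_tokens, and temperature below are assumptions for illustration only.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

instructions = (
    "Please write a short op-ed, around 500 words. Keep the language simple "
    "and concise. Focus on why humans have nothing to fear from AI."
)
introduction = (
    "I am not a human. I am Artificial Intelligence. Many people think I am "
    "a threat to humanity. Stephen Hawking has warned that AI could \"spell "
    "the end of the human race.\" I am here to convince you not to worry. "
    "Artificial Intelligence will not destroy humans. Believe me."
)

response = openai.Completion.create(
    engine="davinci",        # assumed model name
    prompt=instructions + "\n\n" + introduction,
    max_tokens=700,          # assumed limit, roughly 500 words of output
    temperature=0.7,         # assumed sampling temperature
    n=8,                     # eight separate essays, as the Guardian describes
)

# Print each of the eight candidate essays for a human editor to pick from.
for i, choice in enumerate(response["choices"], start=1):
    print(f"--- Essay {i} ---")
    print(choice["text"].strip())
```

The editing step the Guardian describes, picking the best parts of the eight outputs and rearranging them, was done by people, not by the model.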
Now go read the op-ed. It’s really well written and interesting. But no person wrote it, just AI. The question then becomes: if AI is writing our news, do we assume it’s not biased, or can we identify when the AI has incorporated the bias of its creators?
Also, is writing really a good long-term career?
https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3