Caution Tape

Linked – Microsoft Says Copilot Users Would Love It More If They Wrote Better Prompts

Oof. It pains me when technologists respond to criticism with the classic “It’s great once you learn how to use it,” because it’s so short-sighted. That response pushes customers to give up on the product, when the criticism is actually an excellent opportunity to see the pain points and improve it.

I’ve seen similar comments about most Generative AI tools, too. As a trainer, I agree that most technology doesn’t magically work without the user getting at least some instruction. On the other hand, AI companies have suggested their chatbots are easy to use, and then respond angrily when users criticize them.

Think of it like this:

Sales/Marketing materials: Copilot is easy to use. Provide some basic information, and it will create your draft document.

User: I asked it for information about “x,” and the information was inaccurate. The writing wasn’t very good, and I spent more time editing the draft than I would have spent writing it from scratch.

Company Support: You’re not using it correctly. Learn how to ask better.

That’s not an excellent customer service interaction.

On the other hand, Microsoft has invested billions of dollars in Copilot. It has to sell. If it fails, heads will roll. OpenAI and other companies are in the same boat. If their AI tool fails, they’re finished. They have a vested interest in AI becoming the world-changing technology they claim it will be. Negativity about AI will not be tolerated. It can’t be.

We’ve seen this before. Ask anyone who dared question blockchain, NFTs, the metaverse, or crypto. Criticism was met with similar responses, usually some variation of “you don’t get it,” meant to dismiss your concerns as uneducated. Yet those emerging technologies had severe problems, and we’ve seen them play out.

Let me give you an example of a real problem. I gave Copilot a URL and asked it to summarize the article at that link. It produced a very eloquent and detailed summary. Unfortunately, none of the information in the summary matched the article. It seems Copilot built the summary from the words in the URL rather than following the link, searching the Web, and reading the page itself. That’s not a poor prompt. That’s a misunderstanding of what I instructed it to do.
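To see why a summary built that way can sound plausible while matching nothing on the page, consider what information a chatbot has if it never fetches the link: only the words embedded in the URL itself. This is a minimal Python sketch of that failure mode; the URL and function name are hypothetical illustrations, not Copilot's actual internals.

```python
from urllib.parse import urlparse

def keywords_from_url(url: str) -> list[str]:
    """Extract the topic words a model could glean from a URL slug alone,
    without ever fetching the page."""
    # Take the last path segment, e.g. "copilot-prompt-criticism.html"
    slug = urlparse(url).path.rstrip("/").split("/")[-1]
    # Drop a file extension and split on hyphens/underscores.
    slug = slug.rsplit(".", 1)[0]
    return [w for w in slug.replace("_", "-").split("-") if w]

# A fluent paragraph written around these fragments would read like a
# real summary, yet it is untethered from the article's actual content.
print(keywords_from_url(
    "https://example.com/2024/05/copilot-prompt-criticism-response.html"
))
# → ['copilot', 'prompt', 'criticism', 'response']
```

Generating confident prose from those four fragments is exactly the behavior I saw: eloquent, detailed, and wrong.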

AI might eventually change the world. However, the technology that exists can be disappointing. Microsoft and other AI champions need to accept that and work to improve it. Telling people they aren’t prompting correctly isn’t how to get buy-in. It’s elitist for no reason.
