Linked – Microsoft Says Copilot Users Would Love It More If They Wrote Better Prompts

AI might eventually change the world, but the technology that exists today can be disappointing. Microsoft and other AI champions need to accept that and work to improve it. Telling people they aren't prompting correctly isn't how to get buy-in; it's elitist for no reason.

Linked – No One Wants To Talk About These 3 Ways AI Copilots Will Reshape Learning

I’ve had an opportunity lately to play around with some AI tools, including Microsoft Copilot for 365, and I have to admit, when I have a question about how to do something, I ask Copilot. I ask Copilot because:

It’s right there while I’m working.

I don’t have to bring in another tool or trainer.

I don’t have to take a class or watch a YouTube video to learn a new skill. (Imagine a Copilot prompt like: “How would I do a VLOOKUP with this data using ID as the unique identifier?” See the example below.)
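To give a sense of what that looks like, a prompt like that one might come back with a ready-to-paste formula, something along the lines of =VLOOKUP(A2, Data!A:C, 3, FALSE), plus a short explanation of what each argument does. The sheet and cell references here are just placeholders; the point is that the answer arrives in the context of the spreadsheet I'm already working in.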

Linked – In major gaffe, hacked Microsoft test account was assigned admin privileges

As the article below points out, I bet this wasn’t a technical issue. It’s not a bug. It’s a poor configuration choice, likely made worse by a poor change management process. Somewhere along the way, you’d think someone would have written down that this account existed, and that someone else would have seen that record and acted on it. That didn’t happen. You’d also like to think there would be a hard rule requiring MFA in every environment, including test ones.