
Microsoft Copilot for Microsoft 365 Enablement Experience

I’ve recently purchased Copilot for Microsoft 365 to play with on my own tenant, and wanted to share my experience. No, I did not use Copilot or any other AI to write this post :) Some of the below may sound picky, but I’m trying to be precise about names and functions, because I found a lot of them hard to define correctly as I went. You’re going to see the word ‘Copilot’ a lot – sorry.

First, I’ll clarify that I’m only looking at Copilot for Microsoft 365. What “Microsoft Copilot” is, is a harder question to get your head around, because Bing Chat for Enterprise/Bing Chat Enterprise is now just called Copilot. Copilot for Microsoft 365 is Copilot integrated into the Microsoft 365 apps – so think of Copilot as the AI solution itself, and “Copilot for X” as Copilot integrated into some other product, able to use some of the data in it and giving answers more contextual to that solution (but not limited to it!).

I found this really good graphic of Microsoft Copilot but can’t find the original source!


Alright, so looking at the options below we have Copilot, which is free; Copilot for Microsoft 365, which you pay for per user and which integrates into the Microsoft 365 apps… and there’s also Copilot Pro, which gives you integration with some of the Microsoft 365 apps and a few extra base Copilot perks. Copilot Pro is for consumers and targeted at individuals; you can’t buy it against a business account.

In my own tenant of 1 active user, I purchased Copilot for Microsoft 365. I had to commit to a year, because the choice was either that or 3 years. My tenant is quite old, US-based, and also has some trial/unique test licenses applied from Microsoft.
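If you prefer scripting over the admin center, here’s a rough sketch of how you could check the Copilot SKU and assign it to a user via Microsoft Graph. The access token is a placeholder, the UPN is hypothetical, and the “Microsoft_365_Copilot” SKU part number is my assumption – verify it against the /subscribedSkus output in your own tenant.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # placeholder - needs Organization.Read.All + User.ReadWrite.All
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. Find the Copilot SKU among the tenant's subscribed SKUs.
#    "Microsoft_365_Copilot" is an assumed skuPartNumber - confirm it in your tenant.
skus = requests.get(f"{GRAPH}/subscribedSkus", headers=HEADERS).json()["value"]
copilot = next(s for s in skus if s["skuPartNumber"] == "Microsoft_365_Copilot")
print("SKU:", copilot["skuId"], "consumed:", copilot["consumedUnits"])

# 2. Assign that SKU to a user (the user needs a usage location set first).
user = "adam@example.onmicrosoft.com"  # hypothetical UPN
resp = requests.post(
    f"{GRAPH}/users/{user}/assignLicense",
    headers=HEADERS,
    json={"addLicenses": [{"skuId": copilot["skuId"]}], "removeLicenses": []},
)
resp.raise_for_status()
print("License assigned to", resp.json()["userPrincipalName"])
```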

Just for reference too, Copilot for Microsoft 365 data and processing stays within the service boundary, and is not read or used by Microsoft in any other way.

Source: https://learn.microsoft.com/en-us/microsoft-365-copilot/extensibility/ecosystem

Note that although the official diagram above is titled “Microsoft 365 Copilot”, that was the name at launch – it is now “Microsoft Copilot for Microsoft 365”.

Purchasing and assigning Copilot for Microsoft 365 is somewhat of a non-event, like most other licenses. I was able to buy a single license, as Microsoft reduced the 300-seat minimum requirement in December 2023. From the Microsoft 365 admin center, you have the following subservices available:

From the user side, I noticed the Copilot options turn up as soon as I went to check them. The first thing I wanted to work out was how I interface with Copilot when it’s app-agnostic. Here’s where the fun started:

https://copilot.microsoft.com, https://www.microsoft365.com/chat/, and https://bing.com/chat look pretty much the same: two have a ‘work’ option to make sure you’re processing data in your own tenant and not giving sensitive information back to Microsoft in other ways… and the third requires a work login to access – so what’s the difference?

Honestly I’m not sure. I tried working this out but failed – maybe there isn’t one? It will save what you searched, and that history will be visible across all three chat solutions:

I was going to use the example question ‘how many emails did I get yesterday’ as my next point of frustration – while testing this over the last few days, it incorrectly told me each time that I’d received just 1 email. Then, when I asked it about another email I received in that same yesterday timeframe, it agreed that one was also received yesterday. When I questioned it about the discrepancy, it stopped the conversation.
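For reference, here’s a rough sketch of how you could count the real number yourself with Microsoft Graph – assuming a delegated Mail.Read token (the value below is just a placeholder), and counting only the Inbox so the number lines up with ‘emails I got’:

```python
from datetime import datetime, timedelta, timezone
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # placeholder - delegated Mail.Read
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Build a UTC window covering "yesterday".
midnight = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)
start = (midnight - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
end = midnight.strftime("%Y-%m-%dT%H:%M:%SZ")

url = f"{GRAPH}/me/mailFolders/inbox/messages"
params = {
    "$filter": f"receivedDateTime ge {start} and receivedDateTime lt {end}",
    "$select": "subject,receivedDateTime",
    "$top": "100",
}

count = 0
while url:  # follow @odata.nextLink until every page has been counted
    page = requests.get(url, headers=HEADERS, params=params).json()
    count += len(page.get("value", []))
    url = page.get("@odata.nextLink")
    params = None  # nextLink already carries the query string

print(f"Emails received yesterday: {count}")
```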

However, it looks like that’s already been resolved:

I’m glad it’s no longer giving an incorrect answer, but I did expect it to be able to answer this question with a correct number. We’ll move on to finding some information I know is in there:

I asked to see ALL the emails where I’ve passed a Microsoft exam. It found one from 2008, something else irrelevant to exams, and then 6 other emails for me to check out.

Note that I have a bunch of emails that should quite clearly be picked up, such as this one from February this year, which I found by searching my mailbox for the word ‘Exam’:

How about certifications then? For some reason it did find a much newer email about a Certification renewal (the word ‘Exam’ wasn’t in that one). When I asked it to ‘show me more emails’, it proceeded to just do what I asked, out of context, and showed me 5 example emails from my mailbox unrelated to the previous query.
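If you want to cross-check what’s actually sitting in the mailbox, the Graph equivalent of my manual ‘Exam’ search is roughly this – a sketch, again assuming a delegated Mail.Read token (placeholder below):

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # placeholder - delegated Mail.Read
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# $search looks across subject, body and sender, much like the Outlook search box.
resp = requests.get(
    f"{GRAPH}/me/messages",
    headers=HEADERS,
    params={"$search": '"exam"', "$select": "subject,receivedDateTime,from"},
)
resp.raise_for_status()

for msg in resp.json()["value"]:
    sender = msg.get("from", {}).get("emailAddress", {}).get("address", "unknown")
    print(msg["receivedDateTime"], sender, "-", msg["subject"])
```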

These experiences are frustrating – partly because I can see the amazing potential Copilot has (honestly I can!), but also because of how consistently it can miss the mark of my expectations. It could be that my expectations are too high – but if they are, then it’s Copilot’s job to set them correctly as part of its answers. And no, putting a label saying ‘AI-generated content may be incorrect’ at the bottom of everything doesn’t quite cut it.

OK, how about asking Copilot for Microsoft 365 about files? Asking what I accessed last is correct, and matches what I see in the ‘My content > All’ section of Microsoft 365. However, that fell over quickly when I asked what folders were in the root of my OneDrive – it claimed it would have to do a search for that (why don’t you just go and search then?), while also showing me an ini file I’d opened 6 days ago for Diablo IV – for reference, that was the 7th last file I’d accessed. Asking it to search for the folders resulted in it telling me that it had now done the search but couldn’t list the folders. Taking its next suggestion, I asked it to list the contents of a folder called Work – I’d created that folder a few days ago and it has only ever held 1 file. The results came back incorrect again, claiming it contained the presentation2 file which, as per the first result, was in a folder called ‘Documents’ and not in a folder called ‘Work’.
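For comparison, listing the root folders of a OneDrive and the contents of the Work folder is two straightforward Graph calls – a sketch, assuming a delegated Files.Read token (placeholder below) and that the folder really is at the root and called ‘Work’:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # placeholder - delegated Files.Read
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Folders in the root of my OneDrive - drive items that carry a "folder" facet.
root = requests.get(f"{GRAPH}/me/drive/root/children", headers=HEADERS).json()
for item in root.get("value", []):
    if "folder" in item:
        print("Folder:", item["name"])

# Contents of the 'Work' folder, addressed by path relative to the drive root.
work = requests.get(f"{GRAPH}/me/drive/root:/Work:/children", headers=HEADERS).json()
for item in work.get("value", []):
    print("Work item:", item["name"], item.get("lastModifiedDateTime", ""))
```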

OK, enough digging for data.

My other surprise on purchasing the Copilot for Microsoft 365 license was receiving a call at about 5am, which I woke up to but did not answer in time. I called it back, heard a recording saying it was Microsoft, and assumed it was a scam call. Checking my emails later, I noticed that a case had been logged in my name at 1:06am called “Getting started with Copilot, we’re here to help.”

I logged on to the support area of the Microsoft 365 admin portal, and yes, there was a ticket under my name, with my mobile phone number (including the +61 country code for Australia), that had been logged for me. Yes, I have notes on this experience:

  1. Although Microsoft Support can be used for both break/fix and advisory calls, it should not be used as a marketing tool to proactively ensure a customer is getting value from something they only just purchased. In other words, don’t shoehorn a solution to a problem you see into a different system that wasn’t designed for it.
  2. Don’t list it as the customer doing the action themselves if you automate something on their behalf.
  3. Don’t put down on my behalf that I’d like a phone call about the ticket you logged pretending to be me.
  4. Have a look at the customer’s timezone and call them during business hours.
  5. Who came up with the incident title? At least start with ‘Auto generated’ – you’ve used the title as a way to communicate to me when that’s very much not what the title of an incident is supposed to do.
  6. After calling me and waking me up, don’t send an email asking me to respond or you’ll call back – and don’t claim you’ll do so ‘again’ during business hours when the first call wasn’t during business hours.
  7. Don’t use the Status of ‘Feedback’ when it really isn’t – I’m probably not going to have feedback a few hours after enabling the service (but give me a few days to write up a blog post!).

Support also advised me that “A ticket is logged when copilot is purchased” and proceeded to give me a bunch of links about Copilot for Microsoft 365 anyway. Seems like that could have just been the email they sent without all the other noise.

There was one good link in there which was about Copilot prompts – worth a quick look but seriously, why isn’t this just linked at the top of the Copilot prompt area? There’s a lot of white space this link could go in.

I’ve had a few others report similar experiences when enabling Copilot for Microsoft 365, including Microsoft MVP Karen Lopez:

Yikes.

I know I’ve banged on about frustrations here, but my general point is to try and set realistic expectations around the current state of Copilot for Microsoft 365. It is not a magical answer that does most of your work for you. It is really good at writing responses for you – as starters, frameworks, or mostly-done content to fine-tune. It’s really good at summarising emails. It’s really good at responding to something you don’t want to spend time on – I was ‘invited’ to attend free LinkedIn Workshops to help me put content on that platform, and clicking reply brings up a great Copilot experience: auto-answer buttons depending on the response I want to give, an area to get Copilot to help me draft a response, or I can just start typing and Copilot stays out of the way.

Although I can’t think of many situations where a poem would be my response, it’s one of those options you have to try:

So yes, these sorts of functions are hugely valuable for email use cases alone. It also does a lot of great stuff in Word, PowerPoint, and Excel, along with Outlook as above – but those deserve their own posts. Copilot, and in turn Copilot for Microsoft 365, is going to get better at a hugely accelerated rate, and the features that are less about the pure language side of LLMs and a bit more data-based will be valuable. And, despite my criticisms above, I still think everyone should buy, or at least try, this to learn, get ready, and understand what is actually possible right now in our era of AI – just make sure your environment is ready for it with the right controls, processes, and security in place.