
Microsoft Copilot for Microsoft 365 Enablement Experience

I’ve recently purchased Copilot for Microsoft 365 to play with on my own tenant, and wanted to share my experience. No, I did not use Copilot or any other AI to write this post :) Some of the below may sound picky, but I’m trying to be clear around names and functions as I found a lot of it hard to correctly define as I went. You’re going to see the word ‘Copilot’ a lot – sorry.

First, I’ll attempt to clarify that I’m only looking at Copilot for Microsoft 365. What “Microsoft Copilot” is, is a harder question to get your head around, because Bing Chat for Enterprise/Bing Chat Enterprise is now just called Copilot. Copilot for Microsoft 365 is Copilot integrated into the Microsoft 365 apps – so think of Copilot as the AI solution itself, and “Copilot for X” as Copilot integrated with that product, able to use some of the data in it, and giving answers more contextual to that solution (but not limited to it!).

I found this really good graphic of Microsoft Copilot but can’t find the original source!


Alright, so looking at the options below, we have Copilot, which is free; Copilot for Microsoft 365, which you pay for per user and which integrates into Microsoft 365 apps… and there’s also Copilot Pro, which gives you integration with some of the Microsoft 365 apps and a few extra base Copilot perks. Copilot Pro is for consumers and targeted at individuals; you can’t buy this against a business account.

In my own tenant of 1 active user, I purchased Copilot for Microsoft 365. I had to do this for a year because it was either that, or 3 years. My tenant is quite old, US based, and also has some trial/unique test licenses applied from Microsoft.

Just for reference too, Copilot for Microsoft 365 data and processing stays within the service boundary, and is not read or used by Microsoft in any other way.

Source: https://learn.microsoft.com/en-us/microsoft-365-copilot/extensibility/ecosystem

Note that although the official diagram above is titled “Microsoft 365 Copilot”, that was the name at launch and it is now “Microsoft Copilot for Microsoft 365”.

The purchasing and assigning of Copilot for Microsoft 365 is somewhat of a non-event like most other licenses. I was able to buy a single license, as Microsoft reduced the 300 minimum requirement in December 2023. From the Microsoft 365 admin center, you have the following subservices available:
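For admins doing this at scale, license assignment can also be scripted against the Microsoft Graph assignLicense action rather than clicking through the admin center. Here’s a minimal Python sketch of building that request body – note the SKU GUID below is a placeholder, not the real Copilot SKU; look yours up via GET /subscribedSkus in your own tenant:

```python
# Sketch: build the JSON body for Microsoft Graph's
# POST /users/{id}/assignLicense action.
# The SKU ID is a placeholder GUID - query GET /subscribedSkus
# in your tenant to find the real Copilot for Microsoft 365 SKU.
import json

COPILOT_SKU_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def build_assign_license_body(sku_id, disabled_plans=None):
    """Request body assigning one SKU, optionally disabling sub-plans."""
    return {
        "addLicenses": [
            {"skuId": sku_id, "disabledPlans": disabled_plans or []}
        ],
        "removeLicenses": [],
    }

body = build_assign_license_body(COPILOT_SKU_ID)
print(json.dumps(body, indent=2))
```

The body would then be POSTed to /users/{id}/assignLicense with an appropriately scoped access token.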

From the user side, I just noticed the Copilot options turn up as soon as I went to check them. The first thing I wanted to work out was how I interface with Copilot when it’s app agnostic. Here’s where the fun started:

https://copilot.microsoft.com, https://www.microsoft365.com/chat/, and https://bing.com/chat look pretty much the same; two have a ‘work’ option to make sure you’re processing data in your own tenant and not giving sensitive information back to Microsoft in other ways, and the third requires a work login to access – so what’s the difference?

Honestly I’m not sure. I tried working this out but failed – maybe there isn’t one? It will save what you searched, and that history will be visible across all three chat solutions:

I was going to use this example question of ‘how many emails did I get yesterday’ as my next point of frustration – while testing this over the last few days, it incorrectly told me each time that I’d received just 1 email. Then, when asked about another email received in that same yesterday timeframe, it agreed that one was also received yesterday. When questioned about the discrepancy, it stopped the conversation.

However, it looks like that’s already been resolved:

I am glad it’s now not giving an incorrect answer, but I did expect it to be able to actually answer this question with a correct number. We’ll move on to finding some information I know is in there:

I asked to see ALL the emails where I’ve passed a Microsoft exam. It found one from 2008, something else irrelevant to exams, and then 6 other emails for me to check out.

Note that I have a bunch of emails that should quite clearly be picked up, such as this one from February this year, which I found by doing a search for the word ‘Exam’ on my mailbox:

How about certifications then? For some reason it did find a much newer email about a Certification renewal (the word ‘Exam’ wasn’t on that one). When I asked to ‘show me more emails’ it proceeded to just do what I asked out of context, and showed me 5 example emails from my mailbox unrelated to the previous query.

These experiences are frustrating – partly because I can see the amazing potential Copilot has (honestly I can!), but also because of how it can consistently miss the mark of my expectations around it. It could be that my expectations are too high – but if they are, then it’s Copilot’s job to set them correctly as part of its answers. And no, putting a label saying ‘AI-generated content may be incorrect’ at the bottom of everything doesn’t quite cut it.

OK, how about asking Copilot for Microsoft 365 about files? Asking it what I accessed last is correct, and matches what I see in the ‘My content > All’ section of Microsoft 365. However, that fell over quickly when I asked it what folders were in the root of my OneDrive – it claimed it would have to do a search for that (why don’t you just go search then?), as well as showing me an ini file I’d opened 6 days ago for Diablo IV – for reference, that was the 7th last file I’d accessed. Asking it to search for the folders resulted in it telling me that it had now done that search but couldn’t list the folders. Taking its next suggestion, I asked it to list the contents of a folder called Work – I’d created it a few days ago and it has only ever had 1 file in it. The results came back incorrect again, claiming the presentation2 file was in there, when as per the first result it was actually in a folder called ‘Documents’, not in ‘Work’.

OK, enough digging for data.

My other surprise on purchasing the Copilot for Microsoft 365 license was receiving a call at about 5am, which woke me up but which I didn’t answer in time. I called it back, heard a recording saying it was Microsoft, and assumed it was a scam call. Checking my emails later, I noticed that a case had been logged in my name at 1:06am called “Getting started with Copilot, we’re here to help.”

I logged onto the support area of the Microsoft 365 Admin portal, and yes there was a ticket under my name, with my mobile phone number (including the +61 area code for Australia) that had been logged for me. Yes, I have notes on this experience:

  1. Although Microsoft Support can be used for both break/fix and advisory calls, it should not be used as a marketing tool to proactively ensure a customer is getting value from something they only just purchased. In other words, don’t shoehorn a solution to a problem you see, into a different system not designed for that.
  2. Don’t list it as the customer doing the action themselves if you automate something on their behalf.
  3. Don’t put down on my behalf that I’d like a phone call about the ticket you logged pretending to be me.
  4. Have a look at the customer’s timezone and call them during business hours.
  5. Who came up with the incident title? At least start with ‘Auto generated’ – you’ve used the title as a way to communicate to me when that’s very much not what the title of an incident is supposed to do.
  6. After calling me and waking me up, don’t send an email asking me to respond, but if I don’t you’ll call back again, but claim to do so ‘again’ during business hours.
  7. Don’t use the Status of ‘Feedback’ when it really isn’t – I’m probably not going to have feedback a few hours after enabling the service (but give me a few days to write up a blog post!).

Support also advised me that “A ticket is logged when copilot is purchased” and proceeded to give me a bunch of links about Copilot for Microsoft 365 anyway. Seems like that could have just been the email they sent without all the other noise.

There was one good link in there which was about Copilot prompts – worth a quick look but seriously, why isn’t this just linked at the top of the Copilot prompt area? There’s a lot of white space this link could go in.

I’ve had a few others claim similar experiences when enabling Copilot for Microsoft 365 including Microsoft MVP Karen Lopez:

Yikes.

I know I’ve banged on about frustrations here, but my general point is to try and set realistic expectations around the current state of Copilot for Microsoft 365. It is not a magical answer to doing most of your work for you. It is really good at writing responses for you as starters, frameworks, or mostly-done content to fine-tune. It’s really good at summarising emails. It’s really good at responding to something you don’t want to spend time on – I was ‘invited’ to attend free LinkedIn Workshops to help me put content on that platform, and clicking reply brings up a great Copilot experience: auto-answer type buttons depending on the response I want to give, an area to get Copilot to help me draft a response, or I can just start typing and Copilot stays out of the way.

Although I can’t think of many situations that a poem would be my response, it’s one of those options you have to try:

So yes, these sorts of functions are hugely valuable just for these sorts of use cases in email. It also does a lot of great stuff in Word, PowerPoint, and Excel, along with Outlook as above – but those deserve their own posts. Copilot, and in turn Copilot for Microsoft 365, is going to get better at a hugely accelerated rate, and the items that are less focused on purely the language side of LLMs and a bit more data-based will be valuable. And, despite my criticisms above, I still think everyone should buy or at least try this to learn, get ready, and understand what is actually possible right now in our era of AI – just make sure your environment is ready for it with the right controls, processes, and security in place.

Cloud.Microsoft is coming (and already here a bit)!

Microsoft has been planning to migrate Microsoft 365 services to a new domain – cloud.microsoft – for over a year.

Back in April 2023, Microsoft announced the upcoming change with a starting sentence: “…today we’re excited to announce that Microsoft is beginning to reduce this fragmentation by bringing authenticated, user-facing Microsoft 365 apps and services onto a single, consistent and cohesive domain: cloud.microsoft.”

As pointed out to me by Microsoft MVP Karl Wester-Ebbinghaus, who in turn was reading this post from Dr Windows aka Martin Geuß, there is now an update on the Microsoft 365 Message Center called “Product transitions to the cloud.microsoft domain – February 2024”, Message ID MC724837 (published on March 5th, which is still almost February). It calls out that the new domains are starting to go live in parallel with existing domains – meaning you won’t get redirected to the new ones yet.

A list of services that are already running on a cloud.microsoft domain is documented here: https://learn.microsoft.com/microsoft-365/enterprise/cloud-microsoft-domain which at the time of writing looks like this:

List of live cloud.microsoft subdomains as of 12/03/2024

As Microsoft has exclusive rights to the .microsoft top-level domain, any content on it can be held to a pretty high standard. Make your own decisions around what you may allow from the single .microsoft domain, or the initial sub-domain of cloud.microsoft. You may need to add the domain/subdomain to allow lists.
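If you do maintain allow lists, note that a naive substring check isn’t enough – you want an exact match on cloud.microsoft or a proper dot-separated suffix match. A quick sketch (the hostnames here are made up for illustration):

```python
# Check whether a hostname belongs to the cloud.microsoft namespace,
# for proxy/firewall allow-list rules. Matching must be on a full
# dot-separated suffix, not a plain substring, to avoid lookalikes.
def is_cloud_microsoft(host: str) -> bool:
    host = host.lower().rstrip(".")  # normalise case and trailing dot
    return host == "cloud.microsoft" or host.endswith(".cloud.microsoft")

for host in ["admin.cloud.microsoft", "cloud.microsoft",
             "evil-cloud.microsoft", "learn.microsoft.com"]:
    print(host, is_cloud_microsoft(host))
```

The third example shows why the leading dot in the suffix matters – a substring match on “cloud.microsoft” would have wrongly allowed it.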

What the above changes also mean for me personally, is a lot of ongoing work on MSPortals.io to keep it up to date, as well as keep the old links on there while they still function:

I’ll do my best to keep MSPortals.io as updated as possible, but if you notice anything that needs an update, please contact me or use the GitHub option on the site to submit an update.

Other notes and takeaways from the message center post:

It appears the planned end dates of non-cloud.microsoft URLs for Microsoft 365 services are somewhere between June 2024 and September 2024.

Follow the guidance on Microsoft 365 URLs and IP address ranges and there should be no network administrative impact from these changes.

Update documentation and communicate the change to end users – this can be a good chance to train or rehash what domains are, which helps users understand phishing attempts (both web-based and email-based).

If you have any tools built that connect to Microsoft 365 services (3rd party or internally developed), make sure their owners are aware of the upcoming changes and have a plan to update.

Microsoft Learn GitHub and Feedback Updates

Microsoft is changing the way feedback will be provided for Microsoft Learn content.

Microsoft Learn is an impressive resource for IT staff interacting with Microsoft technologies. It was first launched as docs.microsoft.com, which came out all the way back in 2016. Before that, TechNet and MSDN were the sources of official Microsoft documentation, but they were incredibly lacking in both quality and quantity of information. It’s why most people relied on third-party websites to find out how to ‘really’ do something in the Microsoft space – which is why it was great to see Microsoft spend time and money on something that gave them no immediate return on investment.

Microsoft Learn was built on customised GitHub architecture, allowing huge transparency on when documentation gets updated and what changed, and giving customers a way to question and/or correct what they’re reading. It was also a pseudo feedback method where you could see what others may be providing constructive criticism about when looking at a product yourself – similar to what the Feedback Portal does for each product (which is still in beta, and replaced the decent third-party UserVoice service) – but when you’re looking at feedback on a particular documentation page, the feedback you’re seeing is particularly relevant, rather than having to search through an entire product’s history of feedback.

History lessons aside, Microsoft is now rolling out a change on how feedback works. It’s a bit of a mixed bag from what I can tell, so here’s the breakdown:

From the updated information on Provide feedback for Microsoft Learn content, there will be a few different options on what’s possible around providing feedback based on what page it is.

All pages will have the new feedback experience where you click the thumbs up Feedback button:

This will let you anonymously provide feedback: a single text box where you can write your thoughts and submit them into a black box:

I don’t like this because there’s no visibility, accountability, or any way I can actually engage with Microsoft. I can see why Microsoft wants this, but the old GitHub feedback method meant you could get a response, converse, clarify, etc. That is completely gone with this method, and personally I doubt I’d bother using it beyond a Yes/No response and maybe a one-line comment. It doesn’t provide the customer with any real incentive to bother.

There is some good news however. Some pages will be configured to take you to the relevant Product Feedback page, and some will take you to a Q&A page for the product or community site. If these were widely implemented, it would go a long way to fill the above feedback gap.

Also, you can still use the pencil icon to submit changes and view page history… “for any repository that already had this capability enabled.”

That implies any new repository (likely for any new product that doesn’t have its own content on Microsoft Learn yet) will not have this capability. Except I can already see a repository that doesn’t have this capability – Purview-related content. Check out any Purview page on Microsoft Learn, such as Learn about data loss prevention | Microsoft Learn, and you’ll notice there is no edit pencil, and the feedback at the bottom of the page only has the new experience:

Compare that to other pages, such as Publish on-premises apps with Microsoft Entra application proxy – Microsoft Entra ID | Microsoft Learn, where the callout of the deprecation of GitHub Issues is shown.

It is also worth noting that open source products will have a more open feedback experience using GitHub. A list of products that support this is available here, and it appears to be the same as the way we’ve been using feedback across the entire Microsoft Learn platform for a while.

Overall, I’d guess that the existing solution creates a lot of noise for Microsoft to manage based on the amount of feedback they get, and this is a way to stop it. If we see improvements in the other two-way feedback mechanisms, including Microsoft staff engaging more on these platforms, I can see it working well enough. Let’s hope that happens!

AI Powered Microsoft Q&A vs Bing Chat vs Bing Chat for Enterprise (Copilot)

Update 20th November 2023
Bing Chat for Enterprise has been renamed to ‘Copilot with commercial data protection‘ – General Availability 1st December 2023.

Original Post
Q&A Assist is a new feature Microsoft have launched on the Q&A ‘Ask a question‘ page, where you would normally pose a question to post in the forums and have another human answer for you. Now, backed by the Azure OpenAI Service, you can get AI based answers using data that Microsoft curates.

This is a bit different to Bing Chat (or Bing Chat for Enterprise), which uses knowledge from all over the internet; as per any OpenAI setup, Q&A Assist should be tailored a bit more to the sort of questions it expects.

Q&A Assist at the time of posting is in ‘Public Preview’:

I thought it would be worth comparing the two to see how they fare, but it took me down a bit of a different path than I expected.

The Example

Q&A Assist gave a fairly reasonable broad response and expected you to dig more into it only via official learn.microsoft.com content.

Bing Chat however, took me down a bit of an interesting path. It gave a step by step:

But that didn’t scale or have the automation of the above answer, so I tried to clarify:

Not too bad, but not the same answer as Q&A Assist – both are valid depending on how you buy your Windows 11 Enterprise licenses though. What if I limit Bing Chat to only use learn.microsoft.com content?

Proof that AI doesn’t do everything for you – OK, I ask the same question, piecing all the bits together:

The same answer as before, but only from learn.microsoft.com? This gets stranger when I check reference 1, which is actually a Q&A page with the question “Which Windows 11 version allows multiple remote desktop sessions” and doesn’t have anything about VAMT at all. Reference 2, which strangely tells me to do what I’ve already done in this query, links to another Q&A page which is on topic, but has no content that would have been helpful for this answer. Something wacky is going on with those reference links, but I suspect it actually used the information from earlier in the same session and then limited the claims on where it could verify those answers to learn.microsoft.com only – which, if you only saw this single answer, wouldn’t be right.

Is Bing Chat for Enterprise Different?

I pumped the same final all-encompassing question in, and received probably the best answer out of everything – great sources, and almost exclusively limited to learn.microsoft.com (a YouTube link turned up, but that was from one of the Q&A pages).

Giving Bing Chat another chance, I started a new session and asked the same question again:

Different again, but you can see Bing Chat gives more ‘consumery’ answers while Bing Chat for Enterprise didn’t – I was surprised by this, but it does make contextual sense. The references also make sense this time, so this leans towards my theory of it using previous answer information from the same question thread – something to be aware of.

Coming back from that tangent, what does this all mean for Q&A Assist? It’s good that it helps define a question, asks in both summary and detail, requires a category, and limits answers only to trusted sources. You can see it’s designed to hopefully provide a quick answer before someone posts the forum question, or at least supplement their question with extra details on what they might be trying to ask.

More so, it’s a good example of what is fairly easy to achieve with Azure OpenAI pointed at a set of data – which could purely be a website. It takes a chatbot to the next level by not needing anyone to give it a set of questions and answers; it’ll work all that out itself. It’s also worth noting that even in the Microsoft ecosystem there are multiple AI chatbot solutions, such as Power Pages also being able to point a chatbot at a page to do Q&A-type work.
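To illustrate the idea (and this is a toy, not Microsoft’s actual implementation), the retrieval half of “point a chatbot at your data” boils down to scoring documents against a question and handing the best match to the model as grounding context. Real setups like Azure OpenAI on your data use embeddings and a search index; a keyword-overlap sketch looks like this:

```python
# Toy retrieval step: score documents by keyword overlap with the
# question and return the best one to use as grounding context.
# Production systems replace this with embeddings + a search index.
import re

def tokenize(text):
    """Lowercase and split into alphanumeric word tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_match(question, docs):
    """Return the document sharing the most tokens with the question."""
    q = tokenize(question)
    return max(docs, key=lambda d: len(q & tokenize(d)))

docs = [
    "Windows 11 Enterprise can be activated via subscription activation.",
    "Power Pages can point a chatbot at a website for Q&A style answers.",
]
print(best_match("How do I activate Windows 11 Enterprise?", docs))
```

The chosen document would then be prepended to the prompt so the model answers from your content rather than from general internet knowledge.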

The hard habit to break for many people will be years of using a search engine to look up an answer and working through it yourself – any AI-driven chat system should make it easier and more efficient to look up detailed questions and follow the sources to get your truth, but it’s something we’ll all need to get used to as it becomes more ingrained in everything we do online.

MSPortals.io Analytics

I thought it might be interesting to share some stats/trends around https://msportals.io which currently uses Google Analytics. Most sites have a commercial aspect and don’t like to share this data, but as this one is purely community-driven with no financial gain, let’s check out some stats:

Last 7 days from 31st May (Monday):

Last 28 days from 10th May:

Last 12 months:

All time – from October 2021 to June 2023.

Unsurprisingly, there is a constant peak/trough for weekdays and weekends. I’m not sure why it’s more evident over the ‘all time’ stats vs ‘last 12 months’, but ’28 days’ and ‘7 days’ show a good reflection of this. Those giant peaks on the ‘all time’ are from either a news article posting about the site, or someone having a very successful social media post bringing attention to msportals.io.

There is also a pretty steady user count of between 1,500 and 2,000 a day, excluding weekends.
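That weekday/weekend split is easy to reproduce from raw daily counts – here’s a quick sketch with made-up numbers (not real msportals.io data):

```python
# Split daily visit counts into weekday vs weekend buckets and
# compare averages - the peak/trough pattern in the graphs above.
# The visit numbers are fabricated for illustration.
from datetime import date, timedelta
from statistics import mean

visits = [1800, 1900, 1850, 1750, 1600, 400, 350,   # week 1, Mon-Sun
          1700, 1950, 1820, 1780, 1650, 420, 380]   # week 2, Mon-Sun
start = date(2023, 5, 29)  # a Monday

weekday, weekend = [], []
for offset, count in enumerate(visits):
    day = start + timedelta(days=offset)
    # date.weekday(): Monday=0 ... Sunday=6, so >= 5 means Sat/Sun
    (weekend if day.weekday() >= 5 else weekday).append(count)

print(f"weekday avg: {mean(weekday):.0f}, weekend avg: {mean(weekend):.0f}")
```

With these numbers the weekday average sits around 1,800 while weekends drop below 400 – the same shape as the real graphs.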

Where are users coming from? (last 90 days)

Another unsurprising statistic is that most users are coming from the US – UK is next, and probably more surprising is Australia being third – maybe because I have a wider audience and more connections here?

The most common US city only comes in at 7th place, while London is 1st – which I’m sure matches the expected stats due to population distribution.

Which pages are most hit? (last 90 days)

Still more unsurprising stats: the main page, which contains the standard Microsoft admin portals, accounts for the most hits. Next up is the Government portals page, which is US Gov only – so there is obviously fairly high usage of those, at double the stats of the user page, which I did think would be a bit more widespread – but I expect the waffle from office.com serves most users quite well.

How do users get to msportals.io? (last 90 days)

Most have the site bookmarked, or are typing the URL directly into their browser. The next most common is via search engine – testing via private browsing mode, searching for ‘Microsoft Portals’ brings up msportals.io as the first result on both Bing and Google, but I can’t see any stats on which search terms refer people to my site the most.

Average Engagement Time (last 90 days)

If someone visits the main msportals.io site, the average engagement time is 36 seconds (based on the last 90 days). Most sites will want higher engagement times, but the point of this site is to get people to where they want to get to as quickly as possible, so I’m pretty happy with 36 seconds as an average. Other pages have similar times, although I have no idea how language conversion is happening, or why what I assume is the French-language ‘Portails administrateur | Portails Microsoft’ has more than 2 minutes of engagement time despite France not being in the top 7 countries (I’ll blame Canada – sorry).

Tech – Device, Platform (last 90 days)

These stats I find quite interesting. No surprise that Windows is by far the main OS used to access msportals.io, with similar numbers of Mac vs iOS users, and slightly behind that, Android. There’s 90% desktop users vs 10% mobile users – rounding to the nearest whole number and ignoring the 0.3% of tablet users.

Edge and Chrome have very similar browser stats (compared to the stats for the site’s entire life, where Chrome has been used slightly over 2x as much as Edge – which shows Edge’s usage drastically increasing, at least for my site’s user base), and a fair way behind are similar usage stats for Safari vs Firefox (again comparing since the site launched, that’s been similar the whole way along, with a tiny bit more Safari).

For screen resolutions, I am happy to see the standard 1920 x 1080 far ahead. Quad HD is second, with a bit of ultrawide in 5th on the list. Again, historically 1920 x 1080 has always been far ahead, but 1366 x 768 used to make up second place with half the hits of 1920 x 1080 – yet in the last 90 days it’s not even in the top 7, so there must have been a lot of monitor or laptop upgrades recently :)

I hope those stats gave you some insight into both what msportals.io sees, and how easily any site can learn about its visitors – this is using Google Analytics, without any costs involved.