Updated Microsoft’s Connected Experiences option in its productivity suite has caused consternation amid accusations that the default setting could allow Microsoft to train AI models on customers’ Word and Excel documents and other data.
The Windows giant categorically denies these claims. A spokesperson told The Register: “In Microsoft 365 consumer and commercial apps, Microsoft does not use customer data to train large language models without your permission.”
We asked Microsoft what “permission” means here, and whether that permission is opt-in or opt-out; the IT titan has yet to respond.
Connected experiences have long been a part of Microsoft Office. Want to translate some text? You’re probably using connected experiences. Transcribe a recording? Connected experiences again. Run a grammar check in Word? Connected Experiences will analyze your content.
The spokesperson said: “The Connected Services setting is an industry standard setting that enables features that require an internet connection. Connected experiences play an important role in improving productivity by integrating your content with resources available on the web. These features enable applications to provide more intelligent and personalized services.”
In recent weeks, users have dug deeper into what Microsoft is doing with all this data, and some fear it will be used to train the mega-corp’s internal AI systems, which Microsoft says is not the case.
The suggestion made the rounds on social media this weekend.
An examination of a consumer Windows 11 machine running Microsoft 365 version 2410 showed that the Connected Experiences setting was enabled by default. But did that mean the customer’s content was being used to train AI? It’s unlikely, but not outside the realm of possibility.
However, it is extremely unlikely that content produced by Microsoft 365 Education and Enterprise users would be collected in this way. After all, administrators have policies at their disposal to lock down the Connected Experiences option if it is a concern.
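For admins who want to verify that such a policy has actually landed on a given machine, one place to look is the registry. Below is a minimal sketch, assuming the policy values Microsoft documents for Office privacy controls (REG_DWORD entries such as disconnectedstate under HKCU\Software\Policies\Microsoft\Office\16.0\Common\Privacy, where 2 means disabled); the key path and value names come from that documentation rather than from this story, so treat them as assumptions to check against your own tenant:

```python
# Sketch: report whether any documented Office privacy policies are applied.
# Assumes Microsoft's documented policy registry location and value names;
# a REG_DWORD of 1 means the policy enables the feature, 2 means it disables it.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Office\16.0\Common\Privacy"

# Assumed policy value names, per Microsoft's privacy-controls documentation.
POLICY_VALUES = {
    "disconnectedstate": "Use of connected experiences in Office",
    "usercontentdisabled": "Connected experiences that analyze content",
    "downloadcontentdisabled": "Connected experiences that download online content",
    "controllerconnectedservicesenabled": "Optional connected experiences",
}

def read_policy(value_name: str):
    """Return the REG_DWORD for a policy value, or None if no policy is set."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, POLICY_KEY) as key:
            data, _reg_type = winreg.QueryValueEx(key, value_name)
            return data
    except FileNotFoundError:
        return None  # key or value absent: no policy applied

if __name__ == "__main__":
    for name, description in POLICY_VALUES.items():
        value = read_policy(name)
        if value is None:
            print(f"{description}: no policy set (user/default setting applies)")
        else:
            state = {1: "enabled", 2: "disabled"}.get(value, f"unknown ({value})")
            print(f"{description}: policy says {state}")
```

Run on an unmanaged consumer machine, it should simply report “no policy set” across the board, which squares with the default-on behavior observed above.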
The difficulty people face is that, despite Microsoft’s protests, its privacy statement (as of November 2024) allows it to do all sorts of things with the data it collects. And how does it use that data? “As part of our efforts to improve and develop our products, we may use your data to develop and train our AI models.”
In August, Microsoft said it would use consumer data from Copilot, Bing, and Microsoft Start to train Copilot’s generative AI models. At the time, the company said it would allow customers to opt out and would begin displaying the opt-out control in October. It also said it would not train on data from consumers in the European Economic Area.
Could the same apply to documents created by users of the company’s flagship productivity suite? There’s a big leap from training generative AI on what’s happening in Copilot to using Word and Excel documents under the guise of connected experiences. These are two very different services.
So, on the one hand, Microsoft is clear: it doesn’t use customer data to train models. On the other hand, “we may use your data to develop and train our AI models.”
As for what it means by “data” in its privacy statement, the company writes: “You provide some of this data directly, and we obtain some of it by collecting data about your interactions, usage, and experiences with our products.”
The fact that concerns are being raised at all indicates that some users are worried about Microsoft’s obsession with AI. The Windows maker must therefore maintain clarity and transparency about what will and will not be absorbed into its models. ®
Updated to add at 4:45 p.m. UTC, November 27
In response to our question about what Microsoft meant by “permission,” whether it was an opt-in or opt-out option, and when a customer would give that permission, the company thought about it for over a day before responding: “There are circumstances where client companies may wish or consent to us using their data for base model training, such as the development of custom models requested by the client.”
So now you know.