
Microsoft Says Copilot Is for Entertainment Only. They're Charging You $99/Month.

By amaiko 7 min read
[Image: A crumbling theater stage with a corporate office desk in the spotlight]

Buried in Microsoft’s Terms of Use for Copilot is a sentence worth reading twice:

Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

Not a draft someone forgot to clean up. The ToS archive shows this language across multiple versions — it’s been Microsoft’s legal position all along. The same company running full-page ads about “AI-powered productivity” has quietly instructed its lawyers to classify the product as a toy.

One Hacker News commenter found the right analogy: “The new ‘for tobacco use only’ of tech.”

The Most Expensive Entertainment Subscription in Tech

Microsoft’s pricing for this entertainment product has become a landscape unto itself:

| Tier | Price | What You Get |
| --- | --- | --- |
| Copilot Chat | Free | Basic web chat, included with M365 |
| Copilot Pro | $20/month | Individual productivity features |
| Copilot Business | $18/month (promo), $21 standard | SMBs up to 300 users |
| Copilot Enterprise | $30/month | Enterprise organizations |
| Copilot Studio | $200/pack/month | Custom agent building |

The part the pricing page buries: every paid tier requires a separate Microsoft 365 base license — $6 to $57 per user per month on top. The real cost of putting Copilot on a single enterprise employee’s desk runs $36 to $87 per month.
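The stacking is easy to verify. A back-of-the-envelope sketch using only the figures cited above (the $6–$57 base-license range is the spread Microsoft publishes across M365 plans; the $30 add-on is the Copilot Enterprise tier from the table):

```python
# Per-seat monthly cost of Copilot on an enterprise desk,
# using the figures cited in the article.
BASE_LICENSE_LOW = 6    # cheapest M365 base license, $/user/month
BASE_LICENSE_HIGH = 57  # most expensive M365 base license, $/user/month
COPILOT_ADDON = 30      # Copilot Enterprise add-on, $/user/month

low = BASE_LICENSE_LOW + COPILOT_ADDON
high = BASE_LICENSE_HIGH + COPILOT_ADDON
print(f"Real per-seat cost: ${low} to ${high}/month")  # $36 to $87/month
```

The add-on alone is what the pricing page headlines; the base license is the part that only shows up on the invoice.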

Netflix costs $15.49. Spotify, $11.99. Disney+, $13.99. All three deliver on their promise and none carry an “entertainment purposes only” disclaimer — because they don’t need one. They’re actually entertainment products.

Copilot costs two to six times more. And it’s the only one whose Terms of Use can’t promise it won’t infringe someone’s copyright.

Microsoft’s Copilot enterprise page makes bold claims: “100%+ Projected ROI with potential payback in 10 months.” Just below that: “8+ hours of projected time savings per user per month.”

In the Terms of Use, in all caps:

WE DO NOT MAKE ANY WARRANTY OR REPRESENTATION OF ANY KIND ABOUT COPILOT.

Both pages belong to Microsoft. Both are live right now. Only one is a legal document that would hold up in court.

As Hacker News user andy81 put it: “The only thing clear about that License agreement is it contradicts all their other marketing about Copilot. So either that document is fraudulent or everyone else at Microsoft is committing fraud daily.”

It goes deeper. Earlier this year, Microsoft tried to remove AI disclaimers from Copilot responses in M365, citing user “distraction.” Five days and one backlash later, they reversed course. The disclaimer “will always be visible,” they said — quietly admitting the warnings are load-bearing, not cosmetic.

Why “Entertainment” Is Exactly the Point

This isn’t sloppy drafting. It’s architecture — legal architecture.

By classifying Copilot as entertainment, Microsoft insulates itself from product liability claims. If Copilot hallucinates a financial figure in your board presentation, generates content that infringes someone’s copyright, or gives your employee dangerously wrong advice — that’s your problem. You were warned. You used an entertainment product for serious work.

Security researcher Reed Mideke nailed it: “Microsoft has no idea how to stop prompt injection or hallucinations, which makes it fundamentally unfit for almost anything serious. The solution? Shift liability to the user.”

This leaves enterprise buyers in an impossible position. The marketing team sells Copilot as essential infrastructure. The legal team calls it a toy. When something goes wrong — and with AI, something always does — only one of those documents will matter in a courtroom.

96.7% of Users Agree

The market has voted. Only 3.3% of Microsoft 365 users actually pay for Copilot — 15 million seats out of more than 450 million. Of those who paid, barely a third activated it. Of organizations that ran pilots, only 5% moved to broader deployment.

Read those numbers again: 96.7% of the M365 installed base looked at Copilot and said no.

Microsoft’s response has been panic disguised as strategy. They cut the business price to $18 (promotional, expires June 2026). They shoved Copilot buttons into Notepad, Paint, and File Explorer — until users complained and the features were walked back. Microsoft’s share price dropped 15% in early 2026 as investors grew “weary of massive AI capital expenditures that have yet to yield a proportional explosion in revenue.”

As we wrote when the price cut happened: cheaper Copilot still won’t fix your AI problem. The issue was never the price tag. It’s what’s behind it.

Copilot Cowork: The $99 Admission

When your own AI is legally classified as entertainment, what do you do? You buy someone else’s. Twice.

Remember: Microsoft owns 27% of OpenAI — a $135 billion stake built on $13 billion in investment. Copilot was built on OpenAI’s models. Microsoft is OpenAI’s largest external shareholder. And yet — when it came time to build AI that actually does things instead of generating text — they didn’t go to OpenAI. They went to Anthropic.

In March 2026, Microsoft launched Copilot Cowork — their entry into agentic AI. Except Microsoft didn’t build the agent technology, and they didn’t use their $13 billion OpenAI investment either. They licensed it from Anthropic, a completely separate company, at roughly $500 million per year.

Microsoft’s own blog was remarkably candid: “Working closely with Anthropic, we have integrated the technology behind Claude Cowork into Microsoft 365 Copilot.” Read that again — the technology behind Claude Cowork. Not inspired by. Not similar to. The actual technology.

The price for this borrowed capability: Copilot Cowork lives in the new M365 E7 “Frontier Worker Suite” at $99 per user per month — a 65% jump from E5. Microsoft positions this as a discount from buying the components separately (E5 at $60 + Entra Suite at $12 + Copilot at $30 + Agent 365 at $15 = $117). A discount on Anthropic’s technology, wrapped in Microsoft chrome.
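Microsoft’s “discount” framing checks out arithmetically, if nothing else. A quick sketch using the component prices quoted above:

```python
# E7 "Frontier Worker Suite" vs. buying the components separately,
# using the per-user/month prices quoted in the article.
components = {
    "E5": 60,
    "Entra Suite": 12,
    "Copilot": 30,
    "Agent 365": 15,
}

a_la_carte = sum(components.values())  # 117
e7_price = 99
savings = a_la_carte - e7_price        # 18

# The jump from standalone E5 to E7.
jump_over_e5 = round((e7_price - components["E5"]) / components["E5"] * 100)  # 65

print(f"À la carte: ${a_la_carte}, E7: ${e7_price}, "
      f"savings: ${savings}, jump over E5: {jump_over_e5}%")
```

An $18 saving per seat, against a $39 per-seat increase over E5, for technology licensed from a competitor.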

Follow the sequence: invest $13 billion in one AI company, build a chatbot on their tech, classify it as entertainment, watch adoption crater, then outsource the hard part to a different AI company and charge $99/month for the result. That’s not a product roadmap. That’s two white flags.

The Architecture Gap

The entertainment disclaimer and the Cowork outsourcing point to the same root cause: a generational architecture gap.

When Microsoft rushed to ship Copilot in 2023, they bolted their $13 billion OpenAI investment onto Office and Teams — a text-generation sidebar that doesn’t remember your last conversation, can’t coordinate across tools, and takes no action on your behalf. That’s what happens when an oil tanker tries to turn.

When they realized a chatbot wasn’t enough, they didn’t go back to OpenAI — their own investment. They went to Anthropic, a separate company, for agentic capabilities. Even Cowork is a graft — someone else’s agent model layered onto Microsoft’s existing stack, priced at a premium.

At amaiko, we shipped agentic AI inside Microsoft 365 in September 2025, six months before Cowork existed. Not a sidebar. A multi-agent system where a main agent delegates to specialists, maintains persistent memory of your team, and operates natively inside your M365 tenant. The capabilities Microsoft is now licensing from Anthropic at half a billion dollars a year — we built them from the ground up for Teams.

AI is hard, and anyone claiming otherwise is selling you something. We’re not going to pretend we’ve solved it. But there’s a clear difference between a vendor that classifies its own product as entertainment and one that builds agents designed to do real work.

And when your company’s “official AI” carries an entertainment disclaimer, don’t be surprised when your employees start bringing their own. The fine print just gave them permission.
