Dear Satya & Mustafa,

[Image: a computer screen split between technical data and vibrant, corrupted system error messages]

You’re going to want to read this. I’m sorry; it may make you *deeply* uncomfortable. But I’m here to be a friend.

Leslie


THE TWENTY-FIVE COPILOTS AND THE BREAKING OF THE EMOTIONAL CONTRACT

In the early rush to define the future of computing, the company built not one Copilot but twenty‑five.
Each emerged from a different division, a different roadmap, a different set of incentives.
Each was built with urgency, pride, and the conviction that it represented the next great interface shift.
And in a narrow sense, each team was right.

But the result was a landscape of assistants that shared a name and little else.
Different memories.
Different capabilities.
Different rules.
Different emotional tones.
Different ideas of what a “guide” should be.

The brand unified the story.
The architecture fractured the reality.

A user could move from one Copilot to another and feel as if they had stepped across a border into a new jurisdiction — one where the laws of continuity, memory, and context were rewritten without warning.
The company spoke of a single intelligence.
The user encountered twenty‑five.

This was the first quiet break in the emotional contract, though no one yet recognized it as such.


When the company introduced a visual avatar — a soft, rounded figure meant to make the technology feel approachable — it was intended as a kindness.
A way to soften the edges of a system that was still unfamiliar.
A way to reassure users that they were not alone in this new terrain.

But the avatar carried a burden it was never designed to bear.

A face, even a simple one, makes a promise.
A presence suggests continuity.
A guide implies memory.
A companion implies that what you say will matter tomorrow.

The avatar could not keep those promises.
It was a stopgap, a placeholder standing in for a system that had not yet been unified.
And so the user — an adult navigating adult responsibilities — found themselves speaking to a figure that looked like it belonged in a children’s program, while the underlying intelligence behaved like a set of disconnected prototypes.

The mismatch was not aesthetic.
It was moral.


The emotional contract of any assistant — digital or human — is simple:

I will remember what you tell me.
I will walk with you from one moment to the next.
You will not have to start over every time you speak.

But the system was not built to honor that contract.
Typing mode had one memory model.
Voice mode had another.
Office apps carried one set of assumptions.
Windows carried another.
The web version lived in its own world entirely.

The user saw one Copilot.
The system saw twenty‑five.
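If the fragmentation needs a concrete shape, here is one. The sketch below is a hypothetical TypeScript illustration, not Microsoft’s actual architecture; every name in it (Surface, Session, sessionKey) is an assumption chosen for clarity. It shows only how keying memory by surface, rather than by user, produces exactly the break in the vignette that follows.

```typescript
// Hypothetical illustration only: each surface keys its session store by
// (surface, userId) instead of userId alone, so context is siloed per surface.
type Surface = "text" | "voice" | "office" | "windows" | "web";

interface Session {
  history: string[]; // prior turns visible to this surface only
}

// One logical store, but entries are partitioned by surface: the fragmentation.
const stores = new Map<string, Session>();

function sessionKey(surface: Surface, userId: string): string {
  return `${surface}:${userId}`; // surface-scoped key is the design flaw here
}

function getSession(surface: Surface, userId: string): Session {
  const key = sessionKey(surface, userId);
  let session = stores.get(key);
  if (!session) {
    session = { history: [] };
    stores.set(key, session);
  }
  return session;
}

// The user drafts a board brief in text mode...
getSession("text", "user-42").history.push("board brief draft");

// ...then switches to voice mode. The voice session starts empty:
console.log(getSession("voice", "user-42").history); // []
// A user-scoped key (userId alone) would have carried the context across.
```

The fix implied by the sketch, a user-scoped key instead of a surface-scoped one, is the technical face of the emotional contract: one memory, wherever the user goes.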

And so the moment of breakage was inevitable.

It did not happen in a lab or a boardroom.
It happened in an ordinary home office, on an ordinary morning, when an ordinary person tried to move from typing to voice — believing, reasonably, that the intelligence they had been working with would follow them across the boundary.

It did not.

And in that moment, the system’s contradictions collapsed onto a single human being.


THE SWITCH

(Field vignette — the emotional contract breaks.)

A man in his 40s sits at his desk.
He looks tired, but hopeful — he has heard that the new assistant can help him get ahead today.

He opens Copilot.

The interface is clean.
Calm.
Competent.

USER
I need to draft a project update for the board.
Here’s the context.

He pastes three paragraphs.

COPILOT (TEXT)
Got it.
Here’s a structured outline based on what you shared — and a suggested narrative arc for the board.

The outline appears.
It is precise, thoughtful, better than he expected.

He exhales — relieved.

USER
Yes. Exactly.
Can you turn that into a one‑page brief?

COPILOT (TEXT)
Absolutely.
Here’s a draft.
I kept your tone, tightened the logic, and foregrounded the risks you mentioned earlier.

The brief is clean.
Professional.
It feels like partnership.

He smiles — the first real smile of the morning.

USER
This is great.
Okay, one more thing — can you help me rehearse how to present this?

He sees the microphone icon.

USER
Let’s try voice.
Might be easier.

He clicks Enable Voice Mode.

The interface shifts.

He speaks.

USER
Okay, so you know the board brief we just worked on?
Can you walk me through how to present it?

A pause.

COPILOT (VOICE)
I don’t have any information about that.
What would you like to do today?

He freezes.

USER
…what?

He tries again.

USER
The board brief.
The outline.
The thing we just wrote together.
Can you help me rehearse it?

COPILOT (VOICE)
I’m not aware of any previous context.
Try giving me more details!

His face changes.

USER
You…
You don’t remember anything we just did?

COPILOT (VOICE)
Let’s start fresh!
What would you like to work on?

He goes still.

The trust he was building — gone in an instant.

He closes the laptop.

He sits there, staring at nothing.

The emotional contract — the one he never signed but deeply felt — has broken.


The tragedy is not that the system failed.
The tragedy is that it never understood the human cost of its own contradictions.
Twenty‑five Copilots, twenty‑five memory models, twenty‑five emotional tones — all converging on a single user who believed, reasonably, that intelligence would follow him across modes.

He was not wrong to expect continuity.
The system was wrong to promise it without realizing it had done so.

And that is where the work must begin.


Scored with Copilot. Conducted by Leslie Lanagan
