There’s a moment in every technological shift when the abstraction finally becomes human, when the system stops feeling like a diagram and starts feeling like a room full of people making choices. For me, that moment arrived the day Caitlin Kalinowski resigned. I hadn’t known her name before that announcement. I wasn’t following her work or waiting for her to take a stand. But when she stepped forward and said, publicly and without theatrics, that she was leaving, something in me snapped into focus. It wasn’t about her personally; it was about what her departure revealed. Suddenly the thing I’d been trying to articulate for months had a face, a voice, a point of contact with reality. The adult had left the room.
I don’t mean “adult” in the emotional sense. I mean it in the systems sense — the person who understands the stakes, who sees the long view, who knows that powerful tools require stewardship, not spectacle. When someone like that walks away, it forces you to confront the possibility that the environment no longer supports responsible work. And that realization hit me harder than I expected. I wasn’t counting on her to fix anything. I wasn’t even aware she was there. But I had quietly assumed that somewhere inside the machine, there were people holding the line. Her resignation told me that assumption might have been wrong.
We’ve been using the wrong metaphors. We talk about AI as if it’s a character in a children’s story — a benevolent helper, a mischievous sprite, a digital Santa Claus who dispenses answers instead of toys. But AI is not a fictional being. It has no motives, no feelings, no inner life. It is not a creature with lore. It is a system, a tool, a cognitive instrument. Treating it like a character is the first ethical error, because once you imagine a tool as a person, you start behaving like a passive audience member instead of an active participant.
And then there’s the second ethical error, the one that keeps looping back in my mind. We’ve created a culture where adults — real adults, with mortgages and degrees and job titles — are using AI the way children use vending machines. Press button. Get thing. No process. No reflection. No ownership. It’s not that people are childish; it’s that the dominant metaphor encourages childish behavior. The vending‑machine stance rewards novelty, speed, and spectacle. It discourages metacognition. It erodes responsibility. It trains people to outsource thinking instead of extending it.
AI isn’t a vending machine — it’s a cognitive toolbox, and the difference between those metaphors is the difference between thinking and not thinking.
That’s the line that keeps returning to me. Adults use AI as scaffolding, the way they use glasses or calendars or maps. They stay in the loop. They remain responsible for the outcome. They treat the tool as a way to enhance clarity, not replace it. They understand that distributed cognition is not magic — it’s infrastructure. It’s the difference between a pilot with instruments and a pilot pressing buttons because the lights are pretty.
This is why Caitlin’s departure hit me so hard. It wasn’t about her. It was about what her leaving signaled: that the people who understand the toolbox metaphor may be losing ground to the people who prefer the vending machine. That the adults in the room might be stepping out, one by one, because the room no longer supports the work they came to do. That the culture around AI is drifting toward the nursery instead of the workshop.
And that’s the real ethical question, the one we keep avoiding because it’s uncomfortable. What kind of users do we want to be? A species that treats tools like characters, that treats cognition like a chore, that treats thinking as optional? Or a species that uses its tools to extend its mind, that remains responsible for its own reasoning, that understands the stakes of building systems that shape human thought?
Caitlin didn’t answer that question. She didn’t need to. Her resignation simply made the stakes visible. It put a human face on the truth I’d been trying to express: if the adults leave the room, the children will run it. And children should never be in charge of the tools that determine how a society thinks. The future of cognition depends on which metaphor we choose, and metaphors — unlike machines — are entirely in our hands.
Scored with Copilot. Conducted by Leslie Lanagan.

