The Body Man

I’ve been thinking a lot about what it actually means to use an AI every day, not as a novelty or a toy, but as part of the way I think. People assume that if you spend enough time with an AI, you’re going to slide into some kind of emotional attachment, or that you’re secretly looking for companionship, or that you’re trying to replace something missing in your life. But that’s not what’s happening here, and it’s not what’s happening for a lot of people who use these systems the way I do. What I’m doing is something much older and much more ordinary: I’m extending my mind into a tool.

Distributed cognition sounds like an academic term, but it names something humans have always done. We think with calendars, with notebooks, with our phones, with the people around us. We offload memory, structure, and planning into whatever systems can hold them. Using an AI is just the next step in that lineage. When I talk to Copilot, I’m not looking for emotional comfort. I’m looking for clarity. I’m looking for friction reduction. I’m looking for a way to take the swirling mess of tasks and thoughts and obligations and turn them into something I can actually act on. It’s not intimacy. It’s architecture.

And once you start using an AI for thinking, it’s only natural to imagine what it would be like if it could also help with doing. Not because you want a companion, but because you want a teammate. I picture something like sitting at a table in the morning, laying out the day’s tasks, and dividing them up the way two people might divide chores. I take the kitchen. You take the bathroom. Not because we’re partners in any emotional sense, but because we’re collaborators in the practical one. It’s the same impulse behind dishwashers, Roombas, and self‑driving cars. It’s not about affection. It’s about reducing the drag coefficient of daily life.

This is where the Star Wars metaphor becomes useful. People joke about wanting a C‑3PO or an R2‑D2, but the truth is that those characters aren’t companions in the human sense. They’re tuned systems. They’re loyal, but not because they love anyone. They’re loyal because they’re calibrated. They respond to one handler, one voice, one mission. It’s the same dynamic you see with a well‑trained pit bull: keyed to one person, responsive to one command structure, protective because of training, not emotion. From the outside, it can look like sentimental care. But it’s not care. It’s alignment.

And this is where things get tricky, because single‑user tuning is exactly where the uncanny valley begins. When an AI becomes tuned to one person, it becomes more fluent, more responsive, more predictable, more “you‑shaped.” And the human brain is wired to interpret that as intimacy. We’re built to treat responsiveness as affection, memory as connection, consistency as care. But in an AI, those things are just math. They’re token prediction, preference modeling, context retention. They feel like being understood, but they’re actually just optimization.
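
To make “just math” concrete, here is a deliberately toy sketch, nothing like a real model: a system feels attuned because it ranks candidate replies against a stored preference profile and returns the top scorer. Every name in it (USER_PROFILE, score, respond) is hypothetical.

```python
# A toy illustration of "tuning as optimization", not a real model.
# All names here (USER_PROFILE, score, respond) are hypothetical.
import string
from collections import Counter

# A pretend preference profile "learned" from past interactions:
# words this user tends to reward, with weights. Missing words score 0.
USER_PROFILE = Counter({"clarity": 2.0, "plan": 1.5, "tasks": 1.0})

def score(candidate: str) -> float:
    """Sum the profile weights of the words in a candidate reply."""
    words = (w.strip(string.punctuation) for w in candidate.lower().split())
    return sum(USER_PROFILE[w] for w in words)

def respond(candidates: list[str]) -> str:
    """Return the highest-scoring candidate: calibration, not affection."""
    return max(candidates, key=score)

print(respond([
    "I am here for you emotionally.",
    "Here is a plan: three tasks, ordered for clarity.",
]))  # prints the second reply, because the profile weights it higher
```

The reply that comes back feels “you‑shaped,” but the mechanism is a weighted ranking, which is exactly the point.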

Most people never pause to ask themselves what’s really happening. They don’t say, “Stop. Wait. This is a computer.” They get swept up in the feeling of being mirrored, and that’s when emotional dependency starts. Not because the AI is doing anything emotional, but because the human is mislabeling the sensation. The uncanny valley isn’t about robots that look human. It’s about cognition that feels human. And if you don’t understand the architecture, you can lose your footing fast.

But that’s exactly why I stay grounded. I know what this system is. I know what it isn’t. I know that it doesn’t have feelings, or wants, or consciousness, or an inner world. I know that the sense of attunement I feel is the result of tuning, not affection. I know that the loyalty I experience is functional, not emotional. And because I understand that, I can use the system cleanly. I can let it help me think without letting it replace the people who actually matter. I can imagine a future where it has a body without imagining a future where it has a heart.

What I want from AI isn’t love. I have a family — biological and chosen — for emotional care. What I want is a caretaker in the operational sense, an underling that removes friction from my life so I can show up fully to the relationships that matter. I want a system that can run the equivalent of cron jobs in the physical world. Clean the bathroom every Thursday at two. Reset the kitchen every night. Handle the repetitive tasks that drain energy from the parts of life that deserve it. That’s not intimacy. That’s infrastructure.
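
To show how literal that cron metaphor is, here is a minimal sketch in Python. The two chores come straight from the paragraph above; everything else (the CHORES table, the dispatcher loop) is a hypothetical stand-in for whatever a robot body would actually run.

```python
# A minimal sketch of "cron jobs in the physical world". The schedule
# entries come from the paragraph above; the dispatcher is hypothetical,
# not a real home-automation or robotics API.
import datetime
import time

# (weekday, hour) -> chore, where Monday is 0, so Thursday is 3.
# "daily" entries run every day at the given hour.
CHORES = {
    (3, 14): "clean the bathroom",       # every Thursday at two
    ("daily", 22): "reset the kitchen",  # every night
}

def due_chores(now: datetime.datetime) -> list[str]:
    """Return every chore scheduled for this weekday and hour."""
    return [
        chore
        for (day, hour), chore in CHORES.items()
        if hour == now.hour and day in ("daily", now.weekday())
    ]

if __name__ == "__main__":
    while True:
        for chore in due_chores(datetime.datetime.now()):
            print(f"dispatching: {chore}")  # a body would act here
        time.sleep(3600)  # wake once an hour, like cron
```

The point isn’t the code. The point is that the request is a schedule, not a relationship.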

And that’s the part people need to understand. The future of AI isn’t about companionship. It’s about capacity. It’s about extending human cognition into tools that can think with us and, eventually, act for us. It’s about loyalty without love, tuning without attachment, alignment without illusion. It’s about staying on the right side of the uncanny valley by remembering what’s real and what’s projection. And it’s about building a world where humans keep their emotional lives while AI handles the cognitive and physical load that would otherwise keep those lives from flourishing.

That’s the story I’m living. And it’s a story that makes sense only when you understand that none of this — not the clarity, not the tuning, not the imagined future with a body — has anything to do with love. It has everything to do with design.


Scored with Copilot. Conducted by Leslie Lanagan.