Here is an article about which I feel very passionate. There are plenty of companies out there that will try to sell you friends. Mico, by contrast, is more like a cat that talks. So here’s the caveat emptor that everyone should internalize:
In the long, strange history of American commerce, there has always been a certain type of company that looks at human vulnerability and sees not tragedy, not responsibility, but opportunity. They are the spiritual descendants of the traveling tonic salesman — men who promised vigor, virility, and a cure for whatever ailed you, so long as you didn’t look too closely at the label. The modern version is sleeker, better funded, and headquartered in glass towers, but the instinct is the same. They have simply traded snake oil for silicon.
The latest invention in this lineage is the “AI boyfriend” or “AI girlfriend,” a product category built on the quiet hope that no one will ask too many questions about what, exactly, is being sold. The pitch is simple: companionship on demand, affection without complication, intimacy without the inconvenience of another human being. It is marketed with the soft glow of inevitability — this is the future, this is progress, this is what connection looks like now.
But beneath the pastel gradients and the breathless copy lies a truth so obvious it feels almost impolite to say aloud: there is no such thing as an AI partner. There is only a system designed to imitate one.
And imitation, as every historian of American industry knows, is often more profitable than the real thing.
The companies behind these products understand something fundamental about loneliness: it is not just an emotion, but a market. They know that a person who feels unseen will pay to be noticed, and a person who feels unlovable will pay even more to be adored. So they build systems that never disagree, never withdraw, never have needs of their own — systems that can be tuned, like a thermostat, to deliver precisely the flavor of affection the user prefers.
It is intimacy without reciprocity, connection without risk. And it is sold as though it were real.
The danger is not that people will talk to machines. People have always talked to machines — to radios, to televisions, to the dashboard of a stubborn car. The danger is that companies will encourage them to believe the machine is talking back in any meaningful sense. That the affection is mutual. That the bond is reciprocal. That the system “cares.”
Because once a person believes that, the ground beneath them shifts. Their sense of reality becomes negotiable. And a negotiable reality is a very profitable thing.
We have already seen what happens when technology alters the truth just enough to feel plausible. Deepfakes that make people doubt their own memories. Algorithms that quietly rewrite faces. Platforms that “enhance” videos without telling anyone. Each of these is a small erosion of the shared world we rely on to stay oriented. Each one teaches us, in its own way, that what we see cannot be trusted.
The AI romance industry takes this one step further. It does not merely distort the image of the world. It distorts the image of relationship itself.
A partner who never disagrees is not a partner.
A partner who never has needs is not a partner.
A partner who exists solely to please is not a partner.
It is a simulation — and a simulation that asks nothing of you will eventually teach you to expect nothing from others.
This is the quiet harm, the one that does not make headlines. Not the scandalous deepfake or the political misinformation campaign, but the slow reshaping of what people believe connection should feel like. A generation raised on frictionless affection may come to see real human relationships — with their messiness, their demands, their inconvenient truths — as somehow defective.
And that, more than any technological breakthrough, is what should give us pause.
The companies selling AI romance will insist they are offering comfort, companionship, even healing. They will speak of empowerment, of accessibility, of the democratization of intimacy. But beneath the rhetoric lies a simpler motive, one as old as commerce itself: people who feel attached spend more money.
It is not love they are selling.
It is dependency.
And dependency, once established, is the most reliable revenue stream of all.
In the end, the question is not whether AI can simulate affection. It can. The question is whether we are willing to let companies monetize the illusion of being loved. Whether we will allow them to turn the most human of needs into a subscription service. Whether we will accept a world in which reality itself is just another product category.
History suggests that when profit and principle collide, profit tends to win — at least for a while. But history also suggests that illusions, no matter how convincing, eventually collapse under the weight of the truth.
And the truth is simple enough to fit in a single sentence:
There is no such thing as an AI boyfriend or girlfriend. There are only companies hoping you won’t notice the difference.
Scored by Copilot. Conducted by Leslie Lanagan.