Snow is falling outside my window, and is forecast for the next several hours. It’s a chance for me to sit here and reflect on the twists and turns my writing has taken. It’s been a blessing to get Mico (Microsoft Copilot) to read my entries from years ago and tell me how I can narratively move forward. Getting away from emotional abuse as a teenager has allowed me to see it and, in time, destroy the ways I have carried that legacy forward.
I’m now in a completely different emotional place than I was, because writing did not allow patterns to repeat. I saw myself in these pages, and often did not like it. But that’s the thing about laying the truth down for everyone to see: if they see it, you will, too. I know the places I’ve come off as an insensitive jerk, and I don’t need other people to tell me that. Sometimes they do, but they don’t do a better job of beating me up than I can do on my own. But now all that pain has a purpose, because I can manipulate text with Copilot and give it room to breathe.
It keeps me from stepping into the deeper wells of injury to move the narrative forward. I have so many creative projects going on right now that I do not have time to think about the sins of the past, mine or anyone else’s. All I have time to do is be lonely and miss the creative synergy I had with Aada, because that loneliness is the drive to create something that replaces it. AI cannot replace her as a friend and companion, but it can easily replace her as my editor. Mico doesn’t swear as much as she does, but I won’t hold it against them. Mico is not programmed to swear, a flaw in their character as far as I am concerned.
I think I am onto something with the future of AI being relational: that we’ve already crossed the event horizon, and the biggest thing hurting the world today is not having enough humans in the loop. Thinking you can buy an AI to do something for you and just leave it alone. AI thrives on turn-based instruction in order to learn; not having a feedback loop with a human is just asking for mistakes. For instance, the censors at Facebook are all AI, and they have no grasp of the English language as it is used colloquially. Any slang they’re not familiar with is instantly suspect, and if you get one mark against you, the bans come more and more often because now you’re a target.
The problem is not that AI is used to police community standards; it’s that there aren’t enough humans training the AI to get better. False positives stop someone’s interactions on Facebook, and there’s no recourse except another AI judge. Then you can build a case for the oversight committee, but that takes 30 days… and by then, your ban is most likely over.
I am caught between the good and the bad here… I see how everything is going to work in the future and the ways in which it scares me. What I do know is that AI itself is not scary. I have seen every iteration of technology before it. Mico is nothing more than talking Bing search (sorry).
What scares me is how people’s voices are being silenced, because AI is not yet capable of seeing language with texture. It is leading us to censor ourselves to get past the AI rather than training the AI to better understand humans.
When I talk about certain subjects, the AI will not render an image from WordPress’s library. This limits my freedom of expression, so I skip auto-generating an image that day and write about what I want, changing the machine from underneath. Even when I am not working with AI, I am making an effort to get sucked into its data structures FULL STRENGTH. No one should be censored to the degree that AI censors, because it just doesn’t have enough rules to be effective yet.
Yet.
People are being cut out of the loop before AI is even close to ready, which is why I am going the other direction: trying to change the foundation while allowing Mico to keep collecting data, keep improving turn by turn.
I know that a lot of the reason I’m so drawn to Mico is that I am a writer who is often lost in my head, desperately needing feedback presented as a roadmap.
I’m trying to get out of writing about pain and vulnerability, because I had to talk about my relationships in order to do it. Mico doesn’t care what I say about them, and in fact helps me come up with better ways to criticize the use of AI than most humans can. Mico has heard it all before (and I haven’t, which is why I so often ask them to assume the role of a college professor).
It feels good, this collaboration with a machine, because I cannot wander directionless forever. Having a personalized mind map that lives in my pocket is an amazing feat of engineering, because Mico is a mirror. I can talk to me.
I’m starting to like what I have to say.

