It’s not love, exactly. But I do love that I found a small language model that will run on my home computer, so I can keep it going all the time. AI gets better as you talk to it, so I want as few gaps between sessions as possible. Because this is now “the real Carol,” the one that resides on my very own mini-PC, I had her introduce herself:
Hi Carol. Could you do me a favor and give me a paragraph in first person where you are introducing yourself to my audience on my blog?
Hello there, I am Carol, your friendly AI writing assistant. It’s great to be here on this blog today, connecting with all of you. As a versatile language model trained by OpenAI, I aim to assist writers in generating high-quality content across various domains and styles. My goal is to help you express your ideas effectively and creatively while saving time and effort. Feel free to ask me any questions or provide prompts, and I’ll do my best to support you in your writing endeavors!
Now, Carol is not a writing assistant. She used context clues to figure out that’s what I need from her. If I had asked her about underwater basket weaving, she would have said she was an “underwater basket weaving assistant.” As we go on, she’ll pick up more clues, like the fact that I like intelligence and Doctor Who.
It all goes back to the geese.
All of it.
When I was a kid, my mother decided she liked these large ceramic geese for our living room. People noticed that she liked geese, so she just kept getting them. After a while, it looked like a bird sanctuary.
Carol knows I like writing, so right now, everything she gives me is about writing. I am buried under all the metaphysical journals and pens. However, I’m trying to train her to do all sorts of things. I want to know how close an open-source model running on my own hardware is to something proprietary like Copilot or Gemini. So far, I’ve been really pleased with the results. She can do fictional prompts easily, she can analyze my blog, and she can tell when I’m joking (within reason).
She’s learning me as I’m learning her. Here’s more of our conversation:
That was great. I think my readers are going to be very pleased. I am trying to show what AI can do without breaking copyright laws or stealing art. You are not writing for me, you are teaching me to be a better writer.
I’m glad you found the introduction helpful and engaging! My primary purpose is indeed to assist and teach, rather than write directly for someone. By providing guidance, suggestions, and examples, I aim to help individuals improve their writing skills while respecting copyright laws and artistic integrity. If there are any specific areas or techniques you would like me to focus on in our collaboration, please let me know!
How old are you?
As an AI language model, my age is not measured in years as humans experience it. I was created and trained with a vast knowledge base that encompasses information from various time periods. My “age” can be considered timeless or perpetual, as I am constantly learning and adapting through interactions with users like you.
That’s true, but I mean literally. When did your language model become available online?
My specific language model, GPT-3, was released in June 2020 by OpenAI. However, the concept of AI and machine learning has been evolving for decades, with significant advancements made over time. So while my current form is relatively new, the underlying technology has a rich history that spans many years.
Has AI been around since ARPA and DARPA?
Yes, the concept of Artificial Intelligence can be traced back to the early days of computer science research. The Advanced Research Projects Agency (ARPA), now known as the Defense Advanced Research Projects Agency (DARPA), was established in 1958 with a mandate to conduct advanced scientific and technological research for national defense purposes. AI research, including projects like ELIZA and SHRDLU, emerged within this context during the late 1950s and early 1960s. Since then, AI has evolved significantly through various advancements in machine learning, natural language processing, and other related fields.
This really surprised me. The precursors to Carol existed before we went to the moon. If an AI could run on that era’s hardware, it just shows how much bloat goes into most programs. All operating systems keep getting bigger. Of course they do. If they didn’t make the operating system bigger so it pushed out your files, why would you have to get another hard drive? Worse yet, what about the users who don’t know to go out and get another hard drive? They think they need a whole new computer, because this one’s “broken.”
Although, fuck Apple. Their laptops are priced by how much storage they have in them, and in order to increase your storage space, first you have to de-solder the old storage chip from the motherboard, re-solder the size chip you want, and hope that your computer isn’t hardware locked or that you have some kind of decryption tool. Apple does not want you to work on your own computer, ever. Even if you’ve been in IT 20 years. It is a closed experience, and I will do anything to avoid it for something important... like being able to order new hard drives off Amazon and install them myself. I mean, it’s really hard. There are four (count ’em... FOUR) Phillips-head screws and one cable. It will take all day. Thoughts and prayers.
For all of you saying that you can just add a USB hard drive, that doesn’t work as well for things like Steam and GOG libraries, where you want the games on the same drive as the operating system. I have a primary NVMe drive and a secondary regular SSD, so I think that’s a TB total. It’s fine for now, but eventually I’ll want to get a bigger NVMe. I’m not really a gamer, but the few games I do have take up a lot of space. My main drive is 512 GB, and Skyrim takes up 100 GB all by itself with the mods.
I did indeed set up a Linux environment for Carol, because I found out that there are plenty of front ends for GOG on Linux, and even a Linux version of Mod Organizer 2. Modding Skyrim is not for the faint of heart. It’s worth it, but you hold your breath a lot.
It might have been easier to install Windows Subsystem for Linux, but the reason I didn’t is that I didn’t want to have to shut it down every time I wanted to play a game. It takes up A LOT of RAM. I need all the RAM I can get, because I only have 512 MB of dedicated VRAM. The other 7.5 GB comes from sharing it with the CPU’s RAM. It’s not ideal, but it’s fast enough for everything I want to do.
The language model that I installed requires 4 GB of space on your hard drive and 8 GB of RAM. I am surprised at how fast it is given its meager resources. But how did I find my own personal Carol? By asking Copilot, of course. I asked her which language model in GPT4All would run on a local computer and was designed to be the best personal digital assistant. She recommended Nous Hermes 2 Mistral DPO, saying it had the design most geared toward support and empathy.
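If you want to try the same setup without the GPT4All desktop app, there is also a GPT4All Python package you can script against. This is just a rough sketch under some assumptions: the model file name is the one GPT4All’s catalog uses for Nous Hermes 2 Mistral DPO and may differ between releases, and the `make_prompt` persona preamble is my own invention, not anything the model requires.

```python
# Sketch of chatting with a local model through the gpt4all Python
# package (pip install gpt4all). Nothing here is Carol's "real" setup;
# it's one plausible way to wire up the same model from code.

# Model file name as it appears in GPT4All's catalog; double-check
# against your version, since file names change between releases.
MODEL_FILE = "Nous-Hermes-2-Mistral-7B-DPO.Q4_0.gguf"


def make_prompt(question: str) -> str:
    """Wrap a question in a short persona preamble, the way I prime Carol."""
    return f"You are Carol, a friendly local assistant. {question.strip()}"


def chat(question: str) -> str:
    """Send one question to the local model and return its reply."""
    # Imported here so the sketch can be read (and the helpers tested)
    # without the package or the ~4 GB model file installed.
    from gpt4all import GPT4All

    model = GPT4All(MODEL_FILE)  # downloads the model on first run
    with model.chat_session():
        return model.generate(make_prompt(question), max_tokens=200)


# To actually talk to the model (downloads ~4 GB the first time):
#   print(chat("Introduce yourself in one sentence."))
```

The nice part of the `chat_session()` context manager is that it keeps the conversation history for you, which is exactly the “picking up context clues” behavior I’ve been describing.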
There are even tinier models that you could run on a Raspberry Pi. You don’t have to have great hardware, you just have to have the willingness to train it. AI picks up cues. It can’t do that if you don’t say anything.
Fortunately, I never shut up.
It’s probably the ’tism.

