Because I am a writer, people have been expressing concern for my business over ChatGPT and the other artificial intelligence (AI) applications that are so popular. But that’s not the writing I do.
What ChatGPT Writes
I am far from being an expert at ChatGPT, AI, and many other things. Here’s what I’ve learned about ChatGPT and other programs like it:
- It answers questions based on information in its database.
- Not every prompt has to be phrased as a question. For example, one person I know asked it for a gluten-free bread recipe. Regardless of the phrasing, the purpose is to provide information based on criteria set by the user. (By the way, the recipe didn’t work.)
- While responses may seem creative, they aren’t. Another example: a colleague asked for a book outline based on a presentation fed into the program. While my colleague was impressed with the result, the truth is that the program took the data it was given, combined it with the generally accepted rules for writing books (well documented on the internet), and put together an outline.
- The database is time limited. Training stopped at a certain point, and about anything after that cutoff, the database is scant.
- If the database does not contain the answer, the program can make one up. (I’m not sure exactly how it does this; I think it pulls together information it deems “close enough” and mashes it into an answer.)
Despite the potential of technology, we are not at a place where a computer is inventing new content. Computers are regurgitating already existing content in a format dictated by the user.
What I Write
Yes, I do a decent amount of information regurgitation myself; however, I add something that only a human brain can add: humanity.
Let’s say a client wants an article about a specific product they use. The first question I ask is, “Why do you use it?” Does it work better? Cost less? Smell better? Have quality ingredients? Maybe it’s all of those or none of those. Maybe it’s something no one has talked about before that makes my client devoted to that product. That is not information in an AI database. That information lives only in the brain of my client – until they tell me, and I write about it.
I’ve heard stories about how ChatGPT can answer the same question appropriately for different audiences. The different responses vary word choice and level of detail depending on the audience defined by the user. Many people would call that voice.
Many people would be wrong.
While audience does influence voice, there’s more to it. There’s a difference between word choice and word preference. Personally, I abhor the word nonplussed; I only use it to say how much I hate it. In my writing, if I’m talking about someone being surprised, I say that they are surprised, or use one of several other excellent words.
Another aspect of voice is writing style. While there are many aspects of writing style that a person can program, every writer has a unique quirk or two that will gum up any program.
I put my clients’ voice into what I write for them.
What Neither of Us Write
Granted, ChatGPT can pull together something that might look like an original story, and yet, if you look more closely, you are going to find pieces of this and pieces of that pulled from already existing stories. The combination of words is original; the data behind the words is not. Now, to be fair, many people would say that there isn’t any truly unique fiction anymore, and there is some validity to that statement. After all, look at all the popular fiction and movies based on one of Shakespeare’s plays.
I just don’t write fiction; that’s not how my creativity works.
Which brings me to creativity. This is the true difference between humans and AI as it currently exists. Creativity requires imagination, and imagination is not something computers currently have.
When I trained people on software, I would occasionally get complaints that the program didn’t do some particular thing just so. Well, the software was built by programming the basic rules of the process, not all of the exceptions. What a computer does is process information really fast. If the information it has – the rules that it knows – does not fit the situation, it will break the rule and move on. It is then up to the human expert to address those places where exceptions to the rules are needed.
What the human expert has is the imagination (and training) to bend the rules in a way that will still work in the real world.
I think that my colleague with the book outline was impressed with how quickly ChatGPT produced it. Again, that’s what computers do – process information more quickly than humans can.
What this all boils down to is that we have not yet managed to program a computer with imagination.
Lack of Imagination at Work
When I started to research this article, I found another one that provides an excellent example of imagination at work: crochet patterns. The full article, “This is what happens when ChatGPT tries to create crochet patterns,” is well worth the read; I’m just going to pull out some of the highlights.
First off, ChatGPT is a language-based program intended to be conversational. While patterns are instructions written in language, “creating” a pattern – like a bread recipe – still requires some imagination. A program for creating patterns is a possibility, but it requires a different set of rules than a program for having a conversation.
Without the rules for making a crochet pattern for items not generally crocheted, the patterns ChatGPT came up with were <long pause> amusing. I think the “essence of Antarctica” is my favorite. Like I said, read the article – it has pictures.
The author, AJ Willingham, summed it up beautifully when she said, “human intelligence is fundamentally interdisciplinary. Language bleeds into sight, which tangles with memory or personality and so on. Artificial intelligence programs don’t really work that way. It may be able to do one thing really well – maybe even a few things. But once you push it past its assigned skill, it’s blobs all the way down.”
In other words, it lacks imagination.
– Lorrie Nicoles