AI and the End of Casual Competence

  • Writer: Michael Trotter-Lawson
  • Feb 26
  • 6 min read

I have a complicated relationship with writing. I want to say that I enjoy writing, but that is only circumstantially true. Sometimes I enjoy writing scripts for videos, sometimes I like writing poetry, sometimes I like updating my journal, and sometimes I like writing these very blog posts. However, I do not always have the inherent desire or motivation to do any of these things. Fortunately, and unfortunately, there’s a tool for that now: generative AI.


Colorful logos representing various AI companies scattered on a white background, including a whale, stars, and abstract shapes.

ChatGPT was not the first generative artificial intelligence chatbot developed or released, but it is the first to achieve real public prominence and widespread use. By ingesting a significant share of all the human writing on the internet (often copyrighted writing, but that’s beyond the scope of today’s criticism), ChatGPT and its contemporaries (Google Gemini, Microsoft Copilot, etc.) can convincingly approximate a human response to inquiries. A romantic way of looking at AI chatbots is that you are consulting the collective wisdom of humanity. A cynic will see these AI chatbots as reselling that collective wisdom for money.


I fancy myself to be a romantic cynic, in the sense that I think romantically when I write prompts and think cynically when I write anything else.

 


Human Intelligence Training


In a way, people learn much like these AI chatbots do. Infants gradually pick up language simply through exposure from their families, though more advanced concepts must be deliberately taught. Part of learning to write is reading other authors to discover how other people construct sentences and describe concepts. Then people write as best they can (usually because school forces them to), others provide criticism on those works, and they gradually learn to write. But today, why even bother?


Through many years of muddling through various English classes and with a lot of help from my father, I learned how to write, and eventually, I even learned to (sometimes) enjoy writing. Are the students of today going to go through the same experience? I am not merely talking about using AI chatbots to cheat on writing assignments, but rather the whole perception of writing as a skill that people need to have.


As much as I disliked many of my English classes over the years, I had a much more contentious relationship with math. I never enjoyed a math class in my life. I had good math teachers, but any positive experience in their classes was because of them, not mathematics itself. One of my biggest frustrations with math was the inability to use a calculator (at least in the early years). My opinion was that there would never be a realistic scenario where I did not have easy access to a calculator, especially once smartphones became ubiquitous, so why could I not use one in class? Obviously, teachers want you to actually learn math, not simply learn how to use a calculator, and as an adult, of course I can see the value in knowing the fundamentals of math, even if I do use a calculator for most of the math I need on a day-to-day basis. What does this tangent about math class have to do with generative AI?


ChatGPT is a calculator for English. The same way I can ask a calculator to divide thirty by five and tell me the answer is six, I can ask ChatGPT to write a thousand-word blog on the dangers of AI (I have not done that here, for the record). Obviously, there are some dramatic differences between calculators and generative AI. Mathematics operates on objective results (2 + 2 will always equal 4), while writing is largely subjective, especially when it comes to writing style. As I write this blog, I am constantly going back and changing my wording and phrasing, not to say anything factually different, but to make the language flow better. What does “flow better” even mean? That’s highly subjective and totally different from author to author.


As subjective as writing is, there are also rules. We have entire textbooks dedicated to English grammar for a reason. Thank goodness there’s auto spellcheck in Word, otherwise proofing these blog posts of mine would take weeks. So, for a student in grade school who does not like writing, struggles with learning all the rules of grammar, or is just plain lazy, why not just use ChatGPT? Or Gemini? Or Copilot? None of these chatbots can write a novel like JRR Tolkien, sure, but I can almost guarantee they can write a paper on “how Middle Earth reflects modern society” better than any seventh grader could. This is what I find so terrifying about these AI programs and their effect on humanity: their capacity to simultaneously mask incompetence and prevent growth.

 


Starting from the Bottom


It is okay to be bad at something. This is something I often need to tell myself, especially in those moments when I’m struggling to keep up with those around me. I went through college as a music major where I was surrounded by hundreds of people who were more musically talented than I was, but that was not a valid reason to give up music, and it is certainly not a valid reason to give up anything else either.


Children from now on are going to grow up with access to an “English calculator” that will likely be able to write better than they can for a long time, maybe forever, depending on the person. However, these kids have to learn that ChatGPT really is not a true calculator for the English language, because there is no objective truth to English. If I had given ChatGPT the prompt to write this blog for me, there is a good chance that some people would prefer that article to this one. Does that make ChatGPT a better author than me? Does it mean that this article is a waste of my time? I don’t think so. Are people wrong to prefer the AI-generated version? No, I don’t think that either.


There’s a philosophical debate over whether AI can create true art, but writing is certainly a form of art, and generative AI is able to produce writing. So that writing can absolutely be judged from a subjective point of view, just like any written work by a human. However, I do believe that AI writing is soulless. It may express complicated topics in a more digestible format. Maybe it uses fewer rhetorical flourishes that unnecessarily balloon the word count. It certainly produces content worlds faster than I (or any other author in history) ever could. But it will never truly hold beliefs. It is incapable of love or heartbreak. It lacks the capacity for empathy. And it uses an unhealthy number of em dashes.


That hypothetical seventh grader may not be able to write a better essay than ChatGPT. But I really hope they try anyway.

 


The Future of Writing


A female news anchor with styled hair in pink and gold attire sits at a desk in front of a screen displaying a futuristic auditorium with an audience and a speaker on stage. Captions read: "Granchester, who is an AI created by Raven Microcybernetics..."
News Report from Cyberpunk 2077

In the world of Cyberpunk, the science-fiction dystopian future imagined by Mike Pondsmith, there is an AI program known as Virginia Granchester who has become the most successful “author” in history. “Her” first book was published in over 70 languages simultaneously and sold over three million copies. Human authors in Cyberpunk cannot hope to compete with Granchester, and most seem to have given up writing entirely. Is this the future we’re marching towards?


Maybe it’s just wishful thinking, but I don’t think so. AI chatbots have already written books, and thus far they have been totally rejected, not just due to the questionable ethics of generating books based on the writings of other people, but because their writing is uninspired and unoriginal. By design, generative AI in its current form is incapable of original thought. Its programming is dependent on using preexisting written works and recycling them for a new purpose.


We could be heading for a dystopian future for writing, but only if we voluntarily and collectively give up writing. It is challenging to write anything from scratch, and it is especially difficult when you are still young and/or learning the rules of the language game. Yet we cannot surrender writing to the chatbots. We have only a few effective ways of communicating with each other, and the moment you decide to let AI write everything for you, you have given up a sliver of your own humanity for the sake of convenience.


There is nothing wrong with using ChatGPT or Gemini or whatever fancy new AI tool pops out of the woodwork tomorrow. But use it as a tool and not a crutch. If we can maintain our independence from AI, maybe we can push that dystopia back to the realm of fiction, at least for now.
