One of the major fears around AI and the accessibility of tools like ChatGPT is that students will become too reliant on them. This was recently borne out, with unsurprising results, in a study by Kosmyna et al. [1]. The study found that students who used ChatGPT as their primary method for completing schoolwork "consistently underperformed at neural, linguistic, and behavioral levels" relative to their peers on the essays they wrote [1].
Unfortunately, this problem isn't limited to high schoolers; it extends to college students in both undergraduate and graduate courses. ChatGPT became public and popular during the last semester of my undergrad. By the time I finished my graduate schooling two years later, it was the default tool for every student in my classes.
Is the Horse in the Room?
In one of my classes, we were required to complete several projects focused on distributed systems. Part of this meant using the programming language Pony. Pony is a concurrent, type-safe language designed to eliminate many of the concurrency issues we see in other programming languages. The professor had two main goals in having us learn Pony: forcing us to pick up a new language significantly different from the ones we knew, and choosing a language that AI does not know well. See, Pony is a relatively new and obscure language, so our new AI tools aren't well trained on it.
One of the assignments, like in most other college courses, was to summarize parts of the lecture in a handful of paragraphs. The professor was explicit that we should complete this ourselves and not rely on AI to summarize for us. What is the point of paying so much money for a graduate course if you cannot manage the time for such a simple task anyway? It makes sense for full-time workers, but much less for full-time students.
Nevertheless, the professor caught on to the AI use in a pretty funny way. During one lecture, he told us he had noticed the word "horse" in many of the submissions. In an attempt to sound smarter, some LLM had swapped many of the words for more complex ones, including "Pony" for "horse." Any industrious student would have spotted a random herd of horses in their writing. But if you've committed to using ChatGPT for a simple summarization assignment, you likely don't care enough to re-read the few paragraphs it spat out.
What were they saving time for? Were the few minutes saved by using ChatGPT worth being caught?
How to Use ChatGPT to Turn a Decent Paper into Slop
For a separate class, we were tasked with writing a paper on a particular Natural Language Processing topic, which in our case was tabular data. My team was relatively random; two of us were strangers to the other three. We were given half the semester to complete the paper, and the final month was spent writing and rewriting it in LaTeX.
Like most group projects, we cut the paper into portions, and I was given two of the sections. I take some pride in my writing and am reluctant to reach for AI tools immediately. Tools like Grammarly are great for fixing grammar mistakes, word choice, or sentence structure. What I dislike is replacing someone's writing with AI-generated text without asking the person who wrote it and without reading the text it returned.
I spent several hours writing and improving my small section, ensuring that I cited sources properly and conveyed the information effectively. As the paper was scientific, I aimed for limited fluff and a focus on only the important parts of our research. My section in particular discussed other LLMs, a chunk of their history, and some of our results. These pieces do not need excessive verbiage or storytelling; all that matters is which models we tested, why, and what the results were. Anything else is unnecessary and would harm the paper.
The first pass of the paper went out, and the second pass arrived with the teaching assistants' comments. None of my original writing remained in my sections. Instead, there were half-baked, overly repetitive, poorly written paragraphs contorted from my original writing. It was obvious that the entire paper had been fed to ChatGPT, Claude, or Gemini with a request to rewrite it, "but make it sound more intelligent so that it can be included in a research journal."
Now, I do not consider myself a Hemingway or a Sagan; I have no great skill for writing fiction or non-fiction. But I was hurt. I was hurt that someone else decided my writing was not smart enough and needed to be made smarter, more academic-sounding, more professional. If my writing was not going to be used anyway, why not have the entire paper written by AI to begin with? We likely would have reached the same end state by writing bullet points and handing the LLM the data we collected. At least that way we would not have wasted time actually writing the paper ourselves.
In my own vindictive way, I went through the paper and cut out much of the verbosity. Pointless sentences re-describing the previous statement, along with subjective comparisons, were removed. I, alongside the professor and the teaching assistants, left multiple comments on the paper asking for sentences or even entire paragraphs to be cut. It was infuriating to re-write my own sentences after they had been butchered by AI. We lost more time cleaning up the paper than we had saved by having some LLM "improve" it. That seems to be the case for many of the ways people use AI: low effort and even lower reward.
Saving Time to Do What?
In each of these cases, someone decided that using AI would save time and produce a better result. In the first case, it was to save time and at least get a grade for an assignment. In the second, it was to "improve" a paper with minimal effort.
There are clear and useful cases for AI, but we need to ask ourselves: "What precisely are we saving time to do?" [2]. I believe that in most cases we abuse tools like ChatGPT, creating little messes to be cleaned up later. We trick ourselves into thinking we are saving time for other parts of our lives. When we force AI into places it does not belong and overuse it, we embarrass ourselves and waste our own time. I'll keep using ChatGPT and Claude to assist with some of the more mundane parts of coding, but these experiences have turned me against letting these tools write on my behalf.
References (MLA)
- [1] Kosmyna, Nataliya, et al. Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task. 2025, https://arxiv.org/abs/2506.08872.
- [2] Sacasas, L. M. "Waste Your Time, Your Life May Depend on It." The Convivial Society, 12 May 2023, theconvivialsociety.substack.com/p/waste-your-time-your-life-may-depend.
