Why Using An AI Assistant For Essay Writing Is A Bad Idea

OpenAI’s ChatGPT has been a hot topic for a while. It’s not something completely new, though: there have been other AI tools before it, and better ones will follow. Still, ChatGPT is one of the best AI assistants widely available today. Company managers use it to set team goals, web developers turn to it for better approaches to coding problems, and students short on time use it to produce long texts.

However, there is one thing every AI tool user should be aware of. While AI may be fine for those who already have solid knowledge and a job, for students it is often not a real option. Read on for the reasons why using AI tools to complete essays and other papers is a bad idea.

Personal Style

A bot doesn’t know your writing style, but your teacher probably does. The same goes for your typical mistakes. For instance, if you keep getting C’s for awkward wording and generic writing, an AI assistant might fix both problems and make your essay nearly perfect. But imagine your teacher’s reaction: they would probably see through your plan.

Using an essay writing service, on the other hand, is a much better idea. A professional writer can mimic your style, and after a round of editing, you can make the essay sound 100% like you.

Sure, with time you can learn to make your statements clearer and more specific and get rid of odd words. But for a human, that is a step-by-step process; it doesn’t happen overnight. So if you suddenly submit a flawless essay, there is a chance you will be accused of plagiarism.

Image: a robot kid reading a book (from Unsplash)

Social Context

Yes, Interstellar had robots that could joke, but let’s get back to reality. Machines can’t reliably handle complex things like sarcasm. They can make up jokes, but only based on the data they were trained on. They won’t understand inside jokes known only to you and your friends, let alone create original ones.

So, imagine your teacher formulating the instructions in a creative, funny way, and you feeding them to an AI assistant. Will it get your teacher’s sense of humor? What if the joke is so deadpan that the tool takes it literally? And what are the chances the bot knows about an incident from a lecture or class that your teacher decided to reference?

Besides, terminology might seem the easiest part for AI, since it consists of set word combinations that are hard to confuse with anything else. Yet the jargon included in instructions and specific to a given field is not always stable.

Compare how teachers use the same words across different institutions, countries, or even cities, and you’ll see that one and the same word can mean several things a machine can’t possibly know about. These are local notions the internet isn’t aware of, so AI misinterprets them in most cases. This proves once more that social context matters.

Specific Instructions

Not every teacher treats writing guidelines lightheartedly. Some will compose a 10-page document of strict requirements for a 5-page essay. Even a student experienced in writing may fail to meet all those requirements, let alone a machine.

Moreover, the problem with AI is that its verbal creativity is limited. In the end, it might meet all the criteria, using the required words and structures, but the text will come out rigid and, well… robotic.

It’s Still Just a Machine

Writing an essay is not nearly the same as compiling a list of the best tourist destinations or imitating a casual conversation. Even if you use an AI assistant quite regularly and it has never let you down, sooner or later Murphy’s Seventh Law will manifest itself.

All of us have probably used shortcuts to save time and effort. Chances are those shortcuts carried risks we swore to keep in mind and be careful about. And most of us have, at some point, forgotten those risks and used the shortcut carelessly, which led to a mishap.

Why is this relevant here? Imagine you have used ChatGPT or a similar tool, say, five times to write an essay without much effort. You’ve been careful enough to check and fix all kinds of things like:

  • logic in the statements
  • referencing
  • false plagiarism
  • off-topic or generic sentences
  • odd words your teacher always complains about, and so on.

Sometimes, or maybe every time, you didn’t have to make serious corrections, so you got used to the idea that the tool has your back. Then at some point, probably when you have to complete one of your most crucial assignments, you’ll have no time at all, not even to check the AI-generated text as you usually do, and you’ll leave it to its own devices. And Murphy’s Seventh Law states the following: “Left to themselves, things tend to go from bad to worse.”

In the end, you might face a situation where Murphy’s First Law kicks in too: “Anything that can go wrong will go wrong.” The tool will crash under the load, pranksters will hack the database, a typo in the guidelines will make the bot distort the topic completely, and so on. Is it really worth the risk?


Image from Unsplash

Weakened Skills

Remember that essay writing is also an exercise for your mind. When composing a text, you develop your logical, creative, and writing skills. Rely on AI assistants too often, or all the time, and those skills may deteriorate so much that you won’t be able to write anything when you really need to, let alone something more demanding than an essay.


AI assistants have limitations: only humans strive for perfection, and perfectionism, combined with a healthy dose of laziness, leads to innovative and efficient solutions. Machines may provide a functional solution, but not necessarily the optimal one.

For instance, when it comes to programming, ChatGPT may miss best coding practices, because those are not clearly documented anywhere. In translation, it produces odd word combinations, so who says there won’t be some in your essay? Readers can often guess what is meant, but with academic assignments, it’s your grades that are at stake.
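To make the “functional but not optimal” point concrete, here is a small, purely hypothetical Python illustration (the function names and the task are invented for this example, not taken from any real ChatGPT output). Both versions work; the first is the kind of clunky-but-correct code an assistant might hand you, the second follows idiomatic practice.

```python
def unique_words_clunky(text):
    # Works, but rechecks list membership on every word: O(n^2) overall.
    words = []
    for word in text.lower().split():
        if word not in words:
            words.append(word)
    return words

def unique_words_idiomatic(text):
    # dict.fromkeys deduplicates in one pass while preserving word order.
    return list(dict.fromkeys(text.lower().split()))

print(unique_words_clunky("The cat and the dog"))     # ['the', 'cat', 'and', 'dog']
print(unique_words_idiomatic("The cat and the dog"))  # ['the', 'cat', 'and', 'dog']
```

Both return the same answer, which is exactly the trap: a functional solution can hide the fact that a better one exists, and the same goes for an essay that merely “works.”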

One thing is clear for sure: AI can be of great help if not abused. When you genuinely lack time for the writing itself, but you know the book the essay topic refers to and can critically assess the generated text, the tool will definitely help you. But if you feed it every kind of task without analyzing the outcomes, or use it just for fun, it will let you down at some point.