I know what you’re thinking. ChatGPT is good at writing. So why not use it for your college essays?
But this article is here to convince you not to do it – especially if you’re applying to selective colleges.
First things first, let’s admit that — yes — ChatGPT can write pretty good college admissions essays. In fact, it writes better than many (or even most) applicants.
There are two problems with using ChatGPT or another AI to write your essays.
Essays are crucially important — much more so than most students realize. That’s because selective colleges use essays to differentiate between tens of thousands of academically similar applicants. In fact, our analysis of data that came out of the Harvard admissions litigation shows that strong essays improve admissions chances by 10x at Ivy and equivalent colleges.
Below, we’ll use an example ChatGPT-written essay to show why AI currently fails at this task. At the very end, we’ll show you a few techniques that you can use to have ChatGPT strengthen essays that you yourself write.
But instead of reading this article, we suggest working with a human writing coach. Prompt’s Writing Coaches have helped over 30,000 students achieve their college admissions goals. Relying on ChatGPT as your coach may be better than nothing (or than relying on Aunt Gertrude) – but it’s not going to help you differentiate your essays from the applicants you’re competing against.
Let’s start by looking at an essay ChatGPT wrote. It responded to this prompt that we provided:
Note on prompting ChatGPT — generally, the more detail you provide, the better ChatGPT will do.
Here’s what the paid version of ChatGPT (the GPT-4 model) gave us in return:
Wow. When you first look it over, it seems like a pretty good-sounding essay. The language is easy to follow, and the flow is engaging.
But the essay fails on closer inspection. The content is poor: "These experiences have taught me a great deal about myself and others. In Mexico, I learned the immense power of patience and persistence." It sounds good, but it doesn't give the depth college admissions readers are looking for. They don’t want to hear that you learned something; they want you to demonstrate it. Plus, the voice is clearly AI (“the azure-blue day of my departure,” “my belief in the transformative power of education,” “igniting a flame that has grown into a full-blown passion”).
Let’s dig in to show exactly why and how this kind of AI-written essay will let you down.
(Note: if you want a few other examples of ChatGPT-written college essays, this article also has a few, with the same issues that we talk about here.)
Your voice is a huge part of your college essays. You’re telling the story of your most compelling experiences, the ones that prove you’ll be successful in college and beyond. An admissions officer is using your essays to picture you as a member of the campus community and as an alum. They can’t do that if they don’t get a sense of your personality and how you think.
ChatGPT’s voice is not yours. As we said, it’s a mix of all the examples of “good” college essays from across the internet. You may think ChatGPT’s writing sounds good (and you’re not wrong). But it’s not you, and admissions officers will know that.
Admissions officers read lots of applications – often 50+ per day. While ChatGPT is new this admissions season, they will quickly learn to spot which essays are written by AI. ChatGPT’s voice is obvious. Some schools will even use AI checkers, such as CopyLeaks, which, while not perfect, do a decent job of detecting ChatGPT.
You might think you can outsmart admissions officers — take something written by AI and modify it. Except it doesn’t work. It’s hard not to be swayed by an essay that “reads well” and looks grammatically correct and authoritative, so you’ll end up changing less than you should. Worse, your essay could end up choppy: your voice interspersed with ChatGPT’s.
Keep in mind – admissions officers spend an average of 8 minutes per application. They don’t have the time to read and think super carefully about whether something is AI-written. It’s easier to dismiss an application and move on to the next if they suspect AI may have written it.
If we apply this to the essay ChatGPT wrote, note that most of the phrases we highlighted above as weak also read as having a particular “ChatGPT voice.” Let’s take that delightful phrase: “igniting a flame that has grown into a full-blown passion.”
This isn’t how normal people write. It especially isn’t how high school students write. But it is how ChatGPT writes — it’s the exact same voice it uses in all of its essays, which makes it easy for admissions officers (and their AI detection tools) to identify.
ChatGPT is a long way from AGI – artificial general intelligence that can actually think, in the way we do as humans.
According to computer scientist Cal Newport, ChatGPT was essentially “trained on passages extracted from an immense corpus of sample text that includes much of the public Web.” It generates answers to queries using a “word-voting strategy,” basically, predicting the most common word to follow any particular phrase.
In other words, ChatGPT can only write essays based on what’s available online. Do you think most essays online tend to be excellent? Or would you guess that most aren’t that good? The correct answer is: most are terrible. Yet that’s what ChatGPT will reproduce.
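To make the “word-voting” idea concrete, here’s a toy sketch of it in Python. This is a deliberate oversimplification — real language models use neural networks trained on enormous corpora, not raw counts, and the tiny corpus below is invented for illustration — but it shows the core mechanic: at each step, pick the continuation seen most often in the training text.

```python
from collections import Counter, defaultdict

# A tiny invented "training corpus" of essay-like text.
corpus = (
    "i learned a great deal . "
    "i learned the power of patience . "
    "i learned the power of persistence ."
).split()

# Count how often each word follows each other word (a bigram model).
follower_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return follower_counts[word].most_common(1)[0][0]

# Starting from "i", always take the most common continuation.
generated = ["i"]
for _ in range(5):
    generated.append(predict_next(generated[-1]))

print(" ".join(generated))  # reproduces the corpus's most common phrasing
```

Notice what the toy model does: it can only ever echo the most common phrasing in its training text. Scaled up to “much of the public Web,” that’s why ChatGPT gravitates toward the average essay, not an exceptional one.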
Moreover, ChatGPT doesn’t understand what colleges are looking for in essays. It produces text that aligns with what it finds online — that is to say: with the myths about what colleges want. “Tell your story.” “Let us get to know you.”
In reality, selective colleges want you to show you’ll be successful in college and beyond. Specifically, they’re looking for experiences that exemplify one or more of the 5 Traits Colleges Look for in Applicants: Drive, Intellectual Curiosity, Initiative, Contribution, and Diversity of Experiences/Interests. They want to see you’ve done something that other applicants could not have done (or couldn’t have done as well).
Because of its pattern-matching, when ChatGPT writes an essay (or provides advice on a topic, or gives feedback on a draft), it focuses on the wrong things.
Now, let’s apply what we’ve learned here to our ChatGPT example essay. In broad strokes:
[1] These are things most students would do on a service trip. The essay has nothing about how the student went the extra mile. It doesn’t go into detail on impressive outcomes. It doesn’t show that this student is unique or exceptional.
This content just doesn’t cut it compared with other applicants to selective schools. In terms of the 5 traits, while there may be some drive/initiative here, the examples are weak.
At best, the student decides to teach some English as well as math to the Mexican student Pablo and to go on to tutor another student upon returning home. These are not … super impressive examples of going above and beyond.
The only way to improve the sense of this applicant’s drive and initiative would be to get more detail on the challenges involved in tutoring these two students and what the applicant did to overcome them. For example, did they seek out books for speed-teaching a child English? Did they consult with a great English-as-a-second-language teacher and use those lessons? Did they meet resistance from the program and overcome it somehow? Did they do this while simultaneously learning Spanish and overcoming jet lag?
We don’t know what the particulars were. And so none of it is in any way impressive.
[2] The essay has way too much descriptive language. Let’s look at the very first phrase: “the azure-blue day of my departure.” It may sound nice, but it takes up space while doing nothing for the admissions reader.
Azure-blue days have nothing to do with this student. The admissions officer is looking for a reason to move this application from the huge reject pile to the tiny accept one. The fact that this student once experienced good weather is not that reason.
Moreover, there was nothing about lovely departure-day weather in the prompt we fed ChatGPT. It made this fact up! Does it matter? Actually, yes. Your essay should be factual and authentic. This essay isn’t that.
[3] The essay’s plethora of buzzwords sounds nice but adds no value. Basically, the last four paragraphs of the essay are nothing but buzzwords. We singled out some examples earlier: “my belief in the transformative power of education,” “igniting a flame that has grown into a full-blown passion.”
Are these phrases going to get the admissions officer excited? Do they have a chance to move the essay from the reject pile to the admit pile? No!
Where is the proof that the student believes in the transformative power of education? There’s nothing to show that this applicant has done more than tutor two students and had an okay time doing it — barely meeting any obstacles along the way.
In addition, the essay doesn’t show us what actions the student has taken now that they believe in the transformative power of education. What effect is this “full-blown passion” for education having on the applicant’s life? There’s nothing here to convince us.
ChatGPT uses whatever content you give it to write essays — if you don’t give it enough, it fills the gaps for you. It has a few ways to do this, all of them bad.
Again, this is why the nice-sounding essays ChatGPT produces fail upon closer inspection. In our example essay, once the AI gets past the Loris story, it has multiple paragraphs of fluff where we learn nothing more about the student.
In addition, ChatGPT can’t yet reliably control word count. In other words, you can’t get it to write, say, a great 650-word essay or keep to under 200 words. Combined with its fluff-generating bias, this is a recipe for a lot of weak content.
We’ve illustrated this concept below. In the ChatGPT-generated essay, all the parts an admissions officer would consider fluff are in blue.
ChatGPT isn’t a total loss for college essays. There are ways it can be helpful. What you need are the right prompts.
We’ve spent many, many hours experimenting with ChatGPT and developing prompts that yield useful results. We’ll share more about the prompts in a future article. Here are three ways we’ve found to make ChatGPT more helpful.
For more tips on how to use ChatGPT for your college essay, sign up for a free Prompt account. If you want more individualized support, we have 1-on-1 coaching packages with experts who will help you write essays that will stand out to admissions officers.