Admissions office sidesteps formal AI policy, refers applicants to podcast
As early action deadlines approach, some universities have begun to consider the implications of AI for college application essays, but Yale has refrained from outlining a formal policy.
As current high school seniors rush to assemble their college applications, they face an unprecedented dilemma: whether to seek assistance from language-generating artificial intelligence tools.
This application cycle marks the first time applicants have widespread access to language-generating chatbots like ChatGPT, which launched in November 2022.
Some colleges, like the Georgia Institute of Technology and the Sandra Day O’Connor College of Law at Arizona State University, have issued formal statements on applicants’ use of AI. Yale’s admissions office informally warned against AI-generated admissions essays in an Aug. 29 episode of its podcast, “Inside the Yale Admissions Office,” but has refrained from issuing an official policy statement.
“Given how much insight and explanation we provided in the podcast, I hope that Yale has now shared more information and advice about AI and college essays than just about any other college,” Jeremiah Quinlan, dean of undergraduate admissions and financial aid, wrote in an email to the News. “The fact that we have not yet tried to reduce the complexity of this topic to a short policy statement is a reflection of our desire to share more, not less. We may add written insights on this topic to our webpage in the future. But for now, we are directing students to our podcast.”
Quinlan is referring to a recent episode of the podcast — hosted by Mark Dunn ’07, senior associate director for outreach and recruitment at the Office of Undergraduate Admissions, and Hannah Mendlowitz ’12, associate director of admissions — titled “AI and College Essays: Wrong Question, Wrong Answer.” In the episode, Dunn and Mendlowitz raise concerns about ethics, plagiarism and misrepresentation in the use of generative AI to help construct admissions essays.
When students submit written content to colleges, Dunn and Mendlowitz said in the episode, they must sign a statement affirming that all work submitted is the applicant’s own. Submitting work written by ChatGPT or another generative AI model violates this affirmation, they said.
“If you’re asking questions about artificial intelligence as you start approaching your college application, we think those are the wrong sorts of questions to be asking,” Dunn said in the podcast episode. “The right sorts of questions are about you, who you are and what you want to include in your application. Similarly, artificial intelligence is not going to be the answer to this bigger question of well, how do I improve my chances of admission?”
Dunn and Mendlowitz explained in the podcast episode that admissions essays are not used to assess a student’s writing ability, nor can they help an otherwise unqualified student gain admission.
Rather, the purpose of college essays is to help admissions officers gain a more comprehensive understanding of an applicant, Dunn and Mendlowitz said. They added that students’ essays should resonate with and expand upon other parts of their application.
“Even if you think that AI is going to be a better composer of English language prose than you are, it is not going to be better than you are at speaking for you,” Dunn said in the episode. “And from our experience, we can tell you that speaking for you is much more important than whatever levels of polish are on top of your writing.”
Alfred Guy, director of undergraduate writing and assistant dean of academic affairs at the Poorvu Center for Teaching and Learning, has looked extensively at the ability of generative AI models like ChatGPT to imitate student writing.
What he found, he told the News, is that AI tools cannot yet produce analytical writing on par with human work, largely because AI language models draw on past writing and are trained to present objective responses.
“It’s still the case that if there’s any reflective, synthesizing or judgment quality needed in the writing assignment, you can tell the difference between student writing and ChatGPT,” Guy said. “The relationship between having an idea and expressing it in a nuanced, slightly narrower way, is still something that a sophisticated language model can’t quite get.”
It is easy, according to Guy, to differentiate an 18-year-old’s writing from the writing of a machine, especially when that writing calls for more than a simple recollection of facts.
However, he added that it is possible to coax ChatGPT into producing decent analytical essays by working and reworking the prompt fed into the AI platform. This process of trial and error reflects a level of skill and judgment similar to what a well-constructed essay demonstrates, Guy said.
“Students do a lot of things over the four years that they’re at Yale, and writing papers by themselves is only one of those things,” Guy told the News. “The truth is, if you can do six iterations of something on ChatGPT with the right parameters, you’re practically writing. You are demonstrating a skill that probably would make you an interesting person to have in a Yale class. It’s not the exact same talent as someone who writes something surprising and weird on their first shot. But you do have to be at least a good enough reader and have enough judgment to push ChatGPT where you want it to go.”
After speaking with the News, Guy experimented with using ChatGPT to produce an essay in response to one of Yale’s supplemental questions, which asks students to consider a time they discussed a topic with someone holding an opposing view. After a few iterations, he told the News via email, he produced a response that was “not terrible.”
Will Lehrhoff, a senior at Horace Greeley High School in Chappaqua, New York, is currently beginning to draft his college admissions essays. He spoke with the News about his experience grappling with the option to use AI tools as he completes his applications.
“Both teachers and college counselors have told us to stay away from using AI to help with admissions essays,” Lehrhoff said. “I think it’s fine for helping to come up with conceptual ideas, but actually using the language AI gives you is very risky. We don’t know what the colleges are thinking, and you could definitely get in very serious trouble.”
Guidance from other universities
Many colleges, like Yale, have not yet outlined a formal policy regarding the use of AI on admissions essays.
The Georgia Institute of Technology is one of just a few colleges to take a different approach. This year, it added a statement to its admissions website on how applicants should use AI tools in their applications.
“Tools like ChatGPT, Bard and other AI-based assistance programs are powerful and valuable tools,” part of the statement reads. “We believe there is a place for them in helping you generate ideas, but your ultimate submission should be your own. We think AI could be a helpful collaborator, particularly when you do not have access to other assistance to help you complete your application.”
The News spoke to Rick Clark, assistant vice provost and executive director of undergraduate admissions at Georgia Tech, about the rationale behind his office’s statement and his advice to other colleges on how to address the emerging AI question in college admissions.
Clark explained that without a surefire way to identify application essays written by AI chatbots — and without a uniform AI policy issued by its broader university — the Georgia Tech admissions office decided to construct its own statement on the use of AI by applicants.
According to Clark, the objective of the statement is to clarify that, while a student should never copy and paste directly from an AI platform, they should not be afraid of using tools like ChatGPT to help them brainstorm and edit, the same way they might use a college counselor or parent.
Clark pointed to a recent New York Times article that considers the ability of AI tools to democratize student access to writing help as they complete their college applications.
“I would absolutely not say that it completely levels the playing field, but I do think it is a step in the right direction in terms of a bit more equity, a bit more democratization in the college application process,” Clark said. “And I think that’s a good thing. And I hope that my colleagues at Yale and other admissions offices around the country would feel that way too.”
Many prospective students have expressed relief at Georgia Tech’s decision to release an official statement, Clark told the News.
Clark also noted that students often perceive college admissions as a closed-off, secretive process, and he hopes to create a more open dialogue between students and admissions offices. Releasing an official statement allows students to base their decision to use AI on fact rather than speculation, he added.
“AI is a reality; it’s ubiquitous within high schools and within the lives of teenagers,” Clark told the News. “So we can’t hide under a rock and ignore that it exists. It’s incumbent upon us to give students some guidance, or to adopt a policy.”
He added that when admissions offices choose to remain silent, students must choose either to get “more stressed” or to “come up with their own narrative or speculation” about what a college does or does not want.
He also stressed the importance of continual revision to college policies on AI, as the technology is still in its early stages of widespread use and is evolving rapidly.
Clark urged admissions offices to consider the nuances of the situation rather than issuing a “knee-jerk” reaction and banning the use of AI in essays. He emphasized the importance of transparency between admissions offices and high school students.
Quinlan and Dunn maintained that a short policy statement fails to acknowledge the complexity of the issue at hand. They encouraged students who are curious about Yale’s stance on AI in admissions essays to listen to the podcast episode.
“It is not the case that ever using one of these tools, even when working on your college essay, is a flagrant violation of policy … We aren’t going to dig into the specifics of where that line is, but let’s just keep it simple: artificial intelligence should not write your college essay, and we think it’s not going to help you in the areas that most matter to making your essay stand out,” Dunn said on the podcast.
As of February 2023, ChatGPT had over 100 million users.