How AI Is Quietly Transforming College Admissions
The (largely) unspoken arms race between students writing applications and schools reading them
Rick Clark is bullish about AI’s potential for transforming college admissions—for the better.
“It's sort of paradoxical,” Clark, the executive director of strategic student access at Georgia Tech, said in a 2024 interview on the podcast Rethinking College Admissions. “Artificial intelligence, and the improvement of technology implementation, actually can make all of this way more human.”
In the world of college admissions, Georgia Tech’s Clark is a rare species: willing, even eager, to talk about how his school is thinking about AI as a tool for aiding applicant evaluations and acceptances. Because, though schools mostly aren’t talking about it publicly, it appears that most admissions offices are using AI, somehow, some way, to ease what can be a crushing process. Northeastern University in Boston, for instance, received a record number of applications last year—105,000-plus. UCLA, the most-applied-to school in the country, logged more than 145,000 first-year applications for fall 2025.
At Georgia Tech, a highly competitive public university in Atlanta, the numbers are smaller but still, in their own way, staggering: 67,000 applications last year, fewer than 13% of whom were admitted, for a first-year class of about 4,000 spots. The school’s regular admissions staff is about 35 people, and they bring in an extra 60 as seasonal readers, Clark told a Boston NPR station last September. That works out to more than 700 applications per person, assuming each of those people takes an equal share of the pile.
If AI can relieve some part of this workload, bring it on, Clark said. AI can help out by “letting machines do what machines do better anyway, faster and more efficiently and accurately, [and] let humans do what they do better, which is interaction and relationship and connection,” he said.
There’s one widely quoted survey, from the practically prehistoric days of 2023, in which half of all admissions departments (both high school and higher ed) told Intelligent, an online education magazine aimed at prospective college applicants, that they used AI in the admissions process. More than 8 in 10 said they planned to use AI in the 2024 admissions cycle. And a majority of the schools that used or planned to use AI said the technology would have the final word on whether a student is admitted.
“Artificial intelligence, and the improvement of technology implementation, actually can make all of this way more human.” — Rick Clark, Georgia Tech
Many schools are mum about their own use of AI on their admissions sites. They do, however, have something to say about applicants using AI to write their essays.
Here’s one fairly typical warning, from Washington University (“WashU”) in St. Louis, Mo.: “We discourage you from using AI tools like ChatGPT as the main source of your essay’s content. Whether you’re sharing your achievements, activities, or skills, AI tools should not be the primary author.”
Or this, from Brown University in Providence, R.I.: “The use of artificial intelligence by an applicant is not permitted under any circumstances in conjunction with application content.”
The trouble is, it’s hard for even the most determined admissions officer to know for sure who (or what) wrote an essay. AI detectors can’t be trusted, and colleges know that. A June 2025 investigation from the news site CalMatters looked into plagiarism-detection software called Turnitin, which the publication reports is licensed to more than 16,000 institutions worldwide that cumulatively enroll over 71 million students. In California, for instance, Cal State University campuses have spent a combined $6 million since 2019 on Turnitin, the reporters wrote. The tool is considered so valuable that its eponymous, Oakland-based company was acquired by Advance Publications, the media conglomerate that owns Condé Nast, in a 2019 deal worth nearly $1.75 billion.
And yet, CalMatters reported, “the technology offers only a shadow of accurate detection: It highlights any matching text, whether properly cited or not; it flags everything that mirrors AI’s writing style, whether a student used AI inappropriately or not.”
“The use of artificial intelligence by an applicant is not permitted under any circumstances in conjunction with application content.” — Brown University website.
This software and its bugs are hardly an anomaly. A 2024 study in the journal Nature warns of the “pressing need” for tools that can flag when text has been written by or revised with AI. However, such tools appear “to be beyond the reach of current technology and detection methods,” the authors wrote.
All of which is to say: the danger in using AI to write a college essay is likely not a direct one. It’s hard to imagine admissions officers feeling so certain an essay was written by a bot that they’re willing to disqualify the entire application as a result.
The issue is more one of opportunity lost. I don’t think having AI write your essay for you is advisable. You risk ending up with an average, predictable piece of writing—and you come away no wiser about your own goals or how to express your own voice. College admissions offices say your essay is your chance to distinguish yourself. You hardly distinguish yourself when AI writes your essay.
But can AI help? WashU hints at this (AI shouldn’t be “the main source of your essay’s content”). Georgia Tech, on its website, says the quiet part out loud:
“If you choose to utilize AI-based assistance while working on your writing submissions for Georgia Tech, we encourage you to take the same approach you would when collaborating with people. Use it to brainstorm, edit, and refine your ideas. AI can also be a useful tool as you consider how to construct your resume in the Activities portion of the Common Application. We think AI could be a helpful collaborator, particularly when you do not have access to other assistance to help you complete your application.”
In other words, AI can level the playing field, especially for students who can’t afford essay coaches like me. Clark expounded upon this in the podcast interview:
“In talking to students over the years, there's what I call the tyranny of the blinking cursor… The blank page is terrifying. It’s why we encourage students to use ChatGPT, or whatever it may be. It's like, throw an idea out there and let's see what it gives you. And it's not going to be great, but it's going to be something, it's something to get the ball rolling.”
As for what’s happening on the other side, when the application lands at the school—for the most part, that’s an unknown. The University of California is one of the few that states its process is all-human, all the time: “Every application is read and evaluated by application readers — it’s a human process, not an algorithm… UC doesn’t use artificial intelligence in its application review process.”
I did a brief, very informal survey of other schools and could not find comparable information about AI application review at Harvard, Princeton, Stanford, WashU or the University of Michigan.
Virginia Tech, though, announced it would be wading into the AI waters this year. Like many other major universities, the public university in Blacksburg, Va., has seen its applicant pool rise year after year, hitting a new high last year of 57,622 first-year applications—a 10 percent rise from the year before. In the past, every application was read by two human reviewers. This coming application season, each application will be read by one human reviewer and one AI reviewer, using a large language model trained and tested by Virginia Tech researchers. The AI reader will not make final admissions decisions; it will serve as a backup to confirm the human reader’s score, said Juan Espinoza, vice provost for enrollment management.
“If the human and AI scores differ by more than two points, a second human reviewer is brought in to ensure consistency and fairness,” Espinoza said.
Most schools have internal scoring for applicants, formulas they use to give students a numeric score for comparison of hard data like GPAs and test scores, Clark said in the NPR interview. If that scoring could be handed over to AI, it could free up admissions officers and readers to more holistically consider applications, as well as to connect with students and their families on a more personal level, he said.
The University of North Carolina, Chapel Hill, says it’s already doing something similar. “UNC uses AI programs to provide data points about students’ common application essay and their school transcripts,” the school writes on its website. “Data points include writing style and grammar and the rigor of students’ coursework. This allows our admissions team to focus on the content of a student’s essay, the student’s grades, and the extent that they’ve challenged themselves in the classroom with a strong curriculum.”
On the same NPR program as Clark, Taylor Swaak, a senior editor at The Chronicle of Higher Education, sounded a cautious note. It’s one thing for a student to challenge an admissions decision made by a human, she said. It’s another thing entirely to challenge one made by a bot. “AI can't be held accountable,” Swaak said. “So on top of being a black box, there's really no accountability, where if something goes wrong or a student feels like they have been wronged, an AI tool reviewed their application and made a decision based on that.”
If a student wonders how a school is using AI in the admissions process, they have the right to ask, Clark told the NPR interviewer.
“AI can't be held accountable. So on top of being a black box, there's really no accountability, where if something goes wrong or a student feels like they have been wronged, an AI tool reviewed their application and made a decision based on that.” —Taylor Swaak, senior editor, The Chronicle of Higher Education
The schools, for that matter, have the right to ask applicants if they used a bot to write their essays.
But the answers may not be straightforward. Is AI making a problematic decision if it disqualifies candidates because their grades are so low that an admissions officer would have tossed their application on the first round? Is a student cheating with AI if they use it to brainstorm an essay? Or even to write a first draft that they then edit and rewrite and edit again to make it their own?
But who will dare to ask? And who will decide to tell the truth?
These are questions worth sitting with for anyone with skin in the game: college-bound students, their parents, high school teachers and guidance counselors, and university admissions staffers—who, no doubt, are already thinking about them.