Colleges Struggle To Cope With Increasing Number of Students Using AI To Do Their Work for Them

With academic work increasingly outsourced to AI, concerns are emerging over how schools can crack down on ‘AI plagiarism,’ a new kind of academic ethics violation.

People are reflected in a window of a hotel at the Davos Promenade in Switzerland. AP/Markus Schreiber

More and more students are putting down their pencils, pulling up ChatGPT or their favorite artificial intelligence chatbot in a web browser, and prompting it to write thousands of words on their topic of choice. Then they copy, paste, and hit “submit.” With academic work increasingly outsourced to AI, concerns are emerging over how schools can crack down on “AI plagiarism,” a new kind of academic ethics violation.

A study of tens of thousands of college and high school students across seven countries found a 76 percent rise in AI-generated material in their assignments between January of 2023 and January of 2024. According to the data, released last week by the platform Copyleaks, which distinguishes original from artificially created text, 30 percent of scanned assignments in America contained AI-generated content.

“High schools and colleges are increasingly aware of the challenges posed by AI-generated content and are implementing measures to address this issue,” the chief executive of Copyleaks, Alon Yamin, tells the Sun. “This includes educating students about the ethical use of AI tools, incorporating discussions about plagiarism detection software into academic integrity policies, and utilizing technology to identify and deter instances of AI plagiarism.”

The issue of plagiarism drew renewed attention after Harvard’s president, Claudine Gay, resigned in January following reports of ethical violations in her previous academic work. The same charge was soon leveled against the wife of one of her biggest critics, Bill Ackman. With the stain of being branded a “plagiarist” more damning than ever, would-be perpetrators have an increasingly powerful tool at their disposal: artificial intelligence.

“As we all know, the general definition of plagiarism involves passing off someone else’s ideas, words, or work as one’s own without giving appropriate credit, and this can be argued that it applies to content created by AI as well,” Mr. Yamin says. Because AI models may be trained on copyrighted or licensed data, a user could end up publishing protected content as their own. Using AI to generate content could also constitute plagiarism under an organization’s policies, provided those policies have been updated to account for the technology.

“If your essay replicates another person’s ideas but you don’t cite them, that might be plagiarism,” the legal director at the Electronic Frontier Foundation, Corynne McSherry, tells the Sun. “If your essay is substantially similar to another person’s work, that might be infringement.” Ms. McSherry says that the use of AI doesn’t change the analysis of plagiarism or infringement, though “a student using an AI tool might not realize her essay effectively plagiarized another work that was used as training data.”

Unlike copyright infringement, plagiarism is not illegal in America in most situations. To steal or pass off another’s ideas as one’s own is, rather, considered a violation of honor or ethics codes that can result in disciplinary action from a person’s school or workplace. 

Proponents of AI insist that its creations aren’t plagiarism. The platforms pull together information from a wide range of data sets and can therefore produce material that is highly specific and could arguably be considered original. Some consider this process different from the act of intentionally copying existing, human-created content.

Now that AI can write human-like text, though, the line is increasingly blurry between what constitutes an original human thought and what a model can generate from patterns learned during training.

Concerns center on how schools can ensure academic integrity in an evolving academic landscape. Though the Copyleaks study shows a 51 percent decrease in plagiarism rates, AI-generated content can be hard to detect.

And AI programs are highly popular. When the dominant AI chatbot, ChatGPT, was launched by the company OpenAI in November 2022, it attracted more than one million users in five days. According to the latest available data, ChatGPT currently has more than 180 million users. 

A host of plagiarism “checkers” or “detectors” have popped up recently to help teachers and professors determine whether their students actually wrote their essays themselves.

An example is an AI writing detection feature launched last year by the company Turnitin in partnership with academic institutions in response to the advances in ChatGPT’s technology. “Turnitin’s ability to point out writing that is likely generated by AI,” a spokeswoman for the company, Michelle Patrick, tells the Sun, “adds one more piece of information to what the educator can see about their students’ writing.”

With affirmative action outlawed, many colleges and universities are placing more weight on the essay portions of college applications. AI chatbots might seem like an easy way for applicants to fill out responses to a growing number of essay prompts.

Anticipating this, several schools have restricted the use of AI on college essays and applications, Mr. Yamin says. They’ve added safeguards to detect if, for example, a student taking an assessment test that requires coding arrived at their answer using AI-generated code.

College consultants warn against using AI to offload labor and outsource creativity. “ChatGPT might be useful for research, brainstorming, or organizing very initial thoughts, but I would not use it to write college essays,” the co-founder of the educational consulting firm Entryway, Jennifer Bloom, tells the Sun. “What ChatGPT does not have is a ‘voice’ or rather, its voice is bland, readily identifiable and Wikipedia-esque.”

When the Washington Post presented ChatGPT-generated college admissions essays to a former Ivy League admissions counselor, the counselor found the writing “terrible.” Students should use the tool “sparingly,” Ms. Bloom says, “and realize that what it does not offer is exactly what college admissions officers want to hear: a clear, passionate, authentic voice and reflective details about a student’s life.”

