The Minnesota Daily

A look into what UMN knows about cheating with ChatGPT

After an academic year that saw the viral emergence of AI technologies, the University is working to keep professors informed and hold students accountable.
Graphic by Ava Weinreis
The November 2022 release of the chatbot changed how universities responded to cheating.

The University of Minnesota has taken note of ChatGPT and how students are using it to cheat after the AI chatbot’s release in November 2022.

ChatGPT, an artificial intelligence (AI) chatbot capable of producing human-like text from prompts, sparked discussions at universities across the country. Some highlighted its potential to supplement learning, while others raised concerns over its use for cheating. 

As the University has grappled with best practices surrounding ChatGPT, the offices tasked with disciplining students have noticed an uptick in cases of academic dishonesty related to the emerging technology.

Provost Rachel Croson said in a campuswide email on April 27, “unauthorized use of online learning support and testing platforms is considered a breach of academic integrity and is subject to the same set of sanctions that would result from other forms of scholastic dishonesty.” 

The message told students to talk with their professors about using AI tools if they had not already clarified what is permitted in their class.

University records showed that during the academic year in which ChatGPT emerged, the University’s Twin Cities campus did not track data specific to the use of AI, only broader data on academic discipline.

Office for Community Standards Director Sharon Dzik said during the past academic year, all cases of online cheating were grouped together, including AI tools and online tutoring websites where students can find test answers, such as Chegg and Course Hero. 

Although the office is still processing cases from the spring semester, online cheating made up a little more than 20% of all cases of academic dishonesty in the 2022-23 academic year, according to the Office for Community Standards.

As more ChatGPT cases came, Dzik said her office saw the need for a new category to track AI-specific cases. Starting July 1, cases of academic dishonesty related to AI are tracked separately from other online cheating.

However, the University of Minnesota Duluth did keep specific data on complaints of academic dishonesty related to AI during the past academic year. As of June 6, there had been 11 complaints, six of which resulted in disciplinary action for the students involved, according to University records.

Four of the remaining cases did not result in disciplinary action, with one still pending further review, according to Chris Kaberline, director of the Office of Student Conduct and Conflict Resolution at Duluth.

Kaberline said it was not until March that she started to see more cases related to the suspected use of ChatGPT.

“It’s just escalating,” Kaberline said. “I think a good number of the incidents that I’m receiving from faculty have to do with ChatGPT or some AI misuse.”

University records showed the Rochester, Morris and Crookston campuses reported zero complaints.

Navigating ChatGPT in the classroom

Dzik said cases of academic dishonesty with ChatGPT are often not hard for professors to prove thanks to tools like ZeroGPT, which can detect AI writing. 

“They’ll get a paper and the paper will look funny to them,” Dzik said. “They’ll be like, ‘This does not look like something this particular student would have written based on their past work or based on the way it’s written.’”

Dzik said she usually sees “pretty clear cut cases.” If a teacher prohibits the use of online resources in students’ work and shows evidence a student used AI, it is not hard to hold a student accountable.

During the spring semester, the University’s Center for Educational Innovation (CEI) created a guide giving context and considerations surrounding AI, such as how to detect AI writing and how to leverage the technology in the classroom.

Ilene Alexander, a teaching consultant for CEI, said the guide was created for instructors, faculty, teaching assistants and “anyone who collects assignments at the University.” 

Alexander said CEI has been approached by instructors and faculty curious about everything from how to prohibit the use of AI in their syllabus to how the technology can be used to assist students with learning disabilities.

She added she was even approached by a faculty member who wanted to learn how to use ChatGPT for their own productivity, such as making presentations more efficiently.

“They’re doing that because they want to have practical experience for guiding their own students,” Alexander said.

Alexander said that while the requests CEI has been getting from faculty differ somewhat by discipline, there have been some commonalities. Available resources, how to offer guidance and support to students, and how to make sure students are doing their own work without relying on plagiarism checkers are all requests CEI has received for the next school year.

“I think the common theme for the coming year is, ‘How do I do this work without being overly harsh, prescriptive and ruling it out entirely?’” Alexander said.
