Should students use apps to write assignments? Attitudes on U of C campus surveyed

A research group at the University of Calgary wants to know what professors and students think about using AI in higher education.

University of Calgary researcher says it's already too late to ban AI text generators

ChatGPT is shown answering a prompt on a device. The program was released to the public late last year. (Peter Morgan/The Associated Press)

Can professors tell the difference between AI-generated writing and words written by a student? And could technologies such as ChatGPT be used ethically for course work? Or would using them be akin to plagiarism? 

These kinds of questions will soon be put to University of Calgary professors and their students as part of a research project led by Sarah Elaine Eaton, an associate professor at the Werklund School of Education. 

According to Eaton, who specializes in researching academic misconduct, there has been a "knee-jerk" reaction by some educators, who have suggested banning students from using AI programs in their work. 

But to Eaton, it's already too late to ban students, particularly those at the post-secondary level, from using the technology. 

"I think these apps are already everywhere," she said this week on the Calgary Eyeopener. "We can't stop the train, so to speak. So the big question for us in higher ed isn't 'How do we ban this technology?' Instead, it's 'How do we use it ethically and how do we teach our students to use it ethically?'" 

Eaton's research group is composed of researchers from several departments, such as English and engineering, and includes an official with the university's student accessibility services. 

"We all have really different starting points," Eaton said. "In education, I'm concerned about how this is used in the classroom. My colleague in English is wondering how this is used in creative writing, and my colleague in engineering is really interested in machine learning — he says bring it on." 

The results of their study could help shape academic misconduct policies at universities across Canada, if not the world, especially as the public increasingly comes to realize how powerful the AI technology is, and how rapidly it's improving.  

Sarah Elaine Eaton is leading the research group looking into the ethics surrounding the use of AI in higher education. (Clayton McGillivray)

Did AI eat your homework? 

Late last year, the San Francisco-based company OpenAI released its ChatGPT program to the public. The cutting-edge chatbot left users — from journalists to technologists, educators to businesspeople — astonished and perplexed. 

Given virtually any prompt, the large language model could quickly produce a textual response with surprising eloquence and accuracy. 

At first glance, the technology appears to pose a significant problem for any class that assesses writing, particularly in elementary and secondary education. 

However, Jason Wiens, a professor in the English department at the University of Calgary, who is part of the AI research group, said ChatGPT isn't yet producing university-level writing. 

"We're in sort of uncharted territory here when we consider the implications of this technology," he said. "It's in some ways comparable to when the calculator came along and the implications that had for mathematics." 

Starting this month, the research group will begin collecting data, surveying students and educators on the University of Calgary campus. 

The questions will probe how well students and professors understand AI technologies and measure the extent to which each group can differentiate between AI-generated text and writing created by a human. 

Additionally, and perhaps most importantly, the surveys will ask students and professors their thoughts on ethical questions surrounding AI use in the classroom and on assignments. 

The research group hopes to have results by the end of the term.

Learning from AI 

Educators are already thinking of creative ways to use AI for teaching purposes. 

For instance, Wiens suggested that in an assignment focused on editing text, students could be tasked to give an AI a prompt and then edit the resulting draft, revising and improving the language. 

He noted that while AI is getting better at generating text, similar technology could also be used in the future to critique or even grade it. 

"In an absurd scenario, you could have a student submitting work generated by a machine and then have it evaluated or assessed by a machine," Wiens said, adding this isn't the academic future he necessarily hopes to see.

The research group plans to begin surveying University of Calgary professors and students this semester. They hope to have results by the end of the term. (The Canadian Press)

To Eaton, the technology is something educators will need to grapple with, because these programs will be part of the landscape in which students will build their careers. 

"I think it's kind of incumbent on us to learn how to teach students ethically, because chances are that they are going to have access to these apps when they go to work," she said. 

On the question of misconduct, Eaton said she's unaware of any AI provisions on the books in any Canadian academic misconduct policy.

"This tech wasn't even around a year ago," she said. "So, for people to automatically say it's misconduct or automatically say it's plagiarism, I urge caution on that, because I'm not convinced it is."

ABOUT THE AUTHOR

Jonathon Sharp is a digital journalist with CBC Calgary. He previously worked for CBS News in the United States. You can reach him at jonathon.sharp@cbc.ca.

With files from Nathan M. Godfrey, Loren McGinnis