Teachers are split on bringing ChatGPT into elementary, high schools
Educators must teach students 'when and where and how to use' new technologies, says math teacher
When high school teacher Jamie Mitchell asked his students about ChatGPT this week, he wasn't surprised when the vast majority said they already knew about or had used the artificial intelligence tool that seemingly everyone is talking about.
"It's everywhere. It's hard to ignore," noted the Burlington, Ont., math teacher. "Some kids are using it because … it's a fun laugh to get it to answer different questions. Some kids are using it for their schoolwork."
Much of the education sector's apprehension about students using ChatGPT to complete their assignments has thus far focused on the post-secondary level. OpenAI's highly accessible bot quickly answers user prompts with human-like responses of varying sophistication, gleaned from vast amounts of information available online.
Students' use of the application in elementary and high school is ringing alarm bells for some, while others have embraced bringing the technology into their classrooms.
Arming students with information, tools
Mitchell, who's also program leader for mathematics, computer studies and I-STEM at his school, had ChatGPT on his radar early. Upon its public release last fall, he dived in to test the chatbot and flagged his findings in an email to colleagues that he cheekily used ChatGPT itself to write.
Any educational tech tool can raise misgivings; Mitchell, for instance, has concerns about students using calculators in certain classes. But at this point, he believes ChatGPT is an interesting innovation with some limitations, and he's demonstrated that to his students.
Last term, Mitchell input calculus problems his students were solving into the AI bot, then asked the teens to review the answers that emerged.
"The tool made some great first steps in solving its equations, but after it got a few steps in, it started to do really wild, crazy, wrong things that the students picked up on," he said.
Though the tool may be good enough to fool people who don't understand calculus, Mitchell said that because his students know the subject matter quite well, "it was almost impossible to fool them."
He said if he discovered a student passing off a ChatGPT-completed assignment as their own, a conversation would be in order.
He would review appropriate uses for the bot, note how and when it shouldn't be used and work with the student to develop a plan so the situation didn't happen again. But he doesn't believe banning ChatGPT would be effective.
"The push of technology to move forward is kind of relentless," Mitchell said. "If we're not arming students with the proper tools — to know when and where and how to use these tools — well, we're doing them a big disservice."
'Where's the motivation to learn?'
Mindy Bingham hadn't heard of ChatGPT when a friend broached the topic over lunch in January.
However, when the author and educational consultant subsequently found her nine-year-old granddaughter testing the bot while noodling away on a tablet, the discovery prompted her to investigate the chatbot herself.
"Once it gets out into the world for these young children, once they realize the power of it, it's going to reduce their motivation to learn," she said from Santa Barbara, Calif.
"Where's the motivation to learn when you know a machine can do it for you?"
Because ChatGPT was introduced relatively recently, many educators haven't yet heard of it, and Bingham worries that tech-savvy youngsters who haven't developed a strong base of foundational knowledge and skills will use it to complete their homework.
Coming up with an idea and being able to expand on it, problem-solving, analyzing content — "that's what we do in elementary school … whether it's reading, whether it's mathematics, whether it's writing," she explained.
"Critical thinking is one of the key issues here that artificial intelligence will take away."
Digital tools should be used to support learning, not supplant it, Bingham said. She's now tested ChatGPT thoroughly herself, and though she sees its value for adults, she's calling for caution when introducing it to classrooms.
She thinks a ban in elementary schools makes sense, but acknowledges the difficulty of detecting AI-generated assignments at this stage.
"Just because something is here today doesn't mean we adopt it."
'Not a be-all, end-all,' warns university student
Studying both computer science and journalism, Princeton University student Edward Tian has long been fascinated by artificial intelligence, specifically how it can be used for writing.
The 22-year-old Canadian considers the technology behind ChatGPT brilliant and exciting, but also ripe for abuse. So, over winter break, he created GPTZero, an AI detector he hopes can provide some transparency.
More than 45,000 teachers from over 30 countries have signed up for updates about his detector, and he said he regularly hears from educators who say it's "reassuring" that he and his teammates are developing it.
GPTZero works on the premise that humans write with "sudden bursts and variation of writing style," Tian explained, whereas machine writing tends to be fairly consistent. His tool essentially reads submitted texts and searches for that "burstiness" within them.
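The article doesn't detail GPTZero's internals, but the "burstiness" idea can be sketched in a few lines of code. The toy Python script below is an illustration only, not GPTZero's actual method: the function name burstiness_score and the sample passages are invented, and sentence length stands in crudely for the sentence-to-sentence variation a real detector would measure with a language model.

```python
import re
import statistics

def burstiness_score(text: str) -> float:
    """Crude 'burstiness' proxy: how much sentence length varies.

    A real detector would score sentences with a language model; here,
    word counts per sentence serve as a rough stand-in for illustration.
    """
    # Naive sentence split on terminal punctuation.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    if len(sentences) < 2:
        return 0.0
    lengths = [len(s.split()) for s in sentences]
    # Higher spread relative to the average length = more "bursty",
    # which (very roughly) leans toward human-written text.
    return statistics.stdev(lengths) / statistics.mean(lengths)

if __name__ == "__main__":
    human_like = ("It rained all week. We stayed in, mostly, apart from one "
                  "long, soggy, badly planned walk to the bakery on Friday. "
                  "Worth it.")
    machine_like = ("The weather was rainy this week. We stayed indoors most "
                    "days. We went for a walk on Friday. The walk was enjoyable.")
    print(round(burstiness_score(human_like), 2))
    print(round(burstiness_score(machine_like), 2))
```

With these sample strings, the more varied passage scores noticeably higher than the flatter one; a production detector would compute this kind of variation over language-model scores rather than raw word counts.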
Another big concern for Tian is the way text-generating AI can be used to create misinformation: "news stories that might sound like real news, but are not," he says. That's why he's working on a browser extension that can flag AI-created text used that way online.
Though Tian agrees that younger students overusing ChatGPT could erode valuable core skills like writing and critical thinking, he's against bans, which he says students can easily bypass anyway. Instead, he encourages the responsible use of AI technology in education.
"One hundred per cent, we should explore and be exposed to the brilliant new technologies that are coming, but they're not a be-all, end-all," he warned fellow students.
"They're great at getting you started with ideas. They're not so good at doing the job and finishing the job. They're not so good at checking if the facts are right."
With files from Deana Sumanac-Johnson and Nazima Walji