In October 2023, a concerned parent asked people in an online forum for their thoughts on an issue: they had discovered that their son had “plagiarised a book report assignment with ChatGPT”.
“Honestly, I was pretty alarmed at how nonchalant he was about the whole thing,” the post says. “I kind of see his point of view in that he has read the entire book, can speak to the details of the story, and that he was basically just using ChatGPT to assemble thoughts and ideas that he has already kind of had floating around in his head already, but my stance is that at the end of the day, plagiarism is plagiarism, it’s not his original work, and it’s unethical, plain and simple.”
What ensued sums up our collective confusion and fear about artificial intelligence (AI). Replies ranged from questions about how detectable the technology is to debates about the very nature of knowledge and whether AI will spell the end of humanity.
In November 2022, the world was introduced to ChatGPT, an artificial intelligence-powered chatbot that allows users to have a “conversation” about almost any topic and to instruct it to undertake all manner of tasks, including writing book reports.
It was a watershed moment for AI, and for its application in all sorts of settings, including schools. Users have already found that the technology can deepen students’ understanding, offer a remarkable level of personalised support and free up a huge amount of teachers’ time. But there is an obvious fear as well: will students cheat and get bots to do much of their work for them?
AI is changing the education landscape. The question for parents is how to embrace the benefits while protecting against the dangers. It’s a hard balance.
Research from the Center for Democracy and Technology found that 58% of students had used ChatGPT, and 90% of staff said they suspected AI was being used to complete assignments.
The sudden emergence of the technology prompted some frenzied responses in education. In Australia, state schools banned ChatGPT in early 2023, before reversing the ban later that year, with the education minister conceding they were “playing catch-up”. Public schools in New York City followed a similar trajectory, rescinding their own ban in summer 2023.
AI detection software was rapidly introduced in an attempt to weed out chatbot-created work, but many institutions quickly found it unreliable, and several publicly announced they would stop using it to reduce the risk of false accusations.
Dr Vaughan Connolly, an education researcher and recent visiting scholar at the University of Cambridge’s Faculty of Education, says staff need to be having conversations with students about academic integrity and “building trust” around the use of AI.
In his experience, most students want to learn and make progress rather than cheat, and will “rightly worry” that the “inappropriate use of AI may undermine that learning”. He argues that schools need to ensure students understand that letting the technology do the work means they aren’t developing the skills they are expected to.
“That’s much better than trying to use technology to try and police it,” he explains. “Because then you're into a race where one technology is just trying to get ahead of the other.”
When students first started using the technology at Georgia Tech in the US, it was fairly easy to detect, says Dr David A Joyner, executive director of online education at the university’s College of Computing.
In May 2023, Joyner implemented a policy to set clear guidelines about the boundaries of acceptable AI usage, and realised that such a policy “pretty much already existed”, as “the same rules apply to collaborating with humans as with AI”. So just as students would know not to ask a friend to write an essay for them and pass it off as their own, he explains, they should not do the same with AI. The rubrics, however, have changed.
He likens it to the advent of past technologies: word processors meant students were expected to produce more refined work, since they could go back and edit, while search engines upped the ante for finding sources.
“The question is: what task does generative AI do in a particular field, and how does that help students achieve what the teacher wants them to achieve?” he asks.
Dr Bruce Geddes, deputy head of secondary at the British International School in Kuala Lumpur, says AI represents “the biggest opportunity we've had in our lifetimes, for many, many spheres, but particularly in education”.
“Teachers rightly look at it as risky because we get kids to produce stuff all the time,” he says. “But we don't really care about what they produce, what we care about is them doing the thinking activity they're supposed to be going through when they produce the stuff. It is only the thinking that leads to learning.”
Dr Geddes has adapted the technology for his students, building a bespoke, AI-powered teaching assistant. Learners define the topic they want to explore and the course specification they are working to, and the AI explores the content with them in a conversational way, enabling them to ask questions as they go.
Importantly, it then takes them through a set of activities designed to make them think about the content in increasing depth, from basic recall up to deeper reasoning tasks. All of this is followed by a quiz and a summary with suggested follow-up areas.
Pre-AI, pupils would spend time simply reading through notes or slides or a chunk of the textbook, he explains, but with the AI bot they can go through a full learning cycle on that same content, with an opportunity to ask endless questions.
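The article does not detail how Geddes built his assistant, but the staged learning cycle he describes maps naturally onto a scripted sequence of prompts to a large language model. The sketch below is a minimal, hypothetical illustration in Python using the OpenAI SDK; the model name, stage wording and example topic are all assumptions, not the school’s actual bot.

```python
# Hypothetical sketch of a staged AI teaching assistant, loosely modelled on
# the cycle described above: conversation, recall, deeper reasoning, quiz.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

STAGES = [
    "Explore the content conversationally and invite the student's questions.",
    "Ask basic recall questions about what has just been covered.",
    "Set deeper reasoning tasks that apply the content to new situations.",
    "Finish with a short quiz, a summary and suggested follow-up areas.",
]

def run_stage(topic: str, spec: str, instruction: str, history: list) -> str:
    """Send one stage of the learning cycle to the model and return its reply."""
    messages = [
        {
            "role": "system",
            "content": (
                f"You are a teaching assistant. Topic: {topic}. "
                f"Course specification: {spec}. {instruction}"
            ),
        },
        *history,  # a real bot would interleave the student's own turns here
    ]
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    return response.choices[0].message.content

history: list = []
for instruction in STAGES:
    reply = run_stage("Photosynthesis", "GCSE Biology", instruction, history)
    history.append({"role": "assistant", "content": reply})
    print(reply, "\n")
```

The point of the structure is the progression Geddes describes: the same content is revisited at increasing depth rather than read through once.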
Geddes is about to begin a full pilot of the bot but says the students have offered very positive feedback so far. They particularly like being able to ask questions they may not have time for in class or may feel embarrassed to ask in front of their peers, he says, and they find it much more engaging than the traditional approach.
This allows for “amazing, off-the-scale differentiation in the classroom”, with content tailored to each learner to an unprecedented degree.
Exploring the use of AI can bring learning opportunities too, says Dr Isabel Fischer, an associate professor of information systems at Warwick Business School in the UK. Lecturers at the institution are invited to choose from three options around AI: they can prohibit, allow or actively encourage its use. Fischer opts for the last of these and assigns her students the task of writing a 500-word reflection on how they have employed it.
“Can they take a critical view on their work and the AI used within it?” she asks.
Fischer has created an AI essay analyst tool to which students can upload their work ahead of submission, receiving feedback on readability, word choice and the quality of referencing.
The aim is not for students to have generative AI write for them, she says, but for them to get support with their writing. This helps those who can’t or won’t get support elsewhere, or who may have undiagnosed dyslexia, for example. “They can be confident that what they wrote is as good as if generative AI had written it,” she says.
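Fischer’s tool is not publicly documented, but pre-submission feedback of this kind can be approximated with off-the-shelf components. Below is a minimal sketch, assuming the textstat package for readability and a simple pattern match for author-date citations; the specific checks are illustrative, not those of the Warwick tool.

```python
# Hypothetical pre-submission essay feedback: rough readability, word-choice
# and referencing signals. Assumes the third-party textstat package
# (pip install textstat); not the actual Warwick Business School tool.
import re
import textstat

def essay_feedback(text: str) -> dict:
    """Return rough readability, word-choice and referencing signals."""
    words = re.findall(r"[A-Za-z']+", text)
    long_words = {w for w in words if len(w) > 12}  # crude word-choice flag
    citations = re.findall(r"\((?:[A-Za-z]+,?\s*)+\d{4}\)", text)  # e.g. (Smith, 2021)
    return {
        "flesch_reading_ease": textstat.flesch_reading_ease(text),
        "overly_long_words": sorted(long_words),
        "inline_citations_found": len(citations),
    }

print(essay_feedback("Operationalisation matters (Smith, 2021). Short sentences help."))
```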
Avenues: The World School in New York encourages its students to use AI in their work. In an app development project, for example, students use ChatGPT to generate the bulk of the code, then review, correct and refine it. This saves them “hours of manual work by leveraging the appropriate tool in an academically appropriate way,” says Lia Muschellack, director of technology.
The school also has its own generative AI chatbot, Savvy, created in 2019 and now powered by OpenAI technology. It can answer queries, provide information, and engage in “diverse discussions ranging from academic topics to casual conversations”.
“We understand that our students will be actively leveraging these tools throughout their academic and professional pathways, so we want to make sure they not only understand the potentials and limitations, but that they have tinkered and truly experienced them,” Muschellack explains.
Students are constantly reminded of the school’s academic integrity standards, she continues, and that they would be “stealing from their future selves” if they let the tools do the work for them without building their own skills first.
Vaughan Connolly, the education researcher, agrees that learners who aren’t exposed to AI may well be at a disadvantage. “This is the world that students exist in and we need to educate them about it properly. Otherwise, they will be going into jobs and having to then adapt enormously,” he says.
Dr Geddes says the key for parents is to understand—and communicate—that AI certainly can produce content that can be passed off as students’ own, but that to do so means not doing the thinking necessary for learning.
“They, hopefully, wouldn't get their friend to do their homework for them, would they?” he says. “Getting AI to do it is exactly the same. Instead, they can ask it to test them on things they are trying to learn, or to provide additional explanations for things they don't understand, and use it in ways that can really support their learning.”