A story has been appearing in the regional press about a cross-bench peer, John Pakington (Baron Hampton), who is, unusually, also a working teacher. He is concerned that students are using Artificial Intelligence systems to produce essays, technical designs and even works of art, and then passing them off as their own. He says:
There is a lot of anecdotal evidence, at the moment, that suggests that students are using AI for everything from essays and poetry to university applications and, rather more surprisingly, in the visual arts subjects. Just before Christmas, one of my product design A-level students came up to me and showed me some designs he’d done.
He’d taken a cardboard model, photographed it, put it into a free piece of software, put in three different parameters and had received, within minutes, 20 high-resolution designs, all original, that were degree level – they weren’t A-level, they were degree level. At the moment, it’s about plagiarism and it’s about fighting the software – I would like to ask when the Government is planning to meet with education professionals and the exam boards to actually work out [how] to design a new curriculum that embraces the new opportunity rather than fighting it.
Tim Clement-Jones is our Digital spokesperson in the House of Lords and he agreed with his fellow peer:
This question clearly concerns a very powerful new generative probabilistic type of artificial intelligence that we ought to encourage in terms of creativity, but not in terms of cheating or deception.
Some 30 years ago I was studying AI as part of my Master’s degree. Many of the same tropes were circulating then as now: “AI will make people lazy”, “Many jobs will be lost to machines” – similar sentiments have been expressed whenever there is a substantial shift in technology, from Jacquard looms to automated car production. But this time there is the added fear that AI will “take over” and we will become the redundant playthings of super machines. In practice, many of the techniques that I was looking at then are now embedded in our technologies; they improve productivity and are hugely beneficial to society. They support and amplify our activities rather than replace them, although, as this evidence suggests, they can also present new challenges.
Andy Boddington had some fun with the latest AI chatbot, ChatGPT, and generated a passable short essay and some rather dubious poetry. When I say “passable” I mean that it is almost impossible to tell that it was generated by software rather than by a real person. It is also possible that ChatGPT could pass the Turing Test and win the Loebner Prize.
Of course, the issue of plagiarism has dogged educational assessment for many years. Academics routinely use plagiarism detection systems for essays, and I have a couple of examples from my own professional experience.
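Before turning to those examples, it may help to illustrate, very roughly, what such detection systems do under the hood. The sketch below is a minimal, illustrative toy in Python, not how any particular commercial product works: it simply breaks two texts into overlapping word n-grams and reports how much they share, whereas real systems compare a submission against very large corpora of published work and previous submissions.

```python
# Minimal sketch of n-gram overlap, the basic idea behind many
# plagiarism checkers (illustrative only, not any specific tool).
import re


def ngrams(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Lower-case the text, keep only words, and return its word n-grams."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def similarity(a: str, b: str, n: int = 5) -> float:
    """Jaccard similarity of the two texts' n-gram sets (0.0 to 1.0)."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)


if __name__ == "__main__":
    submitted = "The Jacquard loom automated pattern weaving long before computers."
    source = "Long before computers, the Jacquard loom automated pattern weaving."
    # A high score flags the pair for a human marker to review.
    print(f"Overlap score: {similarity(submitted, source, n=3):.2f}")
```

Even this toy shows why such tools flag suspicious overlap rather than prove copying: a high score only tells the marker that two texts share unusually long runs of identical wording, and a human still has to judge what that means.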