San Francisco, late July 2024. At the headquarters of OpenAI, the company behind the generative artificial intelligence tool ChatGPT, analysts notice a massive 90 per cent surge in its use in the Philippines. The school year in the South-East Asian country has just started.

That anecdote came from Sarah Friar, OpenAI’s chief financial officer, at its recent education forum, which also confirmed what some of us suspected: most worldwide users of ChatGPT are students. As Prof Marc Watkins, director of the Mississippi AI Institute, pointedly asked: “What app has the majority of users active eight to nine months out of the year and dormant for the holidays and summer breaks?”

This is the climate in which “additional assessment components” (AACs) are being introduced to the Leaving Certificate, with new specifications starting in seven established subjects from September 2025, followed by another seven in 2026 – including English, which I teach – and eventually all subjects.

It has already been decided that at least 40 per cent of the final grade in every subject will go to an assessment done outside the exam hall, marked by the State Examinations Commission.

The theory is that by reducing the burden of terminal exams and distributing assessment elsewhere, stress will be relieved and students will be better able to demonstrate “key competencies”. It is just as likely that stress will be spread further, with Leaving Certificate students tackling six or seven high-stakes extra components on top of their terminal exams, even if those are shorter.

There are rumours that some core subjects may have assessments pulled back into fifth year, potentially undermining school life in areas like sport, and thus student wellbeing.

The unfortunate coincidence is that just as these new components are being introduced to the Leaving Certificate, a supercharged technology which could destroy their integrity is developing both rapidly and relentlessly.

It is already clear that there is no reliable technical solution for detecting AI use in student work. At last month’s researchED Belfast conference, Bradley Busch of Inner Drive cited a recent German study which found that only 38 per cent of teachers can accurately tell whether work submitted by their students is their own or AI-generated.

The same study found that teachers are overconfident in their ability to detect this. The best AI detection technology is accurate around 67 per cent of the time: this means that about a third of students are either wrongly accused of cheating or are just getting away with it, and the researchers concluded that “detection tools are neither accurate nor reliable”. In the words of one of the world’s best-informed commentators on AI in education, Prof Ethan Mollick of the Wharton School at the University of Pennsylvania: “AI detectors are prone to errors and should not be used on individuals.”

Meanwhile, here in Ireland, in a recent edition of the National Association of Principals and Deputy Principals’ Leader magazine, Prof Áine Hyland called for postponement of the introduction of the 40 per cent AACs, pending study of international experiences of GenAI as used for assessment: “Given the high-stakes nature of the Leaving Cert and the fact that there will always be some parents who will do whatever it takes to secure a place on a high-points university programme for their offspring, it is unrealistic and unfair to ask teachers to verify that the submitted work is the unaided work of the student.”

Worse still would be self-declaration by the students themselves. This is essentially what is optimistically asked for in the current history, geography and RE coursework, worth 20 per cent of the final grade in the pre-reform Leaving Certificate. Just ask an 18-year-old studying economics whether they or any of their mates are using ChatGPT for their research project: will they declare it if almost half their grade depends on it? Teachers are nervous that such a self-reporting approach will form the basis of the promised “comprehensive guidelines” on the use of AI: such guardrails would be made of balsa wood.

So to my own subject, taken by every post-primary student in the country. In theory, candidates could sit a controlled-conditions oral examination in English and have it sent on for external assessment, just as already happens in Gaeilge, where the oral is worth 40 per cent. But this will not happen: there is no capacity for the enormous amount of time, money and organisation it would require. Unless candidates complete work in controlled and supervised conditions without access to online technology, we will face the likelihood that 40 per cent of assessment in our subject will be wide open to undetectable manipulation. An English AAC in a low-stakes environment might be an excellent idea, but every form of assessment in the Leaving Certificate process is very high-stakes indeed.

Whatever the evident faults of the Leaving Certificate, it has always been regarded internationally as an assessment system with a high level of public trust, a trust that we risk losing. ChatGPT was launched less than two years ago: we cannot imagine what extraordinary capabilities GenAI will develop in the next two years before the first AACs are completed.

There was a lot of fuss after the budget about the €9 million mobile phone pouches, an attempt to protect schoolchildren from the damaging effects of technology. It will be sadly ironic if we allow another form of technology to damage the integrity of the Leaving Certificate.

Julian Girdham is an experienced English teacher. He writes on books, teaching and education generally at www.juliangirdham.com.