Artificial intelligence
The rise of AI tools such as ChatGPT and Google Bard has sparked concerns about cheating among students in the education sector. (Image: Wikimedia Commons)

More than half of undergraduates say they use artificial intelligence (AI) programmes to help with their essays, a new study shows.

A survey of more than 1,000 UK undergraduates, conducted by the Higher Education Policy Institute (Hepi), found 53 per cent were using AI to generate material for work they would be marked on.

Additionally, one in four are using applications such as Google Bard or ChatGPT to suggest essay topics, and one in eight are using them to create content.

However, just five per cent admitted to copying and pasting unedited AI-generated text into their assessments.

This is the first UK-wide study to explore students' use of generative AI since ChatGPT thrust the technology into mainstream discourse in November 2022.

The findings show the use of generative AI has become normalised amongst students in higher education.

But does the data demonstrate a need for clear and updated AI policies from the institutions themselves?

Dr Andres Guadamuz, a reader in intellectual property law at the University of Sussex, said it was no surprise that more students were adopting AI and suggested institutions needed to be explicit in discussing how best to use it as a study tool.

"I've implemented a policy of having mature conversations with students about generative AI. They share with me how they utilise it," Guadamuz said.

"My primary concern is the significant number of students who are unaware of the potential for 'hallucinations' and inaccuracies in AI. I believe it is our responsibility as educators to address this issue directly."

The Education Endowment Foundation (EEF) is currently signing up secondary schools for a new research project into the use of AI to generate lesson plans and teaching materials as well as exams and model answers.

According to the EEF's proposal, the use of AI might instead help cut the workload burden on teachers, as well as improve the quality of their teaching.

Gillian Keegan, the education secretary, has suggested AI could take on the "heavy lifting" of marking and planning for teachers.

Last year, Prime Minister Rishi Sunak claimed the technology could be used to provide "personalised learning" to children at school.

Education was one of the public services he was most excited about AI's potential to transform, adding the technology could "reduce teachers' workloads" by assisting with lesson planning and marking.

Half of the 58 schools in England taking part in the EEF's project will be given a toolkit to create assessment materials such as practice questions, exams and model answers, and to tailor lessons to specific groups of children. The AI-generated lesson plans will be assessed by an independent panel of experts.

Prof Becky Francis, the chief executive of the EEF, said: "There's already huge anticipation around how this technology could transform teachers' roles, but the research into its actual impact on practice is – currently – limited.

"The findings from this trial will be an important contribution to the evidence base, bringing us closer to understanding how teachers can use AI."

The survey also suggests many students want their institutions to provide more AI-based tools.

While three in 10 (30 per cent) agree or strongly agree their institution should provide such tools, fewer than one in 10 (9 per cent) say it currently does so.

Over the past year, there has been an explosion of interest in generative artificial intelligence tools, which are capable of creating new content such as text, images and video.

Many people are excited by their potential to enhance learning, support students and reduce both student and staff workload. But there is equal concern over a potential epidemic of AI-based cheating.

Hepi's study found more than a third of students who have used generative AI (35 per cent) do not know how often it produces made-up facts, statistics or citations.

Guadamuz said essays handed in to him last year had clearly used unedited ChatGPT output, given away by the "boring" style in which they were written.

"The world is evolving, and as educators, we need to adapt by establishing clear guidelines and policies, as well as designing more challenging assessments. However, this is difficult in a resource-constrained environment where academics are already overburdened and underpaid," Guadamuz added.

However, the survey found that even as AI usage has spread, fewer students say they are willing to use it, suggesting a more cautious approach to the technology may be emerging.