Almost 80% of Australian uni students now use AI. This is creating an ‘illusion of competence’


In Australia, artificial intelligence is becoming a near-universal feature of education.

As of 2025, nearly 80% of university students reported using AI in their studies. Overseas, reports are even higher. This year, a UK survey of undergraduates found 94% were using it to help with assessed work.

This has ushered in widespread concerns about students using AI to cheat on their work and exams. But in a new report with my colleague Leslie Loble, we argue there is a far greater risk.

There is a growing body of evidence that suggests using AI can undermine the effort required for sustainable, deep learning. This so-called “cognitive offloading” from human to AI is especially risky for younger students as they are still building their basic knowledge and skills.

The ‘performance paradox’

Our report highlights a phenomenon known as the “performance paradox”. This is where students’ short-term performance on tasks may improve with AI, but their long-term learning is harmed.

An example comes from a 2025 randomised experiment with high school students in Turkey, who used an AI assistant that could tutor them through answers. In classroom tasks, they appeared to solve maths problems more effectively with AI. However, their actual learning fell off a cliff as soon as the AI was removed for an assessment.

These findings suggest that while AI can boost immediate results, it can simultaneously diminish the durable knowledge that is the true goal of education. At the same time, students can overestimate how much they have learned: AI gives them an illusion of competence.

AI is so easy to use

Generative AI can certainly provide clear, polished responses to students. Research tells us this can signal to the learner that deep mental engagement is no longer necessary.

This same research also shows students are then less likely to plan, monitor and revise their work. This is because the tool is doing this for them.

This situation creates a cycle where the ease of AI-generated responses erodes a student’s actual knowledge base, making them more dependent on the tool and less able to judge its accuracy in the future.

Critical thinking is not a generic skill – it is deeply intertwined with knowledge.

In other words, it is difficult to critically analyse a response about the second world war (Is it biased? Are the dates wrong?) if you don’t know much about the different participants and their perspectives.

How can we respond?

To address this, universities and teachers must move from treating AI as an “answer oracle” to using it as a partner in thinking and learning. There are two key ways to do this.

  • Use AI to offload extraneous tasks – such as checking grammar or formatting citations. This frees up mental space to concentrate on learning. Crucially, it means not relying on the AI to tell students what or how to think.

  • Use AI as a “cognitive mirror”. Instead of giving answers, the AI asks clarifying questions. This forces the student to engage in explanation, which helps them build lasting learning. For example, if a student provides a vague argument in an essay, the AI might ask them to define their core assumptions more specifically.

Most importantly, the development of AI tools must focus on building teachers’ capacity, not just students’ immediate performance. As powerful as AI might be, humans learn better with and from other humans.

By giving expert teachers AI tools that extend their capacity, we can ensure technology bolsters student learning. For example, AI could analyse student performance data in real time to highlight which small groups or individuals most urgently need a human intervention.

What is this all for?

Education systems need to help students understand and be comfortable with the fact that long-term learning can take time and needs effort. If AI is used to replace the struggle of learning, there is a risk of the erosion of cognitive skills.

The goal here is not to protect students from AI but to prepare them to live and work with it.

The Conversation

Jason M. Lodge receives funding from the Medical Research Future Fund. He has, in the past, received funding from the Tertiary Education Quality and Standards Agency (TEQSA), Australian Department of Education, Queensland Department of Education, the Office for Learning and Teaching, the Higher Education Research and Development Society of Australasia, the Australian Society of Computers in Learning in Tertiary Education and Echo360.
