More and more teachers and students are using AI – even though it might do more harm than good


K-12 teachers and students across the country are increasingly using AI in and out of classrooms, whether it is teachers turning to AI to refine lesson plans or students asking AI to help them research a particular topic.

An estimated 85% of K-12 public school teachers recently reported that they used AI during the 2024-2025 school year – often for curriculum and content development.

The share of teens who said they used ChatGPT to complete their schoolwork doubled from 13% in 2023 to 26% in 2025.

Similarly, 86% of K-12 students shared in 2025 that they have used AI in general. An estimated 50% of students reported that they use it for schoolwork, such as for learning more about topics outside of what was taught in class, tutoring on specific subjects, receiving help with a homework assignment or asking for college advice.

However, policies and training have not kept pace with how frequently teachers and students are using AI.

Only 35% of school district leaders reported in 2025 that they provided students with any AI training, according to the global policy think tank RAND Corporation. And just 45% of principals reported having school or district policies or guidance on the use of AI in schools, according to the same findings.

Another challenge is that students are also putting AI to potentially dangerous uses. There are recent examples of students who self-harmed or died by suicide after turning to AI for mental health support. A 2025 study found that when chatbots responded to 60 simulated scenarios posing mental health questions, they sometimes made harmful proposals – such as cutting off all human contact for a month or dropping out of school.

So, is it safe for young students to use AI? Does using AI provide better learning outcomes for students when compared to traditional instruction? Does AI help teachers reduce their workload?

The answers to these questions are complicated. It is not yet clear how AI influences learning in K-12 settings or when and how it is best for teachers and students to use AI.

A high school teacher in Colorado Springs hands out lesson sheets he created with the help of AI in November 2025. RJ Sangosti/MediaNews Group/The Denver Post via Getty Images

Some clear pros

As an associate professor of inclusive teacher education, I’m trying to answer some of these big questions about AI and K-12 education.

Some university centers that I’ve worked with, such as the Center for Innovation, Design, and Digital Learning at the University of Kansas, are conducting research on how AI can be used to support students with learning disabilities.

In 2025, 57% of special education teachers said they use AI to help develop individualized education programs, or IEPs, for their students with learning disabilities.

I believe there is no doubt that AI can, in some ways, reduce barriers and support students with disabilities. In my own research, for example, my co-authors and I show that AI can help students learn by adapting assignments to meet their personal learning needs and pace. It can also help teachers reduce their time spent grading or editing assignments.

There remain concerns over student privacy and whether AI systems will reinforce bias, but special education teachers are testing the benefits of generative AI.

The missing evidence

Among the broader research on AI and K-12 education, some studies from 2019 through 2022 suggest that AI might help students learn and stay motivated by providing a personalized learning experience. The evidence appears less promising, however, when considering how students fare after they use AI and then stop using it.

For example, Guilherme Lichand, an economics scholar at the Stanford Accelerator for Learning, found in 2026 that when students use AI and are then told they can no longer use it for their studies, they actually perform worse than students who never used AI at all. This suggests that additional research on how AI influences students' long-term learning and development is necessary.

The Brookings Institution also recently warned in a 2026 AI and K-12 education report that the risks of using generative AI in education overshadow its benefits. These risks include weakened relationships between students and teachers, as well as students’ safety.

A 2025 report by the nonprofit Center for Democracy and Technology also found that 71% of K-12 teachers reported that when students use AI to complete their schoolwork, it is hard for the teachers to tell whether the work is the students' own.

Similarly, almost two-thirds of parents of K-12 students said in 2025 that AI is weakening important academic skills that their child needs to learn, such as writing, reading comprehension and critical thinking.

Lessons from the past

AI is being introduced to K-12 classrooms faster than the evidence can keep up with. But this is not the first time schools have rushed to incorporate educational technologies into their classrooms.

During the COVID-19 pandemic, for example, schools needed to quickly equip teachers and students with online platforms for remote learning.

But the rush also challenged educators to learn how to effectively teach and provide individual support for each student – and to ensure that all students, including students with disabilities, could participate in remote learning.

Similarly, not long ago, some educators thought that social media and smartphones would bring the next frontier in education, with the idea that these technologies could increase student engagement. Yet we now know the dangers that both social media and smartphones pose for children.

Slowing down students' use of AI in the classroom, in particular, does not mean rejecting it altogether. I think it means being responsible – especially when there is a good chance children's academic skills, behaviors or emotions are at risk.

New evidence on AI and education is coming from scholars like me and my colleagues. There is little doubt that AI and future technologies are game changers in society and education.

I think it is also critical that we slow down and follow the evidence that is available. Speed is a choice, and education deserves intention.

The Conversation

Tal Slemrod receives funding from the US Department of Education.
