Samantha Gleisten has made her share of mistakes teaching generative artificial intelligence to middle school students in Chicago.
When she first invited a group of eighth graders to create chatbots with a software program two years ago, one trained his AI to be a narcissist who gave antagonistic “I’m better than you” responses. Another created an AI Snoop Dogg who veered into inappropriate drug references in improvised rap lyrics.
She soon learned how to be quicker at reining students in — and choosier about selecting AI software with guardrails for children that she could tailor to the classroom.
“I wanted to show my students how to engage a new technology, but I didn’t stop to think what was appropriate,” said Ms. Gleisten, who directs education technology at Rogers Park Montessori School and co-founded the company AI Education last year. “Fortunately, it didn’t get scary, and now I know how to check the privacy policies and vet the tools I’m using.”
She’s one of thousands of K-12 teachers who have worked to transform AI chatbots from a nuisance into a necessity since ChatGPT took campuses by storm in late 2022.
Education insiders interviewed by The Washington Times said the evolution of AI in schools has unfolded in three stages: banning generative AI to prevent cheating, developing AI usage policies and requiring “strategic integration” of AI literacy instruction.
“Initially, there was panic, fear about cheating, misinformation, loss of jobs, but the conversation has matured,” said Gadi Kovler, CEO of Radius, an AI platform for teachers. “Students don’t need to study AI as a concept as much as they need to be flexible, critical thinkers who can adapt to rapidly evolving tools and workflows.”
Generative AI platforms like ChatGPT respond to written or spoken prompts by producing new text, images and music, drawing on patterns learned from an ever-expanding body of training data.
Most teachers initially resisted AI as a threat to traditional learning, then gradually embraced it.
Turnitin.com — a website teachers use to detect plagiarism in assignments — released an AI-detection tool in March 2023 that claimed to be 97% effective at flagging computer-generated writing in essays.
As more schools adopted AI for learning feedback, tutoring and group projects, Turnitin.com pivoted this year.
In March, the company announced the launch of Turnitin Clarity, a “composition workspace” to help students “draft writing assignments with transparency” and receive AI-generated feedback to improve their work.
The new program’s AI writing assistant uses a teacher’s assignment instructions to guide students in writing and editing a submission over multiple sessions. It includes a video playback feature that lets teachers review a student’s entire drafting process, including copied-and-pasted text and typing patterns.
“AI-generated writing is not a binary concept with rigid lines around what is or is not acceptable,” said Annie Chechitelli, Turnitin’s chief product officer. “Instead, this technology is a true disruption, requiring us to rethink many aspects of our world.”
While interactive chatbots can pass multiple-choice exams and create deepfake recordings of people’s voices, administrators stress the need to develop thoughtful human users who can draw deeper insights from the technology.
“AI apps allow learners to write, speak, perform and construct a lesson or problem from any location,” said Michael Liebmann, an assistant superintendent at the Matawan-Aberdeen Regional School District in New Jersey. “They cannot replace the relationships that are created between the teacher and the children in the room.”
Beyond essays, generative AI has helped students understand difficult math questions.
The homework-learning app Brainly launched “Ginny” — a ChatGPT-powered chatbot that helps students expand or simplify answers to complex math and science problems as a learning aid — in March 2023.
For example, Ginny can analyze a student’s answer to a difficult calculus homework or study problem and offer a step-by-step explanation of the correct solution.
In a March 2025 study of 3,682 U.S. high school students, Brainly found that 67% planned to use AI to prepare for their final exams, up from 59% a year earlier. Another 80.6% of respondents said AI could improve their grades, up from 77% in 2024.
“We’re realizing that one-size-fits-all AI chatbots aren’t capable of adapting to each student’s individual learning style, emphasizing the need for personalized learning companions,” said Bill Salak, Brainly’s chief technology officer. “It’s important that schools teach students to become strategic users of technology not just as consumers, but as smart, effective decision-makers.”
Rapidly multiplying AI platforms have threatened to overwhelm some campuses.
Heather Peske, president of the National Council on Teacher Quality, said schools still struggle to train teachers how to use AI with appropriate materials.
“There are a lot of ‘resources’ out there that teachers use to supplement their district-provided instructional materials and many of them are low quality,” Ms. Peske said. “Given the nature of AI models, chances are high that AI will draw from these poor materials and perpetuate low-quality instruction.”
Experts urge students to start with the simplest AI platforms and watch carefully for “hallucinations,” instances in which the tools confidently present false information.
“I recommend sticking to one or two platforms like ChatGPT so that they can learn the ins and outs of that one before exploring the festival of other tools and apps that are springing up every day,” said Dan Ulin, a psychologist who founded the Los Angeles-based Elite Student Coach to help teenagers get into top colleges.
AI literacy
Policymakers on both sides of the aisle have called on K-12 schools to teach AI literacy over the past year.
California Gov. Gavin Newsom, a Democrat, signed a law in October that requires AI literacy instruction in the state’s K-12 classrooms.
President Trump signed an April 23 executive order directing the Education and Labor departments to prioritize funding and opportunities for high school students to take AI classes and certification programs.
“American schools took big steps towards a screen-based educational system during the COVID-19 pandemic,” said Yaron Litwin, chief financial officer of the AI-driven Canopy Parental Control app, which helps parents filter digital content. “Now, they are beginning to implement AI literacy initiatives on the federal, state and local levels.”
Tech industry employers argue that students with AI skills will have a better chance of landing future engineering, science and math-related jobs.
“Students require baseline AI literacy across all subjects, not just in computer science classes,” said Dev Nag, CEO of QueryPal, a San Francisco-based customer support automation company.
Mr. Nag pointed to national surveys showing that the share of teachers using AI jumped from 1 in 5 in early 2023 to more than 40% by the end of 2024. Over the same period, he noted that the share of teenagers using AI increased from 37% to 70%.
Sher Downing, CEO of Downing EdTech Consulting, said schools are moving to integrate AI in three areas from the earliest grade levels: a redesigned curriculum emphasizing human skills, new forms of testing that AI cannot easily replicate and programs ensuring AI access at all socioeconomic levels.
“Successful implementation hinges on using AI to augment rather than replace teaching, establishing clear ethical policies, and fostering teacher experimentation,” Ms. Downing said.
AI has also been effective in connecting emotionally and intellectually with special education students.
“It can help students with autism explore topics they love, ask creative questions, and engage in learning that’s personalized, meaningful and relevant,” said Katie Trowbridge, a Florida-based education consultant and former public high school teacher. “It can adapt content to fit their strengths, offer visuals or simplified language when needed, and even model social scenarios in low-pressure, safe ways that build confidence.”
Lingering concerns
According to education experts, a gradual curriculum of AI literacy from kindergarten through high school will best prepare students for future success.
Nevertheless, financial limitations and lingering concerns about academic dishonesty have kept AI out of many schools.
“When it comes to what to avoid with AI, I would caution against outright banning AI in the classroom,” said Caroline Allen, chief program officer at the right-leaning Center for Education Reform and a former teacher. “I would also advise against relying on AI-generated content without vetting it.”
Cyber safety experts say schools that use digital literacy programs to integrate AI across all grades and classes have had fewer disciplinary issues than campuses that relegate it to computer science classrooms.
“Rather than banning it altogether, teach students how to use this tool well,” said Allison J. Bonacci, director of education for Cyber Safety Consulting, an Illinois-based company that works with schools to develop internet safety policies. “Age-appropriate AI literacy can be integrated into all classes, not just tech classes.”
According to a 2024 UNESCO report, students’ critical thinking scores rose by 18% on average in schools that introduced AI with digital literacy programs. By contrast, they fell by 9% in schools that allowed AI without a digital literacy program to guide it.
“If students begin to treat AI like a shortcut for thinking, they may lose opportunities to build foundational cognitive skills,” said Marlee Strawn, co-founder of Scholar Education, a company that develops AI tools for K-12 classrooms.
Dana Bryson, senior vice president of social impact at the online learning platform Study.com, said another problem is that poor and minority communities have lagged in teaching AI.
She pointed to a recent Study.com survey that found 54% of teachers saw the promise of AI for individualized learning, but 64% worried it would contribute to “wider learning gaps.”
“Affluent communities and schools have more quickly embraced AI tools, while schools serving under-resourced households are often left out or even avoid them altogether,” Ms. Bryson said. “That tells us AI is neither inherently good nor bad. It’s a tool, and how we use it will determine whether it helps close gaps or deepen them.”