Why AI Abstinence Policies Are Backfiring

I believe we need a new term in our conversations about AI in education: AI Abstinence.

When I read through school districts’ AI policies, they often read more like legal protection documents than visionary statements about learning. Typically, they state that students can't use AI without parental permission. Then comes the blanket requirement: all writing must be the student’s original work or else there will be consequences.

After interviewing both teachers and students, I started calling this approach AI Abstinence, a directive telling students to stop doing something that’s incredibly tempting.

These policies often label AI use as “cheating,” despite the fact that a growing number of people, teachers included, use it daily. At their core, these policies reflect a power dynamic: “You must do what I say.” But much like traditional abstinence programs, students simply aren’t following the script.

When I speak with administrators and teachers, I often hear, “My students don’t use AI. And if they did, I would know.”

As an AI interaction researcher, I had to ask: Is AI Abstinence actually working? Or do students have a different perspective?

I surveyed 200 high school and college students and interviewed 20 in depth about how they use AI in schoolwork. You can watch my standing-room-only talk at SXSWEDU on the topic here:

AI Use Is Prolific

Every student I interviewed had used AI for classwork. In fact, most were using it for every assignment.

“Pretty much all of my friends use AI every time.”

One student even likened AI to a drug:

“I don’t drink, but it’s like testing alcohol. You try it once, then the next day you want more. Soon, it’s just how you do things.”

I asked him to show me an assignment he had written without AI. He couldn’t think of one. AI had written his entire paper on “Why Integrity Is Important for Leadership.”

This blog aims to unpack what’s led to AI becoming the default writing strategy for so many students.

Why Are Students Using AI So Much?

1. AI Detectors Aren’t Catching AI Use

Students often start by asking ChatGPT to write their essay, then paraphrase it line by line. They talk about this process casually, like running spell check.

“I just used AI for the rest of the paper,” one student told me. “I try to change some words so it doesn’t look like AI in my teacher’s eyes.”

Tools like Turnitin claim to detect AI with 99% accuracy, but that number only applies to unedited, fully AI-generated text. Once a student paraphrases it, the detection fails. It’s human writing, technically, but not human thought.

2. AI Is Improving Students’ Grades and Efficiency

“ChatGPT helps me sound more intelligent—like I really know what I’m talking about.”

Many students don’t feel guilty about using AI. In fact, some take pride in optimizing their schoolwork.

One student told me he turned in a fully AI-written paper, and the teacher praised it in front of the class for going “above and beyond.”

For others, AI is simply essential to meeting basic expectations:

“My teacher wants 300 words. I can’t even think of that. So I use AI. It helps me get stuff in.”

Another student used AI for an assignment titled “Write a letter to your mom.” Assignments that once felt manageable now seem impossible without AI.

That worries me. Will the next generation of writers ever experience the challenge and growth that comes from writing without AI?

3. The Ethics of AI Abstinence Don’t Register

AI Abstinence is built on the idea that AI is harmful to students. But their lived experiences tell a different story: their papers are stronger, they have more free time, and they’re getting better grades.

“In the end, I’m now a better student, a better person, because this is the future. This is how things are done now.”

Some students even feel that they’re learning more with AI:

“I don’t think I’m missing out. I actually feel like I’m learning more. It’s like a smarter version of a human brain.”

Of course, teachers know the difference between reading something and knowing it. And generating your own ideas is an entirely different level of learning.

4. We’re Not Having Transparent Conversations with Students

Under AI Abstinence, teachers aren’t allowed to talk openly about when and how AI should or shouldn’t be used. Many teachers told me they’re afraid to accuse students of AI use for fear of damaging the student-teacher relationship.

As a result, students are left to formulate their own AI ethics. They typically don’t know when AI is helping them learn or when it’s doing the thinking for them. They don’t see that writing is more than just “getting it done.” They don’t see that writing is about developing higher-level skills like critical thinking and voice.

Rampant AI Use Is Preventing Critical Thinking

A blank page is intimidating. AI can eliminate that anxiety with the push of a button.

But it’s in that struggle with the blank page that real learning happens.

Writing about “Why I want to go to college” requires self-reflection and voice (Elbow, 1998). Writing “Why integrity matters in leadership” asks students to think critically about values (Vygotsky, 1978). Writing about environmental protection according to a textbook helps internalize those lessons (Wood, 1976).

Good teachers don’t assign writing just to check off that students read the material. The real learning is in the writing: grappling with ideas, clarifying arguments, and shaping a message.

Moving Beyond AI Abstinence

In a previous article, I wrote about how students often lack ways to feel that they’re learning. The same applies here: students can’t always see the value in abstract skills like “critical thinking,” “voice,” or “creativity.” These skills are hard to measure, so they are easy to ignore.

Just as the risks of premarital sex aren’t immediately visible, we need better ways to help students understand the long-term dangers of over-relying on AI.

One method I tested was using an AI-powered Socratic tool that asked students to explain the reasoning behind their AI-written papers. Every student failed.

One senior planning to study environmentalism could only say the word “deforestation” when asked to explain climate change.

But the important part isn’t that I “caught” her—it’s that she caught herself. She realized she didn’t internalize the reading. She didn’t know the material.

What Could Come Next?

What other tools can we build to make critical thinking more tangible? I have been experimenting with:

  • A final paper that’s a documented conversation with Socrates himself.

  • An AI-led podcast where students must own their voice.

  • Level Up, a free tool that acts like Grammarly but helps students self-reflect instead of giving answers.

I believe we’re just scratching the surface. There are better ways to teach writing skills that go beyond the five-paragraph essay.

If you're interested in exploring or building these new approaches together, please reach out. I’m actively looking to partner with schools and districts.

Let’s move beyond AI Abstinence—and into a more honest, innovative, and empowering future for student learning.


Sources:

Elbow, P. (1998). Writing with power: Techniques for mastering the writing process. Oxford University Press.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.

Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89–100.
