
The Right to Refuse AI Isn’t Anti-Technology — It’s Pro-Literacy

When Saying No Becomes the Smartest Move in AI Education


Here’s a phrase that shouldn’t sound radical in 2026 but somehow does: “Students and teachers have the right to refuse generative AI in the writing classroom.”

That’s the core of a resolution the Conference on College Composition and Communication — the world’s largest professional organization of writing educators — overwhelmingly passed at its annual convention in Cleveland earlier this month. And the reaction it’s sparking reveals everything about the state of AI literacy in education right now.

On one side, you have faculty who see mandatory AI adoption as a threat to academic freedom, critical thinking, and the fundamental purpose of writing instruction. On the other, you have educators who argue that refusing AI is like refusing the internet — a well-intentioned stance that leaves students dangerously unprepared. Both sides are making valid points. But almost everyone is missing the deeper question.

The real issue isn’t whether to use AI or refuse it. The real issue is whether anyone involved has enough AI literacy to make that choice intentionally.

The Data Behind the Resistance

The CCCC resolution didn’t emerge from paranoia. It emerged from data. A 2025 survey by the American Association of University Professors found that fifteen percent of faculty said their college or university mandates the use of AI. Eighty-one percent reported being required to use learning management systems and other educational technology embedded with AI tools they cannot turn off. At the same time, sixty-nine percent said AI is hurting student success, and ninety-five percent stressed the importance of meaningful opt-out policies.

Now, you might be thinking: “Ninety-five percent want opt-out policies? That seems extreme.” But here’s what that number actually reflects. It’s not ninety-five percent of professors saying AI is bad. It’s ninety-five percent saying they should have the professional autonomy to decide how and whether to use it in their classrooms. That’s a fundamentally different claim — and it’s one that aligns perfectly with what genuine AI literacy actually requires.

95 percent of faculty surveyed want meaningful opt-out policies for AI tools in education.

American Association of University Professors, 2025

Jennifer Sano-Franchini, an associate professor at West Virginia University and immediate past chair of the CCCC, put it directly: the claims that AI is inevitable and that students must learn to use it for their careers are assumptions worth unpacking, not mandates worth following blindly.

The False Binary That’s Hurting Everyone

Most articles about AI in education fall into one of two camps: “embrace AI or get left behind” versus “protect human learning at all costs.” But that framing is a trap. It forces educators into reactive positions instead of reflective ones.

The CCCC resolution is actually doing something more sophisticated than either camp allows. It’s asserting that the choice itself — the informed, deliberate decision to use or refuse a technology — is the educational act. Not the tool. Not the output. The critical thinking that precedes the decision.

This is what we call the Literacy-First Principle at Harvest Kernel. You can’t meaningfully adopt a technology you don’t understand. And you can’t meaningfully refuse one you don’t understand either. Both require literacy. Both require the kind of structured understanding that turns anxiety into agency.

Like what you’re reading? Get insights like this delivered daily. Join the free community →

Urgency is meant to flood the mind of the customer so they can’t take a pause to consider whether the thing that’s being sold really needs to be bought. I see no need for urgency.

Sarah Drimmer, via Inside Higher Ed, on AI adoption pressure in higher education

Why the Counter-Arguments Actually Prove the Point

The backlash to the CCCC resolution has been fierce, and honestly, some of it is worth hearing. One composition professor created a counter-petition arguing that refusing AI in 2026 imposes a professional handicap — that students who graduate without AI fluency will be at a measurable disadvantage in the workforce.

That argument has teeth. But here’s where it actually reinforces the case for literacy over mandates: if AI fluency is genuinely essential for career readiness, then the worst possible approach is forcing students to use tools they don’t understand in contexts where they haven’t been taught to evaluate them critically. That’s not education. That’s exposure without comprehension. And exposure without comprehension is how you get the exact AI-dependent, non-critical workforce that both sides claim to be worried about.

The SeedStacking approach offers a third path. Instead of mandating adoption or endorsing refusal, it starts with understanding. Seed the foundational knowledge. Then let the practitioner — whether that’s a writing professor, a K-12 teacher, or a solo professional — make an informed decision about integration. The choice becomes evidence-based rather than fear-based or hype-based.

What Universities Are Getting Wrong About “Falling Behind”

There’s a phrase that keeps appearing in every AI adoption memo circulating through higher education: “We can’t fall behind.” But as one scholar noted in the discussion around the CCCC resolution, nobody is asking the essential follow-up question: fall behind what, exactly?

Universities have signed multimillion-dollar agreements with technology companies to provide campus-wide access to proprietary generative AI services. Faculty and students report being left out of those decisions entirely. The pressure to adopt isn’t coming from pedagogy research demonstrating that AI improves learning outcomes — because that evidence barely exists yet. The pressure is coming from vendor marketing, institutional fear of irrelevance, and a technology industry that benefits enormously from universal adoption.

Now, that doesn’t mean the tools are worthless. It means the adoption curve has outpaced the evidence curve. And when adoption outpaces evidence, the people who slow down to ask questions aren’t falling behind. They’re the ones building on solid ground while everyone else is building on assumptions.

The Harvest Kernel Takeaway

The right to refuse AI is inseparable from the right to understand AI. Faculty who demand the space to evaluate these tools before deploying them aren’t resisting progress. They’re modeling exactly the kind of critical, evidence-based thinking we claim AI literacy is supposed to teach. If we can’t extend that same critical standard to our own adoption decisions, we’ve already failed the literacy test ourselves.

What This Means for Your Classroom, Your Office, Your Life

This debate isn’t just about writing professors. It’s about every educator, professional, and lifelong learner navigating the pressure to adopt AI tools right now. The principle is transferable: informed choice requires literacy. Literacy requires structured learning. Structured learning requires the freedom to question before you commit.

If you’re an educator being told to integrate AI into your curriculum without training, without evidence, and without the option to decline — you’re experiencing the exact problem the CCCC resolution is trying to address. If you’re a professional being handed new AI tools with the expectation that you’ll figure them out on the fly — same dynamic.

The path forward isn’t adoption-first or refusal-first. It’s literacy-first. And literacy-first means building genuine understanding through small, consistent steps — which is the entire foundation of the SeedStacking methodology. One concept at a time. One tool at a time. One intentional decision at a time.

Because here’s the truth the AI hype cycle doesn’t want you to hear: the people who will thrive in an AI-saturated world aren’t the ones who adopted fastest. They’re the ones who understood deepest.

Ready to go beyond reading and start building AI fluency?

Join the free Harvest Kernel community for practical guidance, fresh ideas, and tools that help you make AI useful in real life.

Join the Free Community

Sources

  1. Inside Higher Ed, “Writing Faculty Push for the Right to Refuse AI,” March 16, 2026
  2. American Association of University Professors, Faculty Survey on AI in Higher Education, 2025
  3. Conference on College Composition and Communication, 2026 Resolution on Refusing Generative AI in Writing Studies
  4. MLA-CCCC Joint Task Force on Writing and AI, Working Paper 3: Building a Culture for Generative AI Literacy, 2024

Dean Le Blanc

Founder, Harvest Kernel

AI literacy educator and creator of the SeedStacking methodology. Dean teaches educators, professionals, and lifelong learners how to build genuine AI fluency through small daily wins that compound into real capability. Join the Learning Community →
