
ASU Built an AI That Repackages Professors. Nobody Asked Them.

Imagine logging into your university’s learning platform and finding your face staring back at you. Your lecture, chopped into a 60-second clip. Your ideas, stripped of context. Your name, attached to content you never approved. All available to anyone with $5 and a subscription.

That is not a hypothetical. It happened this week at Arizona State University.

What ASU Actually Built

ASU quietly launched a platform called Atomic, a subscription-based web app that uses AI to scrape faculty lectures, slide decks, and course materials from the university’s learning management system. It then chops those materials into short clips, generates AI text around them, and packages everything into personalized “learning modules” for anyone willing to pay $5 a month.

The platform is not for enrolled students. It is for the public. Non-credit offerings range from professional development topics like consulting and project management to oddly specific modules like starting a coffee roastery in retirement.

Here is the part that should concern every educator reading this: the professors whose content powers Atomic were never told it existed.

The Faculty Found Out on Social Media

Chris Hanlon, a literature professor at ASU, discovered the platform when he stumbled across his own face inside an AI-generated module. He described the result as “Frankensteinian.” A 12-minute lecture he had carefully constructed was slashed to just over 60 seconds, embedded inside AI-generated text that attempted to contextualize it and failed.

He reached out to colleagues whose content also appeared in Atomic. Not one of them had been consulted. Not one had heard of the platform before it launched.

“I haven’t spoken to a single faculty member who knew anything about Atomic,” Hanlon told reporters. Multiple outlets, including Inside Higher Ed, 404 Media, and local Arizona news stations, confirmed the pattern: faculty blindsided by an institutional AI deployment built on their own work.

Michael Ostling, a religious studies professor, attended a faculty Q&A where ASU President Michael Crow was asked about the platform. Crow acknowledged the project was “early stages” and “not ready for prime time.” He had previously described a similar concept, which he called “Project Atomizer,” to the Arizona Board of Regents in February. The name tells you everything about the philosophy: take whole courses and atomize them.


The Content Is Bad. That Makes It Worse.

If Atomic produced excellent learning experiences, the consent issue would still be a problem. But the content is not excellent. Faculty and journalists who tested the platform found academically weak material riddled with errors. One module garbled a reference to literary critic Cleanth Brooks. Context was stripped so aggressively that the remaining clips were, in Hanlon’s words, “devoid of context, just chopped up, kind of AI slop.”

Sam Cole, co-founder of 404 Media and the journalist who broke the story, put it plainly: the content is “pretty bad” and “full of errors.”

This matters because it reveals the core flaw in the extraction approach. When you strip an educator’s work from the context they built around it, you do not get a faster version of the same knowledge. You get a distorted version. The pacing, the framing, the careful sequencing that turns information into understanding: all of that disappears when AI chops a 12-minute lecture into 60 seconds of clips surrounded by generated text.

Why the IP Policy Makes This Possible

ASU’s intellectual property policy gives the Board of Regents ownership of “any intellectual property created by a university or Board employee in the course and scope of employment.” Content uploaded to Canvas, the university’s LMS, can be redistributed under the platform’s terms.

This is not unique to ASU. The American Association of University Professors has warned for years that universities are increasingly asserting ownership over course materials, particularly those created for online delivery. The trend accelerated during the pandemic when faculty rushed materials online, often without clear agreements about how that content would be used after the emergency ended.

The legal framework may permit what ASU did. But legality and wisdom are not the same thing. Ostling raised a chilling secondary concern: Atomic could make it trivially easy for political actors to pull decontextualized clips of professors teaching about race, gender, or conflict and weaponize them. Teaching is contextual. Remove the context, and you remove the protection that careful pedagogy provides.

The Real Problem: Extraction vs. Empowerment

Here is the question every institution should be asking right now: are we using AI to extract value from our educators, or to empower them?

The extraction model treats faculty expertise as raw material: something to be scraped, atomized, repackaged, and sold at scale. The educator becomes a commodity input, not a professional partner. Their judgment about how to sequence, frame, and deliver knowledge is replaced by an algorithm that cannot distinguish a carefully constructed argument from a random 60-second clip.

The empowerment model looks completely different. It gives educators AI tools that amplify their expertise, extend their reach, and let them serve more learners without losing the contextual intelligence that makes teaching work. The educator remains at the center, using AI as an instrument rather than being instrumentalized by it.

This distinction is not academic philosophy. It is the dividing line between institutions that will earn the trust of their faculty and institutions that will lose it.

What This Means for You

If you are an educator at any institution, this story should prompt three immediate questions:

First, what does your institution’s IP policy actually say? Most faculty have never read theirs. Now is the time. Understand what rights you retain over your course materials, particularly anything uploaded to an LMS. The AAUP maintains guidance on faculty intellectual property rights that can help you evaluate your own institution’s policies.

Second, how is your institution deploying AI internally? ASU faculty did not know about Atomic until it was already live. That information asymmetry is a governance failure. If your administration is piloting AI tools that touch instructional content, you should know about it before launch, not after.

Third, are you building your own AI fluency? The best defense against being reduced to raw material for an AI system is becoming someone who understands how those systems work. When you can evaluate what AI does well and where it fails, you can advocate from a position of knowledge rather than anxiety. The educators who build AI literacy now will be the ones who shape how AI is used in their institutions, not the ones who discover their lectures have been atomized without their knowledge.

The Seed

ASU’s Atomic platform is not the future of AI in education. It is a cautionary tale about what happens when institutions treat AI as a tool for extraction rather than empowerment. The technology itself is not the problem. The philosophy behind it is.

Your expertise has value. Your pedagogical judgment has value. The context you build around your content is not packaging to be discarded. It is the teaching itself.

AI will reshape how education is delivered. That is not in question. The question is whether educators will be partners in that transformation or raw material for it. The answer depends on whether you are building your own AI fluency now, or waiting until someone else decides how your work gets used.

That is today’s seed. What you do with it is up to you.

Ready to go beyond reading and start building AI fluency?

Join the free Harvest Kernel community for daily discussions, hands-on AI courses, and a network of educators and professionals who are building real AI skills together.

Join the Free Community

Dean Le Blanc

Founder, Harvest Kernel

Building AI literacy for educators and professionals through the SeedStacking™ methodology. Join the community →
