- The Washington Times - Thursday, May 7, 2026

Arizona State University is turning heads with a first-of-its-kind AI course builder that generates automated lessons from professors’ lectures and materials within minutes.

Dubbed “ASU Atomic,” the experimental program invites users who pay $5 a month to describe their learning interests and goals to an AI chatbot named Atom. Drawing on years of online classes, Atom creates a personalized course with assignments, tests and videos within five minutes.

“Learn only what you want, when you want to,” the ASU Atomic website states. “Atom generates custom, self-paced learning modules using ASU course content — designed specifically for your individual learning objectives and schedule.”
As of Thursday, the pilot was full and no longer accepting signups due to “significant interest.”

Advocates say the non-credit program could do for higher education what streaming did for television — connecting people more quickly to desired content at a fraction of the cost.

“Imagine that we have thousands and thousands and thousands of courses,” ASU President Michael Crow told the Arizona Board of Regents in February. “And you can break these courses down into tens of thousands or hundreds of thousands of sub-component parts.”

Some professors have pushed back, arguing that agreements they signed to share online lessons with colleagues did not include artificial intelligence repackaging their work. They have labeled the program a “Frankenstein” that could misrepresent their classes and expose faculty who teach race and gender ideology lessons to targeted harassment.

Historian Donald Critchlow, director of ASU’s Center for American Institutions, predicted the program will grow regardless, as it lowers payroll costs and gradually produces stronger courses.


“What should be of greater concern is that faculty will be gradually replaced over time by AI bots,” Mr. Critchlow said. “ASU faculty have called ASU Atomic a Frankenstein, but in the novel and movies, the monster dies. In this case, the life of this AI creation has just begun and will grow and mature over time.”

ASU Atomic arrives as universities nationwide embrace artificial intelligence in the classroom. Stanford and Georgia Tech are also experimenting with AI-assisted course design.

Meanwhile, MIT, Harvard and Wharton have launched certifications with AI-powered teaching assistants. Duke encourages classes to use ChatGPT and Gemini to generate practice exams. The University of California, Berkeley has integrated AI into hybrid programs blending online modules with in-person classes.

Some campuses have gone further. Starting with this fall’s freshmen, Purdue will mandate an “AI working competency” for all undergraduates, partnering with Google to assess five graduation requirements centered on learning and using AI.

Dan Ye, a Maryland-based AI expert who lectures at Johns Hopkins, said Arizona State’s program could help higher education adapt to seismic changes on the horizon. He estimated that 80% of the nation’s colleges could face bankruptcy within 15 years as automation reshapes the white-collar job market and tuition costs approach $60,000 a year.


“The hard truth is that we may no longer need as many professors,” Mr. Ye said. “If AI can effectively teach foundational, factual courses like Calculus or Introductory Physics, then a smaller faculty footprint is a logical outcome of progress.”

The online learning platform Coursera estimated in a February report that 95% of faculty and staff were already using AI for personalized learning and real-time feedback. Studies have found AI learning modules especially helpful for math and science, where professors cannot provide individualized attention to large classes.

“The biggest benefit is scale,” said Marlee Strawn, co-founder of Scholar Education, which develops AI tools for K-12 classrooms. “In both K-12 and higher ed, one instructor is trying to meet the needs of many learners at very different levels. AI can help close that gap when it is used well.”

It remains to be seen whether ASU Atomic solves that problem or worsens it by presenting fragmented materials without the necessary context.


“The big question is whether this new tool will benefit students,” said Jonathan Zimmerman, a University of Pennsylvania education history professor. “Will the university conduct research to determine if the tool helps students learn? Or is the real goal here to enhance profit, not education?”

Governance failure

Critics say Arizona State has moved too fast and carelessly. The subscription site offers non-credit courses on subjects such as “starting a coffee roastery in retirement,” raising questions about how Atom restructures more rigorous academic content.

“ASU executed this concept poorly,” said Bob Hutchins, a behavioral psychologist and CEO of AI literacy company Human Voice Media. “It created a paid product using faculty-developed content without even asking the faculty members involved if that was okay. This is a governance failure.”


Doug Hughes, CEO of Boston education technology company Codio, said AI-generated content risks becoming generic without strong oversight.

“AI is making jobs more complex, not simpler,” Mr. Hughes said. “AI can get you 80-90% of the way there, but the last 10-20% requires structured human oversight.”

Omekongo Dibinga, a professor affiliated with American University’s Antiracist Research and Policy Center, warned that ASU Atomic poses a personal threat to faculty — potentially allowing vigilantes enforcing the Trump administration’s crackdown on race-based and transgender lessons to publicize out-of-context video lectures on sensitive topics.

“This is scary,” Mr. Dibinga said. “For $5, ASU is allowing potential stalkers to learn as much about professors as possible and share their information with the world, which will lead to more targeting.”


Steve Rosenbaum, executive director of the Sustainable Media Center, said public universities have an obligation to be transparent, but accuracy matters too.

“A lecture excerpted from a broader discussion can look very different on its own,” he said. “That creates real reputational risk, even if the original teaching was responsible and well-framed.”

Ashish Bansal, founder of AI math tutoring program StarSpark.AI, said ASU Atomic has yet to demonstrate that it can tackle these concerns.

“There is no apparent pedagogical model,” Mr. Bansal said. “What it produces is a chatbot in front of chopped-up faculty videos. That is exactly why the output reads as ‘Frankenstein.’”

• Sean Salai can be reached at ssalai@washingtontimes.com.

Copyright © 2026 The Washington Times, LLC.
