
AI as a College Major: We're Credentializing Something We Barely Understand

Written by Writing Team | Dec 5, 2025 1:00:00 PM

The University of South Florida enrolled over 3,000 students this semester in a new college of artificial intelligence and cybersecurity. MIT's "AI and decision-making" major is now the second-largest program at the school after computer science. Dozens of universities have announced AI departments, majors, minors, and interdisciplinary concentrations in the last two years alone.

[Embedded TikTok from @aeyespybywinsome: "credentializing? credentialozing? credentialified?"]

We're credentializing AI at breathtaking speed—creating four-year degree programs around technologies that didn't exist in their current form when these students were in high school. And nobody seems particularly bothered by the absurdity of teaching people to master a field that's rewriting itself every six months.

The Boom Is Real (And Completely Predictable)

ChatGPT made AI feel accessible and urgent. Nvidia's valuation made it feel lucrative. Google and Microsoft announcing programs to train millions of students and workers made it feel inevitable. Of course colleges are racing to meet demand. They'd be irresponsible not to.

Students see AI majors as a bet on employability in a market where tech jobs command premium salaries and AI expertise is the credential du jour. Universities see AI programs as enrollment drivers in an era of declining college attendance and mounting questions about ROI. Companies see trained AI talent as the bottleneck limiting their ability to deploy these technologies at scale.

Everyone has rational reasons for participating in this boom. That doesn't mean the boom makes sense.

What Does an AI Major Even Teach?

MIT's program focuses on developing AI systems and studying how technologies like robots interact with humans and the environment. That's admirable and substantive. It's also teaching students to build systems whose capabilities and limitations are still being actively debated by the people who created the underlying technologies.

You can teach machine learning fundamentals. You can teach neural network architectures and training methodologies. You can teach the mathematics of optimization and the statistics of model evaluation. These are real, technical skills with decades of research backing them up.
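To make "decades of research" concrete, here is a toy sketch of gradient descent, the optimization idea underneath most modern training loops, in plain Python. It is an illustration only, not material from any particular program; the loss function, learning rate, and step count are arbitrary choices.

# Gradient descent on a toy problem: minimize f(x) = (x - 3)^2.
# The idea (repeatedly step against the gradient) predates the current
# AI boom by decades and will outlive today's model architectures.

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step against the gradient, starting from x0."""
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2 * (x - 3); the minimum is at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # approaches 3.0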

But AI as a field right now isn't just evolving—it's convulsing. The techniques that dominate today might be obsolete by the time these students graduate. The ethical frameworks we're teaching may prove inadequate the moment these systems hit production at scale. The "best practices" being codified in curricula are provisional at best, marketing mythology at worst.

We're not teaching timeless fundamentals like calculus or thermodynamics. We're teaching a discipline that's still figuring out what it is.

The Credentialization Trap

Creating formal degree programs around AI serves important functions: it legitimizes the field, creates pathways for research funding, and provides structure for students who want expertise but don't know where to start. It also creates a dangerous illusion that AI is a solved domain with established knowledge ready to be transmitted through traditional academic structures.

Four years is a long time in AI. Students enrolling today will graduate into a market where the models they learned on are obsolete, the tools they mastered are deprecated, and the problems they studied may have been solved—or revealed to be intractable—by technologies that didn't exist when they started college.

This isn't unique to AI. Computer science has always dealt with rapid technological change. But there's a difference between teaching programming principles that transfer across languages and teaching students to be "AI experts" when expertise itself is poorly defined and rapidly shifting.

The Industry Demand Argument

Defenders of AI majors will point to industry demand. Companies need people who understand these technologies. Fair enough. But what companies actually need are people who can think critically about AI applications, understand limitations, identify appropriate use cases, and navigate the gap between marketing claims and technical reality.

You don't necessarily need a four-year AI degree to develop those skills. You need strong foundational knowledge in computer science, statistics, and whatever domain you're applying AI to—healthcare, marketing, logistics, finance—plus hands-on experience with current tools and a healthy skepticism about vendor promises.

The risk is that AI majors produce graduates who know how to use today's tools but lack the foundational depth to adapt when those tools change or the critical perspective to question whether AI is even the right solution to a given problem.

What Universities Should Actually Be Teaching

If we're going to have AI majors, they should emphasize fundamentals over frameworks, principles over specific implementations, and critical thinking over technical skills alone. Students need:

Strong mathematical foundations in linear algebra, probability, and optimization. Deep understanding of computer science fundamentals including algorithms, data structures, and systems design. Statistical literacy that goes beyond running models to understanding what results actually mean. Domain expertise in at least one applied field where AI creates value. And critically, courses on ethics, bias, failure modes, and the social implications of deploying these technologies at scale.
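To ground the statistical-literacy point, here is a deliberately simple sketch with made-up numbers: on imbalanced data, a "model" that has learned nothing can still report an accuracy figure that looks impressive on a slide.

# Hypothetical numbers: 1,000 customers, 5% of whom churned.
labels = [0] * 950 + [1] * 50

# A "model" that always predicts the majority class (nobody churns).
predictions = [0] * 1000

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
recall = sum(p == 1 and y == 1 for p, y in zip(predictions, labels)) / sum(labels)

print(f"accuracy: {accuracy:.0%}")  # 95%, despite learning nothing
print(f"recall:   {recall:.0%}")    # 0%, it misses every churned customer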

Taken together, that's a computer science degree with a specialization, not a standalone AI major. But "AI major" sells better to prospective students and their parents who want assurance that four years and $200,000 will lead to employable skills.

The Uncomfortable Reality

We're creating AI majors because there's demand, not because we have a coherent vision of what AI expertise means or confidence that what we teach today will matter in four years. That's not a failing of universities—it's an honest response to market pressure and genuine student interest.

But we should at least be honest about what we're doing: betting that formal credentials in a rapidly changing field will provide value to students despite the near certainty that much of what they learn will be outdated before they graduate.

Maybe that's fine. Maybe the real value of an AI major isn't the specific knowledge but the credential itself, the signal to employers that someone was committed enough to dedicate four years to studying this field regardless of whether the details remain relevant.

Or maybe we're just credentializing hype because that's what institutions do when faced with genuine technological disruption and genuine student demand.

Either way, we're going to find out. Three thousand students at one school. Hundreds more at MIT. Dozens of new programs launching across the country. The experiment is underway whether we're ready or not.

If you're trying to hire AI talent or build AI capabilities without getting swept up in credentialism, we can help you focus on what actually matters.