1. Headline & intro
Computer science was supposed to be “the new literacy.” Now, for the first time in two decades, students at major U.S. universities are quietly backing away from it. They are not abandoning tech altogether; they are running toward anything branded “AI.” That sounds logical in 2026, but it could leave a dangerous skills gap just as societies are wiring everything around artificial intelligence. In this piece, we look at what is actually happening behind the enrollment numbers, why foundations still matter, and where European universities have a rare opportunity to leap ahead instead of repeating Silicon Valley’s mistakes.
2. The news in brief
According to TechCrunch’s reporting on new data from the University of California (UC) system, undergraduate enrollment in traditional computer science programs across UC campuses declined 6% this academic year, after a 3% drop the year before. This is notable because overall college enrollment in the U.S. has inched up by around 2%, based on figures from the National Student Clearinghouse.
One major exception stands out: UC San Diego, the only UC campus to introduce a dedicated AI major this year, continued to see strong enrollment. TechCrunch also cites a Computing Research Association survey from October in which roughly three-fifths of U.S. computing departments reported falling undergraduate enrollment. Meanwhile, universities including MIT, Columbia, USC and the University of South Florida are launching AI‑specific degrees and colleges. Students appear to be shifting from general CS into explicitly AI‑branded programs rather than leaving technology fields altogether.
3. Why this matters
The headline risk is obvious: if fewer students study traditional CS, who will understand the systems behind the AI tools everyone else is using? Modern AI looks like magic, but it stands on a stack of very unglamorous disciplines: algorithms, distributed systems, operating systems, compilers, databases, networking. Those are exactly the courses that get squeezed when departments race to bolt on “AI” in order to keep applicants interested.
There are clear winners in this shift. Universities that move quickly to design serious, interdisciplinary AI programs will attract talent and funding. Students who combine AI with a domain — medicine, finance, design, law — will be in strong demand, because they can translate messy real‑world problems into something a model can actually work with. And the AI industry gains a pipeline of graduates who are already comfortable treating models as everyday infrastructure.
But there are losers, too. Smaller universities that cannot afford new AI institutes may watch their CS enrollment erode without having anything compelling to offer instead. Faculty who refuse to engage with AI risk becoming irrelevant to their own students. Parents who now push kids toward supposedly “AI‑resistant” disciplines like mechanical or electrical engineering may be offering false security; these fields are also being reshaped by simulation, generative design and robotics.
Most importantly, the narrative that “coding is over” encourages a shallow relationship with technology: prompt instead of understand, assemble instead of design. In the short term, that feels efficient. In the long term, it concentrates true technical power in the hands of a shrinking elite that still understands what’s going on under the hood.
4. The bigger picture
This enrollment pivot sits at the intersection of three longer‑running trends.
First, the professionalization of AI. Over the last decade we have already seen statistics, data science and machine learning peel off from classic CS into their own degrees and master’s programs. Today’s AI majors are a continuation of that path, but with generative models and foundation models as the focal point. The danger is that universities confuse renaming with redesign. Updating a brochure is easy; rethinking a curriculum to integrate ethics, security, math and systems with AI is hard.
Second, a global arms race in AI literacy. As TechCrunch notes, Chinese universities treat AI as basic digital infrastructure. MIT Technology Review has described how institutions such as Zhejiang University and Tsinghua have made AI training mandatory or created new AI colleges altogether. Western universities are reacting rather than leading, often stuck in internal fights about plagiarism policies while students quietly adopt the tools anyway.
Third, recurring hype cycles in tech education. During the dot‑com boom, many departments rebranded as “information technology” or “e‑business.” After the crash, enrollments collapsed, but the need for solid computer scientists never went away. Something similar is likely now: AI will remain central, but today’s buzzword‑heavy degrees may age poorly if they are too tied to current tools and not enough to enduring concepts.
Against this backdrop, the UC data is less a one‑off scare and more a signal: students read the labour market much faster than institutions do. They are voting for AI‑flavoured programs because job listings, media coverage and salary surveys all scream “AI” — not “operating systems.” The risk is that industry will still require the latter, but fewer graduates will have mastered it.
5. The European / regional angle
For European universities and policymakers, the U.S. shift is both warning and opportunity.
Many European CS departments are more conservative in structure but stronger in theoretical foundations. That could become a competitive advantage if they resist the temptation to bolt “AI” onto everything as a marketing label and instead weave AI across existing solid curricula. Programs like those at TU Munich, ETH Zurich, EPFL, KU Leuven or TU Delft already blend machine learning, data systems and robotics with rigorous math and engineering; they are well positioned to lead rather than chase.
Regulation is the other key difference. The EU AI Act, combined with GDPR and the Digital Services Act, will create demand for skills that U.S. AI degrees rarely emphasize: model governance, auditability, data provenance, documentation, and compliance‑by‑design. This opens space for European specializations in trustworthy AI, safety engineering and algorithmic auditing — not just “more model training.”
Culturally, European students and parents tend to be more sceptical of hype and more sensitive to privacy and labour protections. That does not mean Europe is safe from an AI‑branding frenzy, but it may slow the pendulum enough for more thoughtful program design. Smaller markets in Central and Eastern Europe, the Baltics or the Balkans have an additional angle: they can build compact, high‑quality AI tracks in local languages, anchored in existing strong CS or engineering faculties, and export talent globally without abandoning fundamentals.
If Europe simply copies the U.S. pattern — hollowing out CS while rushing into narrow AI degrees — it will undermine its own ambitions for technological sovereignty just as the EU is trying to reduce dependence on foreign platforms.
6. Looking ahead
Over the next five to seven years, most “plain” computer science degrees are likely to morph into something like “Computer Science and AI” or “Intelligent Systems and Computing.” The branding will change quickly; the real question is how deeply curricula will be reworked.
Watch for three signals.
First, course catalogues. Are AI programs replacing core systems, networking and theory requirements, or layering AI on top of them? A program that drops algorithms in favour of “Prompt Engineering 101” is waving a red flag.
Second, hiring behaviour. How do top tech companies, scale‑ups and deep‑tech startups evaluate AI‑specific degrees compared to traditional CS, mathematics or electrical engineering? If recruiters quietly continue to prefer foundation‑heavy backgrounds for critical roles, students who chased AI branding may discover a ceiling earlier than they expect.
Third, policy and funding. Governments in the U.S., EU, UK and Asia are all preparing large AI‑skills initiatives. Whether they channel that money into short‑term reskilling courses or into serious long‑term academic reform will shape the next generation of programs.
The biggest open question is whether universities treat AI as a horizontal literacy — something all students should master at some level — or as a narrow vertical ghettoized within engineering schools. The institutions that get this right will produce graduates who can not only operate AI tools but also question, regulate and redesign them.
7. The bottom line
Students ditching classic CS for AI majors are not irrational; they are reacting to very real labour‑market signals. But if universities respond by stripping out the foundations that made modern AI possible in the first place, we will end up with a generation fluent in prompts but lost in the stack beneath them. The smart move — for students and institutions, especially in Europe — is not to choose between “CS” and “AI,” but to insist on both. When you think about your own learning or hiring, are you optimizing for today’s buzzwords or for skills that will still matter in 20 years?



