An Anthropic Cofounder Just Told Young People What to Study
Jack Clark, cofounder of Anthropic and the company’s head of policy, majored in English literature at the University of East Anglia. He worked as a journalist. He has no computer science degree.
He also cofounded one of the most influential AI companies on the planet, the company that built Claude.
So when Clark sat down at Semafor’s World Economy Summit this week and was asked what young people should study, his answer mattered. And it wasn’t what anyone pushing their kid into a CS major wanted to hear.
What Clark Actually Said
Clark’s advice, summarized from his Monday remarks: the most valuable education for an AI-driven economy involves synthesis across disciplines and learning the right questions to ask. He singled out literature, history, and philosophy as surprisingly relevant preparation for working in AI. He explicitly said Anthropic employs philosophers.
And on the flip side, when pressed, Clark named one thing he would steer young people away from: rote programming. Not coding. Rote coding. The kind you can look up. The kind an AI can do faster than you can type the prompt.
That distinction matters. Clark isn’t saying don’t learn how computers work. He’s saying don’t bet your career on being the person who types well-known algorithms from memory. As he put it, “some people need to know those fundamentals, but we do see that technology move up the stack.”
In plain English: the floor is rising. The AI is doing the entry-level work. The humans who remain valuable are the ones asking the questions that make the AI useful in the first place.
Why This Lands Differently Coming From Clark
This isn’t a liberal-arts professor defending the humanities. This is a cofounder of a $380 billion AI company saying that the skills that built him, and that he looks for, are pattern recognition across domains, historical literacy, narrative thinking, and good question design.
And he’s not alone. Boris Cherny, who created Claude Code at Anthropic, has said the job title “software engineer” will start to fade this year. Not because coding disappears, but because the nature of the work changes. The valuable humans become the ones directing the AI, not competing with it on syntax.
The Part That Should Shake Parents, Advisors, and Career Changers
For twenty years, the guidance to any ambitious young person has been the same: learn to code. Get a STEM degree. Computer science is the safest career path in the world.
That advice is quietly collapsing.
Between 2013 and 2023, STEM job growth outpaced non-STEM by almost three to one. CS enrollment exploded. Parents remortgaged houses to send kids to engineering schools. Then AI got good at code. A recent Anthropic analysis found AI can theoretically perform 94% of computer and math tasks. CEO Dario Amodei has publicly stated that AI could eliminate half of all entry-level white-collar jobs.
And now Clark, who sits at the center of this shift, is telling young people: the thing everyone told you was safe is not the thing that will make you indispensable.
The thing that will make you indispensable is a combination of skills that no single major teaches. It’s the ability to pull from literature and law and biology and history, frame a question an AI can actually act on, read the output critically, and make the judgment call about what to do with it.
That’s not a college major. That’s a practice.
This Is What SeedStacking Actually Is
When I started teaching this framework, I called it SeedStacking because of how the skill actually forms. You plant a seed, a small working knowledge of one tool, one domain, one kind of thinking. Then you plant another. Over time they stack, and the stack is what lets you do things no single skill can do on its own.
Clark just described that, almost word for word, in an Anthropic cofounder’s language.
“Synthesis across a whole variety of subjects.” That’s stacking.
“Knowing the right questions to ask and having intuitions about what would be interesting if you collided different insights from many different disciplines.” That’s the payoff of a stack.
What This Means If You’re Making a Decision Right Now
If you’re a high school senior, a college student rethinking your major, a working adult wondering what to retrain in, or a parent trying to advise a kid, here’s the practical takeaway from Clark’s remarks. Not a career roadmap. A reorientation.
The Four-Part Reorientation
1. Pick a domain you actually care about. Clark didn’t say “study the humanities because they’re safe from AI.” He said his literature degree was useful because it gave him ways of thinking about stories, history, and how humans make sense of change. Pick the thing you will stay curious about for fifteen years. Curiosity compounds.
2. Learn AI alongside it. Not as a separate track. As a multiplier on whatever you care about. A history student who knows how to use AI to synthesize primary sources is a more valuable historian. A nursing student who knows how to use AI to draft patient communication is a more valuable nurse. The stack matters. The AI layer goes on top of the domain layer.
3. Practice asking better questions. This is the single most under-taught skill in formal education. Clark said it twice. Question quality is becoming more valuable than answer retrieval. Start a daily practice of prompting AI not to give you the answer, but to interrogate the question you just asked.
4. Skip the rote. If a task can be memorized from a textbook and an AI can do it faster than you can type, don’t build your identity on being good at that task. Build it on the things AI cannot do alone, which is almost always the synthesis, the judgment call, and the “is this question even the right question” check.
A Note to Educators Reading This
Clark’s advice should change how we teach. Not overnight. Not with a new curriculum committee. But it should change what we emphasize in any course, at any level.
The students who will thrive in the AI era are not the ones who can reproduce material from memory on a timed test. They’re the ones who can walk into a room, identify what’s actually being asked, and pull from three unrelated bodies of knowledge to construct an answer.
That’s a teachable skill. But it requires us to stop grading for recall and start grading for synthesis. It requires assignments where the AI is allowed in the room, and the student’s value is what they do with the AI, not what they do without it.
If you’ve been waiting for permission to redesign, take this as your permission. An Anthropic cofounder just told the world what he looks for, and it isn’t what most schools are optimizing for.
The Seed
You don’t need a new major. You need a new habit.
Spend ten minutes a day with an AI model, asking better questions about something you already care about. Not to get answers. To sharpen the questions.
Do that for a year and you will have something no course syllabus currently promises: a functional, compounding, portable skill stack that works in almost any career path Jack Clark’s world creates or destroys.
That’s SeedStacking. It’s what Clark just endorsed in different words. And it’s available to you for the price of ten minutes of attention a day.
Sources
- Semafor World Economy Summit remarks by Jack Clark, Anthropic cofounder, April 13, 2026
- Fortune, “The billionaire Anthropic cofounder who majored in literature says knowing how to ask the right questions beats knowing how to code,” April 14, 2026
- Business Insider, “An Anthropic cofounder’s advice on what to study in college,” April 14, 2026
- National Center for Science and Engineering Statistics (NCSES), STEM enrollment and employment data 2013 to 2023
- Anthropic Economic Index research on AI task automation across occupations
Ready to go beyond reading and start building AI fluency?
Join the free Harvest Kernel community for practical guidance, fresh ideas, and tools that help you make AI useful in real life.
