
Why Fear of Super Intelligent AI Is Driving Harvard and MIT Students to Drop Out


It used to be that students dropped out of Harvard or MIT because they had built something too big to wait. Mark Zuckerberg left for Facebook; Bill Gates left for Microsoft. Today, however, a new and unsettling motivation is emerging. The best and brightest aren't just leaving for greed or ambition—they are leaving because they are afraid.

Reports surfacing in late 2025 and continuing into 2026 suggest that the fear of super intelligent AI is driving Harvard and MIT students to drop out in numbers that, while small, signal a massive cultural shift. For these students, the calculation is simple but terrifying: why spend four years studying for a future that might not exist?

This trend splits into two distinct anxieties: the fear that AI will physically end humanity (existential risk), and the fear that AI will economically end their careers before they even start (obsolescence).

The Survival Instinct: "I Might Not Be Alive to Graduate"


The most jarring aspect of this trend is the sheer fatalism driving the decision-making process for some students. It isn't about getting rich; it's about survival.

The case of Alice Blair and the shift to AI Safety

Take the story of Alice Blair, a student from the MIT class of 2027. She didn't leave to launch a startup. She took a permanent leave of absence because, in her words, she was "concerned I might not be alive to graduate."

This sentiment reflects a growing conviction among the "AI Safety" community on campuses. They believe super intelligent AI (often called AGI) is imminent and poses a catastrophic risk to the species. For Blair, staying in a classroom felt like rearranging deck chairs on the Titanic. Instead, she moved directly into the industry, taking a role at the Center for AI Safety.

Prioritizing "Don't Die" over "Get a Degree"

For this cohort, the prestige of an MIT degree is meaningless if the world ends in ten years. They are trading long-term credentials for immediate agency. They want to be in the room where the "off switch" is being designed.

This drives them toward organizations like Redwood Research—a non-profit focused on making AI systems honest and safe—rather than traditional tech giants. The logic is stark: if the risk is existential, the only rational move is to work on mitigation now, not after graduation.

The Opportunity Cost: Chasing the AI Gold Rush


Not everyone is driven by the apocalypse. For another group, the fear of super intelligent AI is economic. They are terrified of missing the boat.

Fear of career obsolescence before graduation

Surveys conducted by the Harvard Undergraduate Association revealed that nearly 50% of students are worried AI will negatively impact their job prospects.

The logic here is "first mover" advantage. If AGI is going to commoditize coding, writing, and analysis, then the traditional four-year curriculum is too slow. Students fear that by the time they get their diploma, the junior roles they are studying for will be automated. They see the current moment as a "Gold Rush" similar to the internet boom of the late 90s. If you aren't building the infrastructure of the AI age right now, you will end up merely being a user of it.

The "Anysphere" and "Mercor" founders

This economic fear drives students to become founders earlier than ever. Michael Truell (CEO of Anysphere) and Brendan Foody (CEO of Mercor) represent this track. They dropped out not to save the world, but to secure their place in the new economy before the door closes. They are betting that experience in the trenches of AI development is worth more than a piece of paper from Harvard.

The "Tech Writer" Paradox: A Strategic Step Down?


One of the most confusing details for observers is the actual jobs these dropouts are taking. Alice Blair left MIT to become a... technical writer.

Why brilliant engineers are taking "replaceable" jobs

On Reddit and other tech forums, this specific move drew skepticism. If you are afraid of super intelligent AI, why take a job (writing) that AI is famously good at? It seems counter-intuitive to leave a world-class engineering program for a role that might be the first to be automated.

However, this view misses the point of the "Safety" movement.

Proximity to power vs. technical contribution

For the dropouts motivated by existential fear, the job title is secondary to the location. Being a technical writer at the Center for AI Safety gets you inside the building. It connects you with the key researchers, policymakers, and donors who are shaping the future of AGI.

In this context, dropping out isn't about maximizing salary; it's about maximizing influence. They are betting that being a low-level human in a high-level safety org is safer than being a high-level student on the outside.

Analyzing the Data: Is the Fear Justified?

Before parents start panicking, we need to look at the hard data. The media narrative suggests a mass exodus, but the numbers tell a quieter story.

50% worry vs. actual dropout numbers

While half of Harvard's students might be worried, very few are actually packing their bags. The reports cite only a handful of specific examples—three students leaving for OpenAI, one for technical writing, and a few founders.

The fear of super intelligent AI is driving anxiety at scale, but it is currently only driving action at the margins. The vast majority of students are still choosing the safety of the degree.

The counter-argument: Why the degree is a safety net

There is a strong argument that these dropouts are making a mistake. If AI makes technical skills cheap, then "trust" becomes the most expensive asset. A degree from Harvard or MIT is, fundamentally, a trust signal: it proves you have grit, social capital, and institutional vetting.

If AGI doesn't end the world—or if it arrives slower than the "doomers" predict—those who stayed in school will have the credentials to lead, while the dropouts might find themselves with niche experience in a volatile startup ecosystem.

Practical Advice for Students on the Fence


If you are a student reading this and feeling that same pressure, take a breath. The decision to drop out shouldn't be based on panic.

Assessing the timeline of AGI

Look at the specific organizations you want to join. Are they offering a role that requires you to be there today? If OpenAI or Anthropic is offering you a research scientist role, that is a rare opportunity. If you are leaving to "figure it out" because you're scared, you are likely giving up a valuable asset (your degree) for no guaranteed return.

Alternatives to dropping out

You don't have to quit to participate. Many students are spending their summers at labs like Redwood Research or interning at AI startups. You can build the network and the skills without severing the tie to the university. The "FOMO" (Fear Of Missing Out) is real, but the university network is often the very thing that helps you survive the industry shifts.

FAQ

Q: Is the fear of super intelligent AI the main reason students are leaving Harvard?

A: It is a growing reason, but not the only one. While a vocal minority leaves due to existential risk (fear of human extinction), others leave for standard startup opportunities, fearing they will miss the economic boom of the AI era.

Q: Where are these students working after they drop out?

A: The primary destinations are "AI Safety" organizations like the Center for AI Safety and Redwood Research, or major commercial labs like OpenAI and Anthropic. Some also leave to found their own companies.

Q: Is it risky to drop out of MIT for an AI job?

A: Yes. While high-risk/high-reward, dropping out assumes that gaining immediate industry experience is more valuable than the long-term credential. If the AI industry stabilizes or slows down, the lack of a degree could be a hurdle.

Q: What is the "tech writer" controversy regarding these dropouts?

A: Critics find it ironic that students leave engineering degrees to become technical writers—a job seen as easily automated by AI. However, students argue this role gives them necessary access to the inner circle of AI safety research.

Q: Are universities doing anything to keep these students?

A: Universities are trying to adapt by integrating more AI coursework and ethics discussions, but the pace of academia is naturally slower than the industry, leading some students to feel the curriculum is outdated.
