Getting old in tech. Is it really that bad?

Published on Apr 18, 2026

Some time ago, a batch of internal IBM emails from the mid-2010s became public through a lawsuit. In them, executives referred to older employees as "dinobabies" and talked openly about making them "an extinct species".

A later investigation by the EEOC (the US Equal Employment Opportunity Commission) found that more than 85% of the people IBM picked for layoffs between 2013 and 2018 were over 40. This wasn't a rogue manager or a bad week at a regional office. This seemed like a strategy, written down and discussed in emails.

I think about that every so often. Not because it was a shocking revelation. Most of us who have been in this industry for a while already had a feeling that something happens to developers as they age. But seeing it spelled out in a real email, from a real executive, at one of the largest and most respected software companies in the world, changes the conversation. It stops being a feeling.

If you are writing code for a living and you're in your twenties, you probably don't think about this much. If you are in your late thirties or older, you probably already do, whether you want to or not.

When does "old" actually start here?

Outside of tech, most people start worrying about age discrimination around 50. In our industry, the number comes much earlier.

A widely quoted UK survey from CWJobs put the number at 29. That's twelve years earlier than the average across all other industries, which sits at around 41. By 38, tech workers reported that colleagues had started to consider them "over the hill". In the same survey, 61% of IT workers said older people face prejudice in the industry, the highest proportion of any UK sector.

The point here isn't the exact number. The point is that in most professions, being 38 means you are entering the middle of your career. In tech, for a lot of companies, 38 is the age where your manager starts wondering if you still have it.

A Swedish field experiment from 2019 (Carlsson and Eriksson) sent more than 6,000 fake but otherwise identical CVs to real job openings, varying only the applicant's age between 35 and 70. The callback rate started falling substantially in the early forties and dropped to almost nothing as the applicant approached retirement age. This wasn't tech specifically, but it was a European labour market, under controlled experimental conditions. Older candidates got called back less, for the same CV.

A larger experiment in the US by Neumark, Burn and Button sent more than 40,000 applications (as triplets of young, middle-aged and older candidates with matched CVs) to 13,000+ real openings across twelve cities. Same result. Older candidates get fewer callbacks. Consistently, across industries, at scale.

There is no serious empirical debate about whether this is happening. There is debate about what to do about it.

An interesting comparison is ballet. Dancers also see their careers effectively end in their thirties, and nobody writes angry pieces about ballet being ageist. But there is a difference worth sitting with. A dancer's body gives out: knees, hips, tendons, the recovery time between performances. At 33 she genuinely cannot do what she could do at 23, and everyone in that world understands why. The retirement is brutal, but it has a cause you can point to.

Developers don't have that excuse. Nothing in your brain stops working at 40. Research on knowledge workers has shown, consistently and for decades, that experienced professionals perform at least as well as younger ones on almost every dimension anyone has bothered to measure. And yet we have somehow built an industry that treats its veterans the way ballet treats its dancers, without any of ballet's reasons for doing so. One profession retires its people because the body won't cooperate. The other retires them because it has decided they look tired, I guess.

To be honest about it, something does change. Your attention span isn't what it was at 25. You get tired faster, you learn new things a little slower, you have less patience for reading the hundredth bad PR. By your forties most of us also have families, kids, parents who need help, a life that doesn't fit neatly around 10-hour coding sessions followed by midnight deploys. The 24-year-old with no dependents who will cheerfully work Saturdays is genuinely a different kind of employee, and anyone pretending otherwise isn't being straight.

I think this is what most of the hiring decisions described in this piece are quietly reacting to, even when it doesn't get said out loud. The real question is whether the reaction is proportionate. "Older developers are slower to learn new stacks" is probably true on average. "Older developers are slower to learn new stacks to a degree that justifies not hiring them at 42" is a much bigger claim, and it's the one the market behaves as if it believes. Meanwhile, what you trade some of that raw learning speed for (judgment, pattern recognition, the ability to tell which framework will still be here in five years, the instinct to know which problem is worth staying up for) is exactly the stuff the market is worst at measuring. Every other profession has worked out how to value that trade. Tech mostly hasn't.

How HR actually filters you out

Phrases like "digital native", "recent graduate", "fast-paced environment", "hungry", "energetic" (the kind of language every one of us has seen in a hundred job listings) have been shown to correlate with measurably lower callback rates for older applicants. Those phrases aren't neutral. They are a filter written in plain English, legal enough to publish, effective enough to work.

A CV showing 20+ years of experience in positions similar to the one advertised will often be filtered out as "overqualified". It sounds like a compliment. It isn't.

What happens when you lose a job at 40+

This is the part that doesn't get written about much, because it's not a comfortable read. I did some research on this, since for now I don't have a personal data point (I chose to go full time on my own SaaS product for a change, to see how it goes). But the first interesting stat I found?

If you are laid off from a tech job in your forties, you will, on average, take around three months longer to find a new one than a younger developer in the same role. That number comes from a 2024 workforce analysis by Krucha, widely cited since. Three months is not a rounding error. It's a meaningful stretch of savings, of stress, and of watching your old stack go out of date in real time.

Many report that recruiters stop calling the way they used to. The tone of interview feedback changes. You get more rejections without explanation, or with very polite explanations that don't quite make sense given how the conversation went. You start to recognize a particular kind of silence.

AI didn't cause this, but it gave it better cover

It would be wrong to write a piece in 2026 about aging in tech without talking about AI. It would also be wrong to blame AI for this.

What AI has done is given companies a cleaner, more defensible reason to do things they were already inclined to do. A company that wanted to shed older, more expensive engineers in 2020 had to come up with a reorganisation story. In 2026, it can say "we're adopting AI-first workflows, and the mix of skills we need is changing", which sounds forward-looking instead of just cost-cutting.

Sam Altman, of all people, has openly acknowledged this: there is some amount of what he called "AI washing", where companies blame AI for layoffs they would have made anyway.

At the same time, and I want to be fair about this, AI does genuinely reward people who invest hundreds of hours in actually learning the tools. DHH, the creator of Ruby on Rails, has very publicly pivoted from typing all his own code to running multiple agents in parallel, and he describes that as exactly the kind of work experienced developers should be doing. The Fastly survey from mid-2025 found that senior developers with 10+ years of experience are shipping AI-generated code at around 2.5 times the rate of juniors. There is a real argument that AI can be a multiplier for people with judgment.

But "AI can be a multiplier for experienced developers who invest heavily in learning the tools" and "AI will save you from age discrimination" are two very different statements. The first might be true. The second is wishful thinking.

What the law says, and what actually happens

The EU has had a directive against age discrimination in employment since 2000 (Council Directive 2000/78/EC). On paper, a developer in their forties has serious legal protection. In practice, the situation is more complicated. The directive contains a well-known loophole in Article 6, which allows direct age discrimination to be justified if there is a "legitimate aim" (a phrase that has been interpreted broadly). And beyond the text of the law, there is very little public case law on tech-specific age discrimination anywhere in Europe. The IBM lawsuits in the US have no real European equivalent.

The more likely explanation is not that it isn't happening here, but that most affected developers never file. They move on, they work contracts, they leave the industry, they take a quieter job somewhere less demanding, they emigrate, they become consultants. The behaviour is documented. The complaint rarely gets lodged.

What this is, and what it isn't

I've seen a lot of pieces on this topic framed either as "it's over for older devs" or as "actually, experience is more valuable than ever". Both framings are too tidy.

The real picture is that something structurally unpleasant is happening in this industry. Ageism is documented, measurable, and present at every stage of hiring. It starts much earlier in tech than in any other industry, often around 29 or 30. It compounds after a layoff. The legal protection exists but is hard to use, and most people don't.

There are individuals who navigate all of this and do fine. I want to be honest here, because the piece would be incomplete without it: I am one of them. I have found two jobs in tech after turning 40, both of them good. The picture isn't uniformly grim. If you put in the work to stay current, and (more importantly, I think) if you learn to sell yourself properly, which most developers are genuinely bad at, you can land somewhere good. It's not impossible. It's just harder than it should be, and the things that make it harder are mostly not about you.

Developers who go deep on a domain that rewards experience (payments, healthcare, finance, industrial systems, compilers, security) often find that their grey hair is an asset rather than a liability. Consulting, contracting, and going independent are real options.

The IBM emails are the useful thing to hold onto. Not because IBM is uniquely bad, but because IBM got caught. What was written in those emails (the dismissive tone, the "dinobabies", the strategy to thin out the older ranks) almost certainly exists in other companies too. Most of the time, it just doesn't leak.

If you are in your late thirties or beyond and still writing code, I think the honest posture is this: the system is not on your side, the data is pretty clear, the legal recourse is weak, and none of that is your fault. Keep your skills current because it makes the work more fun, not because it will save you. Save money because the next gap might be longer than you expect. And when you read a piece about ageism in tech, don't dismiss it as doomerism or accept it as fate.

It's a real problem with real numbers behind it, and it deserves being looked at straight on.

Sources and further reading

Of course I used AI to do the research; I hope you don't mind. I did a lot of passes and validated the sources one by one. The numbers and quotes in this piece are drawn from the following.

The IBM case

When ageism starts

How hiring filters you out

After a layoff

AI, DHH, and the senior-developer argument

Legal framework

Further reading