The five stages of losing our craft

Andrew Murphy · 13 min read

Last Tuesday I was on a coaching call with an engineering manager. Let's call her Sarah (because that's her name). She was describing a situation I've now heard about forty times in the last twelve months. Her most senior developer, a guy with twelve years of experience and very strong opinions about YAML indentation (2 spaces, obviously, he's not a monster), had just refused to use Cursor. Not "tried it and didn't like it." Refused. On principle.

"He says it's a fad," she told me. "He says the code quality is garbage. It puts the curly braces on the wrong lines. He says he's not going to let a robot write his software."

I asked her how his outcomes compared to the rest of the team.

Long pause.

"That's why I'm calling you."

I've been having some version of this conversation every single week for the past six months. Sometimes it's the senior dev who won't adopt the tools. Sometimes it's the engineering manager who doesn't know how to have the conversation. Sometimes it's the CTO who's quietly terrified but can't say that out loud because they're supposed to be the one with the answers. Once it was a VP of Engineering who'd been secretly using Claude for weeks and didn't want his team to know. (Reader, he was getting more done than all of them.)

Then Nolan Lawson published "We Mourn Our Craft" and my inbox exploded.

Every leader I coach sent it to me. Half of them said "this is exactly how I feel." The other half said "this is exactly how my senior devs feel and I don't know what to do about it."

Nolan's piece is gorgeous and honest and it's the best articulation I've read of what it feels like to watch the thing you spent decades mastering become a commodity. You should read it. He's right about almost everything.

But a eulogy, no matter how beautiful, is not a strategy.

I've spent over 25 years building software and the last decade coaching the people who lead the people who build software. I've sat across from thousands of engineering leaders at this point, and what I'm watching right now is not just one feeling.

It's a process. It has stages. Stages which probably look familiar to you. And if you can name the stage you're in, you can stop being trapped by it.

1. Denial: "AI code is shit and everyone knows it"

Denial is seductive because it feels like expertise.

You review the pull request Claude generated. You spot the unnecessary abstraction layer. You find the duplicated utility function. You notice it's pulling in a dependency that does what three lines of vanilla JS could do. You feel vindicated. See?

"The machine is an idiot. I'm still needed."

And you're not wrong about any of that. AI-generated code is frequently bloated, over-abstracted, and subtly wrong in ways that only someone with real experience can catch. I'm not going to blow smoke up your arse about the current state of LLM output.

But being right about the quality is not the same as being right about the trajectory.

I'm old enough to remember when people said this exact thing about cloud computing. "You can't run serious production workloads on someone else's hardware."

They were right! In 2008.

And then they kept being right while the rest of the industry moved on without them.

Being the smartest person in a dead paradigm is not actually the flex you think it is.

Here's what worries me about this stage: sure, denial in a senior IC is a personal problem. But denial in a tech lead is a team problem. In a CTO? It's existential. Because it gives everyone below them permission to stop paying attention.

The engineers who get through denial fastest all had one thing in common: they actually used the tools seriously for a few weeks. Not for throwaway scripts. For real work, on real problems. Most of them came back shaken. Not because the tools were perfect, but because they were good enough to be scary.

2. Anger: "They stole our fucking craft"

If denial is the head talking, anger is the gut.

Some of Nolan's piece lives here, and so do a lot of the best engineers I know right now. The "they trained on my open source code" energy. The "I didn't consent to this" fury. Watching something you poured your soul into get strip-mined by a handful of multi-billion dollar corporations so some product manager can ship features without understanding what a database index is.

The anger is legitimate. The way these models were trained raises real ethical questions about consent and attribution and the value of creative labour. If you're angry, good. I'm angry too.

But anger without direction is just a Hacker News thread.

There are two kinds of angry engineers. The first type gets angry and checks out. They become the person in every architecture review who says "well, none of this matters anyway" with a thousand-yard stare. They're grieving, but they've dressed it up as cynicism, and cynicism is fucking contagious in a team environment.

The second type gets angry and leans in. They say: "Fine. If this is the world now, I'll be the one who makes damn sure the AI-generated code meets my standards." These are the people writing the linting rules and the architectural guardrails. They channel the rage into "I'll show you how this should actually work." Their anger has somewhere to go.

If you're a leader and you've got angry engineers on your team: your job is not to talk them out of the anger. Your job is to help them find the second path. (And for the love of god, don't send them a LinkedIn post about "embracing change." They'll just read the headline, close it, and open their profile to click the "looking for work" checkbox.)

3. Bargaining: "I'll just use it for the boring stuff"

Most of the industry is right here, right now. And honestly? I think it's the most intellectually dishonest place to be.

"I use AI for boilerplate but I write the real code myself."

"I let it generate tests but I hand-write the implementation."

"I use it for documentation but never for architecture decisions."

Cool. Where exactly is the line? And what happens when the line moves in three months?

Because the line always moves. Twelve months ago the consensus was "AI can't do anything complex." Six months ago it was "AI can handle simple features but not system design." Now I'm watching agents refactor entire modules with context awareness that would have seemed impossible a year ago. Every quarter, the thing you swore you'd never delegate to the machine becomes the thing your junior devs are delegating without a second thought.

Bargaining is just denial in glasses and a false nose.

The leaders stuck here are the ones building policy based on vibes instead of strategy. They've got informal, unwritten rules about when AI is "allowed" and when it isn't.

And those rules are usually just reflections of what makes the most senior person on the team comfortable. (Funny how that works.) That's not an engineering decision. That's anxiety management.

If your organisation is in the bargaining phase, here's my challenge: write down your AI policy. Actually write it down. I'll wait. Then look at it in six months and see how much of it is already obsolete.

4. Depression: "What am I even for?"

Nobody talks about this one at standup. Nobody puts it in their Slack status. But it's everywhere. Nolan's article touches on these feelings deeply.

It's the senior developer staring at a pull request that took Claude four minutes to produce. Code that would have taken them a deeply satisfying afternoon. It's the tech lead who used to feel a spark of pride reviewing elegant solutions and now feels like a quality inspector at a factory. And it's the architect who spent decades developing taste and judgment about software design, who is now privately wondering if "taste" is just a word old people use to describe skills that don't matter anymore.

Nolan wrote about this beautifully. The feeling of "holding code in our hands and molding it like clay in the caress of a master sculptor." That loss is real. I am not going to stand here and tell you to optimise your way out of grief.

I've felt it myself. I can't tell you the exact date but I can tell you the exact feeling. I was watching an AI agent work through a problem that would have been a meaty, interesting challenge for me a decade ago. And it solved it. Not perfectly, but well enough. Which, if we're being honest, is how most software ships anyway. And I felt something in my chest that I can only describe as the floor dropping out as the slobbering tentacles of ClawdBot Moltbot OpenClaw reached up to me.

If you're here right now, I need you to know: this is normal. You are not broken. Your career is not over. What you're feeling is the completely rational emotional response to watching a core part of your professional identity get commoditised in real time.

My writer's instinct was to put something funny there to break the tension. But it doesn't feel right. I don't want you to think I'm making light of what this actually feels like.

But, please, don't build a house in this feeling. Talk to someone about it. Ideally someone who isn't going to give you toxic positivity or tell you to "just learn prompt engineering." And then, when you're ready, keep reading.

5. Acceptance: but not the kind you think

I part ways with Nolan here.

His acceptance is beautiful resignation. "The sun rises, the sun sets, I orbit helplessly around it, and my protests can't stop it." That's amazing poetry. But it's also a choice to be a passenger.

I don't accept that framing.

Look. Acceptance doesn't have to mean mourning your craft into a museum. It doesn't have to mean learning to love the machine or pretending the loss isn't real. It means something much more pragmatic and, I think, much more interesting.

The craft evolves. It always has.

The blacksmith didn't disappear when the factory arrived. They became the mechanical engineer. The metallurgist. The person who understood why metal behaves the way it does when everyone else just wanted to stamp out parts. The knowledge didn't become worthless. It became the foundation for a different kind of value.

You've spent ten, fifteen, twenty years learning not just how to write code, but what to build and why. You understand systems and failure modes. You understand the weird political dynamics that make a technically perfect solution completely unshippable. You know users and their incredible ability to do the most unexpected things with your software. You know trade-offs.

You know the thousand small judgement calls that separate software that works from software that actually matters and makes a difference in the lives of its users.

None of that gets automated away. Kent Beck put it better than I can: "The value of 90% of my skills just dropped to $0. The leverage for the remaining 10% went up 1000x." He's right. The trick is figuring out which 10% is going to be your 1000x.

The engineers who are doing well right now (actually well, not performing wellness for LinkedIn views) stopped identifying primarily as "people who write code." They started identifying as people who solve problems and provide value to users. Code is one of the tools they use to do that.

They got comfortable directing the work instead of doing the work, without treating that as a demotion. (This is, incidentally, the same transition every new engineering manager struggles with. We just didn't expect to have to do it twice in our careers). And they found that their years of experience make them better at using AI tools, not worse, because they know what good looks like.

I want to be careful here. I'm not selling you some "learn to love your new robot overlords" line. The loss is real and I've got no interest in minimising it. But wallowing in it forever is also a choice, and I don't think it's the one most of you actually want to make.

So about Sarah's senior dev

I asked Sarah to do something that felt counterintuitive. She'd already tried the "you need to adopt AI tools" conversation twice and it had gone about as well as you'd expect. So I asked her to have a different conversation entirely.

I told her to ask him what he was afraid of losing if he adopted the tools.

Not "why won't you use Cursor." Not "here are the productivity metrics." Just: "What does this feel like for you?"

She called me back a week later. He'd talked for forty-five minutes. (She'd learnt the hard way that when you ask an engineer how they really feel about something, you should clear your calendar.) About the first program he ever wrote. The debugging session at 2 AM that made him fall in love with the craft. How he'd spent his entire career getting better at something and now it felt like the scoreboard had been reset.

He wasn't being difficult. He was grieving. And nobody had given him permission to do that until she did.

They're working together now on figuring out what his role looks like going forward. It's not settled. It might never be fully settled. But he's moving, which is the only thing that matters.

The thing that doesn't get automated

Here's what I know: the people who come out the other side of this aren't the ones who write the best code. They're not necessarily the ones who adopted AI fastest either, though adopting early certainly helps, as it does with any new tool.

They're the ones who understand that software was never really about the typing.

It was always about understanding the problem. Navigating the humans. Making the hard calls about what to build and what to leave alone. Having the judgement to know when the elegant solution is wrong and the ugly hack is right.

That was the real craft all along. The code was just how we expressed it.

I won't tell you not to grieve. Grieve. The thing we're losing is worth grieving. But don't let the grief become the whole story. You've spent your career getting really, really good at something that matters. The expression is changing. The something hasn't.

Now get back to work. There's a hell of a lot left to build.

    Andrew Murphy

    Andrew has spent 20+ years debugging both code and teams. From Group Engineering Manager to startup CTO, he's translated engineering thinking into help and support for thousands of technical leaders struggling to build their leadership skills.
