
What Work Was Doing for Us

On meaning, effort, and what erodes when machines do the doing.

Everyone is arguing about AI and jobs. Will they vanish? (Probably not.) Transform? (Surely.) Multiply? (A boy can dream!) I won’t pretend to know the answers. A more honest forecast is boring in its muddled nuance: most jobs change shape rather than disappear, industries are hit at different speeds with different magnitudes, and the history of general-purpose technologies suggests cheaper tools actually expand demand and drive growth in unexpected ways.

Another question intrigues me more. What was work doing for us that the paycheck wasn't? And what happens when it stops doing that, even if the job is still there?


Work has long carried some of the same load religion carries for believers. It gives people somewhere to be every day, people to be there with, a thing they are responsible for, and outputs that are theirs because the effort was theirs. The satisfaction isn’t just the pay. It is the feeling of having done something. Pride, in the ordinary sense, is a response to having done the thing yourself.

AI is increasingly good at doing the thing for you, regardless of what it is. Not always well, and often with heavy human shaping, but well enough that more and more of the daily act of working is becoming the act of prompting and editing rather than making from scratch. The output still has your name on it, but the bulk of the effort wasn’t yours. You end the day having produced more, but feeling proud of less. It will be truer next month than it is today.

The “meaning crisis” is already here, and it predates AI’s recent ascendance. The digitally-connected excuse to never leave the house. Smartphone addictions that transform public spaces into private ones. Social media’s reduction of messy human interactions to quantified but empty husks of engagement. And, at the same time, declining religious affiliation, with nothing comparable rising to take its place.

AI isn’t the sole cause of this meaning crisis, but it is an accelerant on a specific mechanism — the one that turned work into a source of self-worth — and the people who used that mechanism to anchor their lives will feel the ground shift first.


Because I wasn’t raised religious, I built my ethical and metaphysical scaffolding elsewhere: from my parents, from wild debates and experiences with my friends, from philosophy, and from the communities I found myself in. Despite lacking a formal, centuries-old anchor, I’ve developed a rich framework for making decisions and interpreting what’s meaningful in my life.

But I’ve spent enough time around believers of many religions to guess that they benefit from beliefs I don’t have. A sense that things will work out as they should. That there’s a plan for each of us. That there’s more to existence than mortal life. I also notice the functional infrastructure religions build around the hard parts of life. The playbook for the week after a loved one’s death. The reliable rhythm of prayer, which I suspect drives a certain mental clarity. The friendship and community that form naturally around shared rituals and sacred texts.

I’m not romanticizing something I’ve never held. I say this because I think what religions do for humans is load-bearing, and a civilization quietly removing work as a meaning-source hasn’t replaced what it’s removing.

Someone will object that meaning doesn’t only come from effort (especially professional effort). Very true — people get meaning from being loved, being parents; from beauty, and from other things that are entirely “unearned.” Religious traditions often teach the opposite of the effort-theory; they teach that meaning comes from being held, not from producing something. If anything, that’s part of why I think those traditions matter more in the years ahead, not less.

AI threatens the effort-theory of meaning most directly. Other sources remain intact. But the effort-theory is the one secular modernity leaned on hardest, and it’s the one giving way under our feet.


When these load-bearing beams of meaning creation weaken, something fills the void. That’s worth watching, because the history isn’t reassuring.

The charismatic-voice economy is already here: gurus and influencers of all stripes selling frameworks for how to live, by the episode. Some are thoughtful. Some are dangerous. What makes this period different isn’t that these voices exist; they always have. It’s that the institutions that used to check them are weaker than they’ve been in a long time. Cults cluster in periods of institutional weakness not because people become stupid but because people become hungry for exactly what institutions used to provide and won’t admit they still need. Community. A code. A belief that their actions are part of something that matters. When the pillars of meaning erode, through disconnection from each other or detachment from the self-worth of hard work, questionably legitimate substitutes fill the room fast.

Some of what rushes in will be good. New movements with shared, altruistic purpose at the core. Variations on existing religions that bring people new belonging. Possibly new religions in the full sense, not just movements borrowing religious clothing. The last great wave of religious formation happened during a comparable dislocation — industrialization, urbanization, the printing press, the collapse of older authorities — and some of what was born then still shapes most of humanity’s inner lives. It could happen again.

And then there’s AI itself…

As a tool becoming central to everything we do in work and personal life — automating, curating, knowing us — AI bears an uncomfortable resemblance to the way many people describe God’s plan for them. AI has a plan for you too. It’s shaped in your image in ways that make the experience profoundly personal: consulting your history and your data, anticipating what you’ll want.

It’s one thing to anthropomorphize AI. It’s another to exalt it. And we’re not so far from the latter. When a system seemingly knows everything and approaches being all-powerful, the distance between tool and altar is shorter than it should be.


There’s a tactical conversation today about whether the models can do this task, whether they’ll do that one next, whether the labor market bifurcates or collapses or expands. These are real questions and I work on them every day. But they sit on top of a deeper question we’re mostly not asking, which is what happens to human meaning when the activity most modern people used to generate it becomes something a machine can do with you, around you, or instead of you.

I don’t think the answer is to slow down the technology. That is not practical, and I believe in what it can do, especially in the clinical contexts where I work, where removing friction from some tasks buys back time for others that clinicians value more. But I think the people building it have an obligation to notice what we’re dislocating, and to be honest that the replacement isn’t going to come from inside the technology itself. It’s going to come from whatever we choose to build in the spaces the technology is simultaneously opening up and hollowing out.

That’s the conversation I wish more of us were having. It’s not an easy one, and that’s kind of the point.