AI doesn't replace white collar work
Posted by amarble 2 days ago
Comments
Comment by ctoth 2 days ago
Comment by kantord 2 days ago
Basically we do not rationally analyze what work can be automated and what work is forever safe. We just assume that "sexy work" is safe, and work backwards to figure out how to explain this belief to ourselves.
Comment by oytis 2 days ago
Comment by jatins 2 days ago
Comment by mikert89 2 days ago
Comment by adventured 2 days ago
The other side of that irrationality coin is 2D extrapolation: a thing happened (or a context is as it is now), so I extrapolate it happening again (once or many times) into the future on a smooth line, so as to fit my bias.
Comment by 8note 2 days ago
Comment by amarcheschi 2 days ago
Comment by boshalfoshal 7 hours ago
Yeah I also don't buy this. Most white collar work _seemingly_ necessitates trust, social/human aspects, etc. because we _have_ to interact with other humans, and the way we interact with each other is lossy and often has misaligned or not explicitly stated motivations.
In other words, most white collar work _seems_ bottlenecked on people-centric things because we have imperfect information about what other people want, so we have to use soft skills (i.e., skills only real humans have) to figure out the motivations of various stakeholders, align expectations, garner favor, etc. amongst all of them. In a world where most of the workforce is AI, I think this problem of tacit information gets largely solved, since AIs can, in theory, convey their intent and losslessly send information to one another without the need to waste time "aligning."
The other thing that people argue, especially in software, is that architecture and tradeoff decisions will remain in the human realm, because apparently only people have the "taste" to pick and choose the right solutions. I also think that:
(1) this will be easily solved by AI/current LLMs, since logically there shouldn't be a big difference between designing and writing good code and designing good systems architecture, and LLMs are ostensibly already good at coding
(2) "taste" and "tradeoffs" are information problems: once again, if you could convey most or all necessary information losslessly between everyone in your org, things that appeared to be "tradeoffs" before might just have clear-cut answers.
Also, just practically speaking, the stated goal of AI companies is to automate all labor. They won't just sit back happily collecting checks if there are parts of the economy they can't automate; that's revenue they could easily capture. Whatever people claim AI lacks today will just be added to it in 6 months; AI companies are strongly incentivized to work towards this.
And at the end of the day, work is a transaction between employees and employers. A company's primary purpose is to generate money for shareholders, and human labor is just how it gets done. It doesn't matter if I _want_ to talk to a nice coworker instead of Claude 4.6 opus. If Claude costs less than my nice coworker and has the same or better output, the company will happily replace that coworker with Claude, because it's strictly beneficial for the company.
Comment by cadamsdotcom 2 days ago
Software engineering is falling to this trend too (somewhat)
The solution is to stop merely thinking of yourself as a software engineer and move up to the level of “manager of agents”.. but actually, managers deal with human stuff and this is fascinatingly mechanical - in fact even the unpredictability of these new tools is quite predictable. And so, a more useful framing is “software development process engineer”.
You can look at all the literature on building factories and production lines for ideas on what you’ll be doing.
You shouldn’t ever just have your agent write the software then review and ship it. You are missing massive opportunities to take yourself out of more loops over time. What self-reflection are you and the model doing to catch opportunities to improve? What is your method for codifying your acceptance criteria, so your agents can do the work to higher quality over time without you in the loop to get it there? What’s your process for continuous improvement? How do your models know what work other team members’ models are doing simultaneously so there’s less stepping on toes? Can THAT be automated so you don’t need to sit in Slack and trade “human-verbal locks” on areas of the architecture?
There’s immense room for creativity in the role of a software development process engineer.
Comment by pjmlp 2 days ago
Comment by cadamsdotcom 2 days ago
People could learn things and join the workforce!
/s
Comment by kevinh 2 days ago
Comment by formerly_proven 1 day ago
Comment by pjmlp 2 days ago
New skills mean shit when there is no job market that can take everyone.
Usually people with takes like yours never had to actually fight for months, or years, to finally get back on track.
Naturally, when selling AI, the take is to downplay its impact on people lives.
Comment by cadamsdotcom 2 days ago
We signed up for this. YOU signed up for this. No one owes anyone a job. When the activities that create value change, move with it or get left behind.
If you prefer a vocation which has been the same for centuries that option is open to you. But to get into the software job market you’d best ask if the job you are trying to get is obsolete, and focus on fixing your skills and job search process/methodology.
The biggest question is “where is the net-new hiring?” (as opposed to backfill hiring) .. and then, if you are out of the market you have time on your hands to match skills to your answer.
Comment by pjmlp 2 days ago
Comment by palmotea 1 day ago
> People could learn things and join the workforce!
> /s
The point is to always, always blame the individuals being harmed for the structural problems they face.
Lost your job? Well fuck you if you can't afford to pay a lot of money to go back to school for years and support your family out of savings in the mean time. It's your own damn fault for not being rich enough.
Comment by jplusequalt 2 days ago
Comment by pjmlp 2 days ago
Comment by guywithahat 2 days ago
Obviously in the long run this is good, more productivity per employee is always better, but short term jobs are changing and people are likely getting laid off (or will at least have more free time)
Comment by pjmlp 2 days ago
I already do less coding than I used to, because agency work has slowly shifted to a mix of SaaS products, integrations via iPaaS, and serverless or managed containers.
The whole MACH development approach mantra.
Meaning that even in development, at least in consulting, teams have become a fraction of what they used to be.
AI is the next step reducing team sizes.
Comment by njoyablpnting 1 day ago
Comment by athrowaway3z 2 days ago
The way AI replaces work is that there is an enormous ROI to working with fewer (and smarter) people. Those social interactions are a big part of work, but they are only very rarely "the work", and they cost time. In the cases where they are required, they seem to cluster, and the ROI of fewer social synchronization problems increases even more.
But that might all be wrong. I'm not confident enough to say where we'll land. I also see it's possible that demand will go up faster because of (and enabled by) the increase in supply, and that the social aspect is "the real work" to be done.
Comment by boshalfoshal 7 hours ago
As soon as a person enters the loop you add a manual sync point that probably doesn't need to be there. I think this is why you are increasingly seeing companies tell their people to be "on the loop" or "out of the loop" with their AI. The less syncing with a person, the better. And I think once this experiment runs its course, we will probably find out that human social interaction matters much less than we thought it did, especially for super transactional things like a corporate job where most of your work is done on a computer.
Comment by ipython 2 days ago
Comment by abmmgb 2 days ago
Comment by _aavaa_ 2 days ago
Comment by keiferski 2 days ago
I would have been interested in the experience and thoughts of someone whose opinions I respected, both as a social thing and to learn something.
In other words, some types of questions are aimed at 1) building a social connection with the person you’re asking and 2) because you want to know what they, specifically, think about their topic.
AI can’t really replace either of these. AIs might function as a weak social replacement for some people, but you aren’t really going to advance in your personal or professional life by making friends with Claude.
A good example of the second one are AskMeAnything type forum posts: I don’t care what some generic celebrity/famous figure thinks about something, I care specifically about what George Clooney thinks about it. The AI will always be guessing, building a model on what George has said in the past, but it will never actually say what he thinks right now.
For a more serious and contemporary example: there are dozens of videos on YouTube right now, interviews with various experts and pundits on the situation in Iran. Many of them have hundreds of thousands of views. But why would someone watch this instead of just asking ChatGPT what’s going on in Iran? Because we want to know what this particular person thinks.
Comment by ctoth 2 days ago
Does the accounts payable team keep their jobs because their manager enjoys chatting with them? Does the junior analyst stay employed because the VP values their specific personal opinion on the Q3 revenue forecast? Note the article is about work.
Comment by keiferski 2 days ago
When you have X employee in a certain role, you know someone is “handling” a particular thing. With AI that isn’t really clear. Maybe you just get the same person owning the responsibilities that previously were under 3 people.
Comment by daxfohl 2 days ago
So yes, white collar jobs will be replaced, but they won't be replaced entirely.
Comment by georgemcbay 2 days ago
The unemployment rate during the peak of the Great Depression was 25%, not 100%.
Comment by boshalfoshal 7 hours ago
These are the people getting mortgages and sending kids to private school and whatnot. If their spending power suddenly drops to zero, it's probably going to be pretty bad. I wonder what the housing market would look like in that case.
Comment by woeirua 2 days ago
Comment by kakapo5672 2 days ago
Clearly, some white-collar jobs will be replaced. Hard to argue against that, given it's already beginning to happen. So the question becomes what is the eventual rate of conversion and what is the subsequent economic impact over time? I don't think anyone has a credible handle on that, except to note that it won't be zero.
Comment by lwhi 2 days ago
But who's to say that things will be 'running as they are now' for long? And who knows what a new economy will look like?
If and when that transition occurs, I think the job market will pick up.
Comment by kypro 2 days ago
By providing productivity tools you do effectively replace jobs because there's only so much of a good or service a person will want to consume.
For example, just because a game dev studio can make 10x more games with AI, this doesn't mean the industry will make 10x more money unless demand for video games increases. Instead, what is likely to happen as the cost of making games drops is that the price of games for consumers will drop too as competition increases, which will in turn hurt game dev profits, so game dev studios will likely have to be 10x smaller in the future, even if there are still technically people working in the industry.
However, when the work of agricultural workers became increasingly automated, there were lots of other industries people could work in instead; at the time that was factory work. Although the details will be different, I'm sure to some extent this will happen with white collar work too. But the question I'd ask today is: what is that alternative source of work, and is it as good as white collar work?
Our economy went from, farming -> factory work -> office work. I strongly suspect the next step will be more people working in manual labour jobs and working in servant type roles. It's hard to see where else the demand will come from.
Comment by pjmlp 1 day ago
Comment by Bratmon 2 days ago
Comment by est31 2 days ago
If Block succeeds, we'll see more layoffs of that kind, probably even more extreme ones. You're not a top senior-level employee? Out. You don't single-handedly account for 30% of the AI spend on your 15-person team? Out.
People say that in five years there won't be seniors because junior hiring stopped... in five years the seniors won't be needed either. Already today we have single-person billion dollar exits and high schoolers making millions from food apps. This is thanks to LLMs.
The technology is there to replace most white collar work; it's just not applied enough yet. The economic system needs to adapt to labor no longer being such a big redistributor.
Comment by daxfohl 2 days ago
Comment by tossandthrow 2 days ago
I have started to say that it will be irresponsible to manually write code a year or two from now, and I am setting up the systems I work on for that.
It will happen sooner rather than later.
Already I cannot compete with agentic programming.
Comment by ibejoeb 2 days ago
Single person, or single founder? I guess there's n0tch, but he hired people when he started making money. (There may very well be truly solo cases that I don't know about.)
A few others have commented that the job becomes a kind of hybrid. I already think of it like that. If you're a person who can talk to a client and then immediately implement something to solve a problem, that's still going to be part of the process for a while. The sales cycle is still going to be competitive, whether it's based on timing or insider connections. Software people are going to have to start thinking of themselves as small firms; you have to go close a deal and then your agent army can help you deliver.
Comment by est31 2 days ago
Comment by akKjans 2 days ago
The block layoffs were due to years of over hiring.
> Already today, we have single person billion dollar exits
It was nowhere near that much, and this was more a coordinated marketing move by OpenAI than an organic process.
> high schoolers making millions from food apps
This app is a sign of the massive bubble we’re in. The developer should be ashamed to make people think they could estimate calories from an image.
There’s trillions of dollars behind these AI companies succeeding. A lot of the hype you’re seeing is paid for. If you’re reading news articles, blogs, etc and not digging any further you’re being manipulated.
Comment by Flavius 2 days ago
Comment by brtkwr 2 days ago
Real life Battlestar Galactica would be pretty sweet.
Comment by brtkwr 2 days ago
The chain of operation never ends either. Every AI system needs someone to run it. Whatever runs it needs to be built and maintained. Follow that chain as far as you like — human agency doesn't disappear, it scales up. The universe is not running out of things that need doing.
"AI will take our jobs" is not a civilisational concern. It's a failure to imagine what civilisation could actually be.
Comment by pjmlp 2 days ago
Comment by brtkwr 1 day ago
Comment by pjmlp 1 day ago
Comment by Natfan 2 days ago
Comment by brtkwr 1 day ago
Comment by inder1 1 day ago
Comment by woeirua 2 days ago
It’s not about where we are today folks (the intercept of the line). It’s about the rate of progress (the slope of the line).
Comment by lbreakjai 2 days ago
We went from the first airplane flight to walking on the moon in about 60 years. We had regular supersonic commercial flights shortly after. Applying the same logic, we should all be routinely flying to Pluto, travelling in flying cars like in the Jetsons, and commuting from Sydney to New York every day like it's nothing.
Comment by catlifeonmars 2 days ago
I agree with you that this article isn’t particularly convincing.
Comment by jatins 2 days ago
We need judgement when we can't verify/prove that the answer is correct, so we need a human we can trust. For example, in the author's example the pandas snippet is verifiably correct, and I don't really care about judgement in that case. When there is a verification/test that gives a clear pass/fail to the AI, the AI can just keep throwing stuff at the wall until it's green, and that's good enough for a lot of use cases.
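The keep-trying-until-green loop described above can be sketched as a small verifier-gated search. This is illustrative only: `verify` stands in for a real test suite, and the fixed candidate list stands in for repeated AI generations.

```python
def verify(candidate, cases):
    """Pass/fail check: does the candidate match every input/output pair?"""
    return all(candidate(x) == expected for x, expected in cases)

def search_until_green(candidates, cases):
    """Try candidates in turn (a stand-in for re-prompting an AI)
    until one passes verification; give up if none do."""
    for candidate in candidates:
        if verify(candidate, cases):
            return candidate
    return None

# Toy spec: "double the input", expressed purely as pass/fail cases.
cases = [(1, 2), (3, 6), (0, 0)]
guesses = [lambda x: x + 2, lambda x: x ** 2, lambda x: x * 2]

winner = search_until_green(guesses, cases)
# winner is the third guess; winner(5) == 10
```

The point of the pattern is that no judgement is needed anywhere in the loop: as long as the cases fully specify "correct", any candidate that goes green is acceptable.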
Comment by gjsman-1000 2 days ago
We are only one major incident away from this trend reversing. Now that we have AI, regulation is less burdensome. More testing requirements, more certification requirements, more security requirements, more accessibility requirements.
Everyone keeps their jobs; the bar goes up. Whenever an industry gets better tools, we raise standards instead of making more cheap junk. We make $25K cars instead of $5K cars at 1960s engineering standards.
Comment by conception 2 days ago
Comment by 101008 2 days ago
Comment by sam0x17 2 days ago
Comment by conception 2 days ago
Comment by anonymars 2 days ago
Nor horses...
Comment by gjsman-1000 2 days ago
Comment by conception 2 days ago
Doesn’t look like a stable, growing profession. And if you compare it to the 70s-00’s it got really rough around 2010 for obvious reasons.
Comment by simianwords 2 days ago
Comment by BoxFour 2 days ago
I’m not saying it turns out badly 100% of the time, but it’s easy to forget because good professionals make it look effortless. When the skill isn’t there, though, and you’re used to seeing only professional photos, it becomes very obvious (and again, that’s perfectly fine if you’re not expecting professional photography).
Comment by wongarsu 2 days ago
Comment by simianwords 2 days ago
My bad, yeah, that part is needed, but as an artistic expression I don't see the point.
Comment by bombcar 2 days ago
Comment by nine_k 2 days ago
Company bosses somehow see this differently: now that the best performers are empowered by AI, they can cut the worst-performing workforce and still enjoy efficiency gains!
Comment by parineum 2 days ago
Companies massively overhired during Covid after receiving trillions in free money and are now cutting the fat after the well's run dry.
AI productivity is just the excuse to save face because people believe it.
Comment by nik282000 2 days ago
Comment by creamyhorror 2 days ago
Maybe this will change in the future if AI-run companies emerge, get backing, and outcompete existing players.
Comment by eloisant 2 days ago
What's stopping their customers from using AI directly instead of that company's services?
Comment by majgr 2 days ago
Comment by andai 2 days ago
I suspect that will change as trust in automated systems increases. (For example the author seems to consider AI a source of "correctness", which implies this trust is already surprisingly high.)
Comment by tossandthrow 2 days ago
At that day it is over for consulting.
Comment by laborcontract 2 days ago
Comment by 10xDev 2 days ago
Comment by philipallstar 2 days ago
Comment by 10xDev 2 days ago
Comment by jatari 2 days ago
Comment by 7777777phil 2 days ago
Comment by andai 2 days ago
Or why he couldn't have asked a human about the NaN thing.
I know those are arbitrary examples but.. the behavior doesn't really seem to depend on the category? It might have to do more with how urgently the knowledge is needed?
Comment by simianwords 2 days ago
For example, UI design can be replaced by AI. Unless UI or UX design people were bringing something like _taste_ instead of simply mechanically operating Figma, they are not keeping their jobs.
I genuinely don't need to learn SQL ever in my life. I just don't need it for dashboards or analytics. A person whose main job was to translate requirements into SQL for a dashboard, and nothing else, would not keep their job anymore. The person they were providing the analysis to could just perform the analysis themselves using AI.
I do think that most jobs would change dramatically but for sure some of them would be eliminated completely.
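As a concrete sketch of the requirements-to-SQL work being described, here the requirement "total sales per region, highest first" becomes one short query; the `sales` table and its columns are hypothetical, using an in-memory SQLite database for illustration.

```python
import sqlite3

# Hypothetical dashboard data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EU", 100.0), ("US", 250.0), ("EU", 50.0)])

# The requirement "total sales per region, highest first" as SQL:
# the kind of one-line translation an AI assistant now does directly.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('US', 250.0), ('EU', 150.0)]
```

This is the entire job being discussed: a plain-English ask in, a grouped and sorted result out, with nothing in between that requires a dedicated analyst.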
Comment by tossandthrow 2 days ago
Comment by iwontberude 2 days ago
Comment by lwhi 2 days ago
All of these foundation concepts are vocabulary.
We need vocabulary in order to understand and have reasonable conversations.
Do you need to be an expert? Probably not .. but yes, we should all understand.
I think we'll develop personal moats automatically. Some people are naturally uninquisitive; they'll be most at risk.
Comment by simianwords 2 days ago
Comment by simianwords 2 days ago
SQL is a common interview question: joins, transformations, etc. If it's so simple, maybe they shouldn't be asking it.
Comment by thereisnospork 2 days ago
Comment by h4kunamata 2 days ago
OP clearly does not have a white collar job.
There are case after case of IT folks being replaced by AI because companies think that AI is better than humans at everything.
Comment by 10xDev 2 days ago
Comment by bediger4000 2 days ago
Comment by 10xDev 2 days ago
Comment by deterministic 2 days ago
Comment by bufordtwain 2 days ago
Comment by mfrankel 2 days ago
Main Points, in Order of Importance
1. Most White Collar Work Is Relationship-Based, Not Transactional. The central claim: a dominant share of workplace "questions" aren't requests for correct answers -- they are social, trust-based exchanges where the relationship and the advisor's judgment are the actual product.
2. Two Kinds of Question-Answering That Keep Getting Conflated. The foundational distinction: transactional questions have a correct answer and an imminent need; relationship-based questions use the question as a pretext for social exchange, shared perspective, and felt understanding. AI handles the first well; it cannot substitute for the second.
3. AI Cannot Replace Trust and the Weight We Give to Respected Opinions. Even a correct AI answer carries less weight than advice from someone whose judgment you trust. This isn't irrational -- it reflects that the value in consulting, advising, and managing is partly in the relationship itself, not just the information delivered.
4. Strategy Consulting as the Illustrative Case. A concrete test domain: buyers of consulting aren't purchasing correct answers; they want advice from trusted people, catharsis in being heard, and help clarifying their own thinking. None of that is substitutable by an AI, regardless of output quality.
5. Human Factors Intensify in Procedural Organizations. An underappreciated corollary: in government and military contexts, which lack market feedback mechanisms, human trust and social organization become even more load-bearing, not less.
Opinion
It's a short, clear piece with a genuinely useful distinction at its center -- but it doesn't fully earn its conclusion.
The two-question-types framework is clean and rings true experientially. Most people have felt the difference between wanting an answer and wanting a conversation, and the observation that these get conflated in AI replacement debates is fair and underappreciated.
Where it falls short is in the leap from "relationship-based questions exist" to "therefore white collar work won't be replaced." The argument proves that AI can't fully substitute for trusted human relationships -- it doesn't prove that organizations will continue to pay for those relationships at current rates, or that AI won't restructure which human interactions are deemed worth paying for.
A client might still want a trusted advisor but find that one advisor supported by AI can now serve ten clients instead of three.
There's also an implicit assumption that the relationship-based component is dominant in most white collar work. That may be true in strategy consulting, but it's a significant empirical claim that the piece asserts rather than argues across the broader category of white collar work.
Comment by spaghetdefects 2 days ago