The way I'd read this sentiment is that the arrangement of society is ultimately arbitrary, and if we could only choose a different system we could be truly free. I'm not sure if I'm reading you correctly or not. That said, my impression is that people will not really be able to get away from something that looks like traditional jobs. The core traits seem to be group dynamics, hierarchical competition, and status attainment -- all in contexts where neither resources nor opportunities for status are infinite.
We've already had sufficient technological advances such that people would not need to do much labor, but functionally speaking I just don't think people can organize themselves into _any_ possible arrangement. I think the potential arrangements that could exist are limited by nature.
The expectation for everyone to retrain and do something else is not necessarily reasonable, especially in an environment that does not have much of a social support system for education, training, and extended periods away from the workforce.
And we all know that the market doesn't magically make replacement jobs better or the same as the previous ones.
I posit that at no point in history until now has technology allowed us to grow and distribute food for free (in terms of both financial cost and labour time). With the rise and convergence of AI, robotics, low-cost renewable energy, advances in optimal light-to-biomass conversion, diminishing costs of vertical farms, and self-driving vehicles, we have within our reach a way to produce food at essentially no cost.
Think through what would happen to society and our economy if food was free for anyone, anywhere. Think about the meaning of work.
If these ideas intrigue you, beta readers are wanted; see my profile for contact.
I promise you, as an anarchist agitator, that this sentiment is unbelievably new, even just within the last couple of years, and it is precisely what usually happens prior to actual direct action.
My fellow anarchists hate the fact that Donald Trump did more for anarchist-socialist praxis than every other socialist writer in history.
https://www.bloodinthemachine.com/p/i-was-forced-to-use-ai-u...
I bookmarked the series which looks exactly like what everyone in tech is saying ISN’T happening:
https://www.bloodinthemachine.com/s/ai-killed-my-job
But I’m sure somebody will blow this off as “it’s only three examples and is not really representative.”
But if it is representative…
“then it’s not as bad as other automation waves”
or if it is as bad as other automation waves…
“well there’s nothing you can do about it”
Anecdotally, I was in an Uber yesterday on the way to a major metropolitan airport and we passed a Waymo. I asked the Uber driver how he felt about Waymo and Uber collaborating, and whether he felt it was a threat to his job.
His answer was basically “yes it is, but there’s nothing anybody can do about it; you can’t stop technology, it’s just part of life.”
If that’s how the people being replaced feel about it, while they still continue to do the things necessary to train the systems, then there will assuredly be no human future (at least not one that isn’t either subsistence or fully machine-integrated), because the people being replaced don’t feel they have the capacity to stand up to it.
I'm pretty sure we'll survive.
While there are issues that are AI-specific, I don't feel this is one of them. Job loss happens for many reasons, of which AI is just one. In turn, I think this means that the way to address the problem of job loss should not be AI-specific.
If it turns out that AI does not create more jobs than are lost, that will be a new thing. I think that can happen, but on a longer timeframe.
When most jobs can be done by AI, we will need a societal change to deal with that. People will need a livelihood, not necessarily a job. I have read pieces nearly a hundred years old saying this, and there are almost certainly much earlier writings that identify this as something that needs to be addressed.
There will undoubtedly be a few individuals seeking to accumulate wealth and power who aim simply not to employ humans. I don't think that can happen at a systemic scale, because it would be too unstable.
Two of the things that support wealth inequality are: 1) people do not want to risk what they currently have, and 2) they are too busy surviving to do anything about it.
A world where people lose their jobs and have no support results in a populace with nothing to lose and time to act. That state would not last long.
We change the world. It's not happening to you; you're doing it. You're doing it right now with your parent comment - you're not an observer on the sidelines, you're in the thick of it, doing it. Your every action - my every action - has consequences. Who will we be in our communities and societies?
> I have read pieces nearly a hundred years old saying this
You can read pieces 100 years old talking about famine, polio, Communist and fascist dictatorships, the subordination of women, etc. We changed the world, not by crying about inevitability but with vision, confidence, and getting to work. We'd better, because we are completely responsible for the results.
Also, inevitability is a common argument of people doing bad things. 'I am inevitable.' 'Human nature is ...' (nature being inevitable). How f-ing lazy and utterly irresponsible. Could you imagine telling your boss that? Your family? I hope you don't tell yourself that.
In this instance, and probably most instances of art/craft, copywriters need to figure out what is creative again, because what is considered "creative" has changed.
I could also see this being the journey that AI customer support took, where all staff were laid off and customers were punted to an AI agent, but then the shortcomings of AI were realized and the humans were reintroduced (albeit to a lesser degree). I suspect the pendulum will swing back to AI as the memory problems are resolved though.
The sad part is that the managers deciding to use AI are the ones who rarely understand what good public communication is - that's why they were hiring someone to help them with it.
With AI they get some text that seems legit, but the whole process of figuring out the why and how is simply skipped. It might sometimes work, but it's doubtful it builds knowledge in the organisation.
A problem I have with Brian Merchant's reporting on this is that he put out a call for stories from people who have lost their jobs to AI and so that's what he got.
What's missing is a clear indication of the size of this problem. Have a small number of copywriters been affected in this way, or is it endemic to the industry as a whole?
I'd love to see larger scale data on this. As far as I can tell (from a quick ChatGPT search session) freelance copywriting jobs are difficult to track because there isn't a single US labor statistic that covers that category.
This seems like an inherently terrible way to source a story. Not only are you unlikely to know whether you failed to find work because an AI successfully replaced you, but a call like this is likely to attract the most bitter people in the industry looking for someone to blame.
And, btw, I hate how steeply the quality of online content has obviously crashed. It's very obvious that AI has destroyed most of what passed as "reporting" or even just "listicles". But there are better ways to measure this than approaching it from a labor perspective, especially as these jobs likely aren't coming back after private equity's slash-and-burning of the industry.
It doesn't tell the whole story though. That's why I always look for multiple angles and sources on an issue like this (here that's the impact of AI on freelance copywriting.)
But it’s probably a great way to create a story that generates clicks. The people who respond to calls like this one are going to come from the extreme end of the distribution. That makes for a sensational story, but also for a story that doesn't represent the reality most people will experience, but rather the worst case.
But we’re also seeing a lot of schlock…
Hilariously naive.
Assuming you understand what I meant: As for being naive, that's hardly true; my opinion comes from experience. When the bubble burst in the early 2000s, you saw a ton of developers looking for work. This pushed salaries down, even for senior and advanced developers.
Once AI can write proper, compelling, converting copy, then I’ll change my mind.
You're probably using the free lobotomized versions of LLMs, and shooting a one-off short ambiguous prompt and wondering why it didn't turn out the way you imagined.
Meanwhile people spend hundreds of dollars on pro LLM access and learn to use tool calling, deep research, agents and context engineering.
https://www.bloodinthemachine.com/p/i-was-forced-to-use-ai-u...
https://news.ycombinator.com/item?id=46264119
Thanks, and thanks for bringing the article to wider attention.
I believe that good, skillful writing, drawing, or coding by a human who actually understands and believes in what they're doing can really elevate the merely "good" to excellent.
However, when I think about the reality of most corporate output, we're not talking about "good" as a baseline level that we are trying to elevate. We're usually talking about "just barely not crap" in the best case, to straight up garbage in maybe a more common case.
Everyone understands this, from the consumer to the "artist" (perhaps programmer), to the manager, to the business owner. And this is why using AI slop is so easy to embrace in so many areas. The human touch was previously being used in barely successful attempts to put a coat of paint over some obvious turds. They were barely succeeding anyways, the crap stunk through. May as well let AI pretend to try, and we'll keep trying until the wheels finally fall off.
The CPS samples 60k households per month to represent more than 150 million workers. Households stay in the sample 4 months, out 8, back 4.
Copywriters will get smoothed out in the aggregate, and the definition will mask this. Even if you work one hour, you are technically employed. If you are not actively looking for work for more than a month, you are also not technically unemployed.
Unemployment data is a lagging indicator for detecting recessions, not early technological displacement in white-collar niches.
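To make the 4-8-4 rotation concrete, here is a toy sketch (my own illustration, not official BLS code; the real design also involves rotation groups and weighting) of which months a household is interviewed after entering the sample:

```python
def cps_interview_months(entry_month: int) -> list[int]:
    """Months (indexed from entry) in which a household is interviewed
    under the CPS 4-8-4 rotation: in sample for 4 months, out for 8,
    then back for 4."""
    first_wave = [entry_month + i for i in range(4)]        # in for 4 months
    second_wave = [entry_month + 12 + i for i in range(4)]  # out 8, back for 4
    return first_wave + second_wave

# A household entering in month 0 is interviewed in months 0-3 and 12-15.
print(cps_interview_months(0))  # [0, 1, 2, 3, 12, 13, 14, 15]
```

So any single household contributes only eight monthly observations spread over sixteen months, which is part of why a thin occupational niche gets smoothed out in the aggregate.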
The human part, turning it from slop to polished, becomes the most important part of the work, and then (and in any case) should be paid at human rates.
They can actually just hire the worst of you (who will do unpaid overtime, and let you call him a dummy when you're upset), because it's not a big deal that he's only 5x as fast as you used to be compared to your 10x as fast as you used to be. They can't even attract that much business now because the lowest end of the market completely disappeared and is doing it at home by themselves.
Prepress/typesetting work went from a highly specialized job that you spent years mastering and could raise a family on to a still moderately difficult job that paid just above minimum wage, within a single generation, just due to Photoshop, Illustrator, and InDesign. Those tools don't even generate anything resembling a final product; they just make the job digital instead of physical. In the case of copywriting, AI instantly generates something that a lazy person could ship.
"Gamblers generate slop, businessmen sell it as 'AI-powered.'"
Something important is missing.
That is, if you're selling razor blades, you want the handle and the shaving cream to be cheap. Likewise, if your business is turning slop into polish, you want the slop to be cheap. And AI makes it much cheaper.
I can see some limited scenarios in up-and-coming industries or strategically important industries where government job programs could at least be argued for.
The copywriting industry is clearly not either of those.
Look at how things went for the "learn to code" workforce. They were told that software would be a valuable skill to have, and sunk a lot of time and money into frontend coding bootcamps. The job market in 2025 looks very different with Sonnet 4.5, which is particularly good at frontend code. What skills would you tell all those copywriters to go retrain in? How confident are you that they won't be useless in 10 or 15 years? Maybe you can say they should have trained in other fields of software, but hindsight is 20/20.
I am not saying automation is bad, or that the jobs we have today should be set in stone and never change. But there will be society-level ramifications if we take some significant fraction of the workforce and tell them they're out of a job. Society can absorb the impact of a small fraction of the workforce losing their jobs, but not a large one.
No, it's not, and the steep decline in quality of writing has reflected this. The industry is just committing suicide.