Ask HN: Can we just admit we want to replace jobs with AI?
60 points
6 hours ago
| 35 comments
I think it is best to just be straightforward about this.

With open models like DeepSeek-R1, Llama, and Qwen, closed-source models like o1, o3, and Claude, and even OpenAI's upcoming agentic Operator, knowledge work is going to be a white-hot focus of automation this year and next.

Can we just be real and say we aren't advancing humanity with AI; it is focused on automating away our jobs?

OpenAI even admitted that their definition of AGI is:

"By AGI we mean highly autonomous systems that outperform humans at most economically valuable work." (0)

I've known this for the longest time, but I still see and talk to researchers at AI companies who still believe they are furthering humanity with AI, with all this optimism about the future. While I like to be optimistic about it too, let's tell the truth plainly: most people haven't actually thought about what happens when AGI comes.

The more we admit this to ourselves, the more people can prepare for when AGI inevitably comes.

Otherwise everybody will be preparing to fail.

(0) https://openai.com/index/how-should-ai-systems-behave/

gnfargbl
5 hours ago
[-]
But how is an individual supposed to "prepare" for AGI?

The outcome of further automation will be to move even more capital into an even smaller number of hands. It's that increasing inequality which is the problem, not AGI.

reply
weatherlite
5 hours ago
[-]
You're right. It's tough, I'm not gonna tell you otherwise. I think there are 3 main things most people should try to do:

1) Mental preparation - understanding your current career is finite (be it 5 more years or 20, it's not gonna last forever) and broadening and deepening other things in your life such as family, friends and hobbies. I've contemplated the loss of my career hundreds of times already, if not thousands; I do it almost on a daily basis (not for hours, just a few moments of a passing thought). It's kind of the opposite of denial and it seems helpful for me to do it.

2) Financials - an obvious one. We should all accept that quite possibly our standard of living will be lower 10 years from now. It's not a certainty, but the likelihood is high enough that we need to be more frugal - which means both trying to save as much money as we can and learning to enjoy things that don't cost a lot of money.

3) Possible career pivot - there will be opportunities. Some parts of the labour market are starving for employees and I don't mean just the trades - there's a constant need for nurses, teachers, care etc. The problem is these jobs are notorious for burning their workers out; I don't have a solution for that.

reply
piva00
4 hours ago
[-]
> 3) Possible career pivot - there will be opportunities. Some parts of the labour market are starving for employees and I don't mean just the trades - there's a constant need for nurses, teachers, care etc. The problem is these jobs are notorious for burning their workers out; I don't have a solution for that.

These jobs are also not scalable and not directly profitable; they require money from other sources to even exist at a large scale. In this potential AI-dominated world where less money is being earned by individuals, and consequently less tax is being paid, who exactly is going to be paying for nurses, teachers, care, etc.?

reply
weatherlite
3 hours ago
[-]
Good question. My hunch is most functioning governments will step in and create these jobs - because they are badly needed. Even now at least half of economic activity is government created in many developed countries - so it will be more like 80%. But I have no crystal ball, I'm just speculating.
reply
piva00
3 hours ago
[-]
For that to happen we would need a paradigm shift in the main ideology behind most functioning governments: neoliberalism.

It's not in the current sociopolitical/economic zeitgeist for governments to step in and create jobs; since the 80s we've moved towards privatisation, and even very socially oriented governments like those in Western/Northern Europe are in on it. Reversing this flow will take another generation or two of people voting for it under a new ideology.

We've been fed neoliberal policies for way too long; the contemporary Western world has been molded by them. I don't believe such a huge shift will happen fast enough if AI does actually develop fast enough to replace jobs. We will live through a limbo of pain until newer generations fight against it, and usually that only happens when our collective pain is way above the uncomfortable threshold.

> Even now at least half of economic activity is government created in many developed countries

I wasn't aware of this fact. Where could I find statistics about it? (If you have sources handy; if not I can do my own research.)

reply
weatherlite
1 hour ago
[-]
As far as I know it's called government expenditure as a share of GDP: https://www.imf.org/external/datamapper/exp@FPP/USA/FRA/JPN/...

Look at countries like France, quite crazy.

reply
pizza
5 hours ago
[-]
Maybe something like dividends for all citizens? Either way, atoms will still matter where no number of bits will fix your problem. It's also possible that AI will introduce unimaginably profitable ways for non-technical people to give others value, though I wouldn't bet on it, and even if that does pan out I can't see it becoming particularly widespread.
reply
spiderfarmer
5 hours ago
[-]
Most people I know who learned a classic trade (plasterers, tilers, builders in general) are making a lot of money due to shortages that will not be resolved anytime soon because of structural problems in our educational system. That is my plan B. So I don’t worry about AGI.
reply
globalise83
5 hours ago
[-]
If you are right, a modest investment in an index fund of shares covering tech companies in the USA should generate enough returns over the next few years to retire in comfort.
reply
weatherlite
5 hours ago
[-]
They will have very good earnings if this happens, but I'm not expecting them to become 10x their current market cap in a few years. So yeah, I put some money in the index, but retiring comfortably sounds too good to be true to me.
reply
fragmede
3 hours ago
[-]
Sell all your unnecessary worldly possessions, buy a homestead, take up farming.
reply
gregjor
6 hours ago
[-]
Who hasn’t admitted this already, either outright or by their actions?

Going back to the industrial revolution, the people who control capital and production have openly sought to replace human labor with automation. Nothing new, just that this time it threatens “knowledge workers” and the techie HN crowd. If you hear someone claim that they want to “advance humanity” with automation they profit from, you can safely assume they don’t mean it sincerely.

reply
rustcleaner
5 hours ago
[-]
As for why people losing economic power is a bad thing, see CGP Grey's Rules for Rulers videos. AI is about to make a whole bunch of people superfluous, and superfluous people might as well be rhinos on a safari as the techno-war overlords cruise by in their jeeps. It means our societies will degrade into natural-resource-rich but skill-poor countries: disenfranchised poor hordes kept in check by the zookeepers.

Are you ready for your Brave New World? :^)

reply
Gud
2 hours ago
[-]
I would absolutely embrace AI if the means of production were owned collectively, not by a privileged few.

It seems to me that the path we are on will lead us closer to a Terminator-like future than to a positive one.

Our leadership are greedy fools.

reply
Tade0
5 hours ago
[-]
My position is that the demographic collapse in countries with a near 100% literacy rate is going to more than make up for any jobs replaced by AI.

It is said that AI will make 200 million jobs redundant. China's workforce alone shrunk by 80 million during the previous decade according to their 2020 census.

Overall I think we need to increase our efforts with AI, particularly to make it more energy efficient, as it's currently simply expensive to run, if we wish not to bear the consequences of having a globally ageing literate population.

reply
RachelF
5 hours ago
[-]
Yes, the idea of businesses is to reduce costs. Labor costs are a big expense.

It's harder to do than it looks. Offshoring to places with lower wages has kept wages for many white- and blue-collar workers in the West from growing. But it is hard to do correctly.

I suspect AI or AGI or AI with agents will take a longer time to actually work reliably than many suspect.

reply
graemep
4 hours ago
[-]
I agree, but, like offshoring, it does not have to work all that well to reduce demand by enough to have a significant impact on incomes.
reply
physicsguy
5 hours ago
[-]
It feels to me like there'll start to be people retraining sooner rather than later. If you feel your job is threatened, you're likely to look at options that can't be easily displaced. I did a Physics degree and PhD, and my 'backups' are to go do something in the very, very regulated nuclear industry (currently crying out for people in the UK) or to go into teaching.

I think the people who will be most easily displaced are those that don't have an additional combined skillset. I see a lot of software engineers with a CS background whose skillset is only writing software. I think people from mixed backgrounds are likely to do better in a big disruption since they've got another thing to jump to, perhaps in a different industry.

reply
oytis
4 hours ago
[-]
If an AI can do "most economically valuable jobs", retraining won't help. Unless you want to retrain for something that is not economically valuable enough to spend compute on, and that has a human body as a non-negotiable prerequisite.
reply
physicsguy
1 hour ago
[-]
I chose those examples specifically because it's difficult to replace a physical person there.
reply
jdietrich
5 hours ago
[-]
None of this is news to blue collar workers. We've spent the last 300 years replacing skilled manual workers with machines.
reply
oytis
4 hours ago
[-]
I understand why _they_ would want to replace jobs with AI, but not why _we_ would want it. The democratic world as we know it is largely based on labour, and especially intellectual labour, having a lot of leverage. If that is no longer the case, most people are not needed any more, and their opinion doesn't matter, as simple as that.

Not sure how you (as a worker) want to prepare for that - maybe by becoming depressed, quiet and atomized in advance. US leaders (political and big tech) seem to be preparing exactly for that now, but my assumption is you are not one of them.

reply
Agraillo
3 hours ago
[-]
Why can't other AI tasks also be in focus? Like having fun with some creativity. I remember that I once created a story with an LLM where we switched sides. I still remember some of the twists, and it's interesting that I can no longer attribute them to either of us :)
reply
EZ-E
4 hours ago
[-]
I still don't think AI will displace a lot of jobs. Am I the only one left? I'm sure it has had an impact on copywriting/translation, but I don't see a tsunami.
reply
akmarinov
5 hours ago
[-]
I thought everyone already knew this.

IT, doctors, lawyers will be mostly gone. Probably most office worker jobs that I’m not thinking of.

Best thing to do is become a nurse, that’ll be bulletproof until AGI makes progress on robotics.

reply
scarface_74
1 hour ago
[-]
How is AI going to replace doctors?
reply
beardyw
4 hours ago
[-]
If we assume that very many people will be unemployed there will need to be a complete change in political focus towards supporting society rather than the individual. That is, wealth will need to be directed to those who need it rather than those who can grab it. The USA is one of the least politically prepared for that, but contains the wealth to resolve it. We will live in interesting times.
reply
graemep
4 hours ago
[-]
> That is, wealth will need to be directed to those who need it rather than those who can grab it.

That is a huge political change that will be strongly resisted. Do you recognise the quote "to each according to his needs"? It is what you are suggesting, and the very opposite of the thinking that dominates the world (not just the US, these days).

> We will live in interesting times.

Certainly, but it will be very turbulent, and very likely violent. Even previously stable societies are clearly a lot less stable if you look at the political changes and the hostility between different political groups (the move from those on the other side being just "wrong" to being "evil" - "worse than Nazis" to quote a prominent British politician). I would say we live in scary times.

reply
beardyw
4 hours ago
[-]
> Do you recognise the quote

Yes, of course I did see the parallel, but I think raking over old quotes doesn't help keep a clear mind. Wikipedia - February Revolution describes Russia's failure to

...modernise its archaic social, economic, and political structures while maintaining the stability of ubiquitous devotion to an autocratic monarch.

Sound familiar?

reply
graemep
3 hours ago
[-]
I agree about not raking over old quotes, but people will. Any attempt to do what you are suggesting will be labelled "communism" or similar.

Of course communism is not the only value system to express that. I remember a Yes Minister episode (I cannot recall which one) in which someone quotes that line and Hacker guesses the source of the quote is Jesus. Not an unreasonable guess.

The problem is that whatever values you base the justification of the change on, they have to be radically different from that of the current political consensus.

> Wikipedia - February Revolution describes Russia's failure to

The problem is what that led to. Not something I want to live through, or want my children to live through.

reply
beardyw
3 hours ago
[-]
> what you are suggesting will be labelled "communism" or similar

I am definitely not suggesting communism. What I laid out was the problem and how, without insurrection, it could be mitigated.

reply
graemep
39 minutes ago
[-]
I am not saying it is communism. I am saying it will be labelled communism by those who oppose it.
reply
hiAndrewQuinn
6 hours ago
[-]
I don't think anyone serious was ever claiming they weren't trying to "replace jobs with AI", except maybe as a PR thing. Indeed this claim falls straightforwardly out of the numbers: human labor is the most expensive component of most industries in the developed world, so naturally we'd like a substitute. It's a much more obvious near-term target than, say, innovating over a decade or two to reduce the global price of commodity rice by a cent per kilogram.
reply
gunian
4 hours ago
[-]
It's just a new evolutionary/eugenic variable. The people who can get out, switch to trades and manual labor, learn to fish, minimize costs, etc. will pass the test and survive.

Then robots will become a thing and the same will happen to manual labor. I think the non-manual-labor working class was chosen first because they have demonstrated problematic sentiments to the powers that be.

A lot of people might die, or UBI may become a thing. Migration to less developed areas might happen, although I doubt the powers that be will allow cross-border movements; they love their fiefdoms.

The barons will always be there unless there is a Skynet-type event that wipes out humanity. Same shit, different day. If you're a peasant like me, and not a good dev, without any dev education or connections, you'll most probably die. It is what it is.

reply
xgstation
5 hours ago
[-]
From a company perspective, I think the answer is yes.

But from an individual perspective I don't think that is the case. Since AlphaGo was first released and beat world-class players, have all these players gone? Not really; it has even prompted more people to study Go with AI instead.

As a software engineer myself, do I enjoy using AI for coding? Yes, for a lot of trivial and repetitive work. But do I want it to take my coding work away entirely, such that I can just type natural language? My answer is no. I still need the dopamine hit from coding myself, whether for work or for hobby time, even if I am rebuilding some wheels other folks have already built, and I believe many of us are the same.

reply
Havoc
5 hours ago
[-]
> have all these players gone?

The guy who got beaten literally decided to retire immediately afterwards, explicitly because AI displaced him:

> On 19 November 2019, Lee announced his retirement from professional play, stating that he could never be the top overall player of Go due to the increasing dominance of AI.

I do get your point, though, that the overall player count is still fine.

reply
xgstation
4 hours ago
[-]
I can imagine how depressed Lee felt when beaten by AI for the first time, but looking at the bright side, we see Shin Jin-seo as the rising star of the AI era of Go, leveraging AI to help him train.
reply
smgit
5 hours ago
[-]
It's not going to happen overnight. But even if something radical happens, it will probably play out like Marshall Brain describes in Manna - https://marshallbrain.com/manna1

Not all bad.

reply
Espressosaurus
4 hours ago
[-]
The first half feels far too realistic.

The second half feels like wish fulfillment.

I honestly don't see how you get there from here.

reply
actionfromafar
5 hours ago
[-]
All bad except for a tiny enclave of freeholders in Australia or something?
reply
ergocoder
5 hours ago
[-]
Yes. Who doesn't want to admit that?

This is not just for businesses but for everything in life.

For example, we all buy robot vacuums, right? People even rearrange furniture to make sure it is compatible with the robot vacuum. Everyone wants them to do more and be more reliable.

reply
randoments
4 hours ago
[-]
In the whole history of the world, humans have always strived to automate their jobs, and yet here we are with a single-digit unemployment rate.
reply
downboots
4 hours ago
[-]
If the perceived value of laying you off exceeds the perceived value of keeping you employed... Seems like work and value have decoupled.
reply
laptopdev
5 hours ago
[-]
We don't align our current actions with AGI. Rather, we align our actions with a presumption of what we think AGI is to become (assuming some inevitability).

Some people believe AGI is imminent, others believe AGI is here now. Observe their behavior, calm your anticipation, and satisfy your curiosity rather than seeking to confirm a bias on the matter.

The tech is new in my experience, and accepting claims beyond my capacity to validate such a grand assertion would require me to take on faith the word of someone I don't know and have never seen, who likely generated such a query in the first place outside the context length of ChatGPT mini.

reply
sureIy
5 hours ago
[-]
The problem is that tomorrow someone might announce some technology that will revolutionize AI once again, like ChatGPT did only a couple of years ago. No one expected that. No one expected creative jobs to be taken by Midjourney.

We could hide behind "we can't predict the future" but it would be wise to get ready for that inevitability.

One day you will ask your computer to "open a word processor" and it will pull a fully-featured Word 2013 out of thin air. What will developers do then?

That day could be March 1st, 2025 or 2050. Many of us will likely still be in the jobs pool either way.

reply
jkhdigital
5 hours ago
[-]
All major technological advancements have eliminated some jobs and created others. In general, producing more with less human input is how society becomes richer. Is it a tragedy that we no longer need telephone switchboard operators, ice cutters, and lamplighters? There were surely countless mini-tragedies as the people who made their livelihoods that way struggled with stagnant wages and unemployment, but the next generation became free to pursue other more productive forms of work and accumulated greater wealth as a result.
reply
jatins
5 hours ago
[-]
> have eliminated some jobs and created others

genuinely asking: Do you see LLMs creating jobs? I see them taking 10 jobs to create one

Sure, there might be a world down the line where one does NOT need a job and everything is practically free because AI makes an effectively infinite supply of it. But the transition period could be quite painful.

reply
slooonz
5 hours ago
[-]
Yes, but this time the automation (AI) can immediately displace humans in the newly created, "more productive form of work" jobs too. Recursively.
reply
veunes
5 hours ago
[-]
The change isn't without hardship, though. For the people who lose jobs due to new technologies, the transition can be difficult. But I agree innovation may seem disruptive in the short term for some; in the long run, it's part of a broader process.
reply
versteegen
5 hours ago
[-]
> in the long run, it’s part of a broader process

Well put. And what is this broader process? Those who say new jobs will always be created as others become obsolete think that the broader process is standing still in a stream: job titles change, average wealth and comfort increase, but the structure of civilisation will never change. That we'll all still have jobs and won't become unemployable. Eternal stasis sounds absurd to me.

But I don't assume that the change that happens will be bad. And I don't think all jobs will go. I think we'll end up with a fraction of people employed and well off, such as tradespeople, while the majority will busy themselves with unpaid labour. I already do.

reply
Xenoamorphous
5 hours ago
[-]
> Is it a tragedy that we no longer need telephone switchboard operators, ice cutters, and lamplighters?

As a society? No. For those individuals that were caught up in the transition? Probably.

reply
weatherlite
5 hours ago
[-]
Well, if we have machines that can think, plan and reason, and also exhibit infinite amounts of empathy and patience and the IQ of a PhD physicist, wtf is there for us to do exactly?

Sure, we'll have work for people taking care of children and the elderly (and even this might start to fade away as humanoid robots become cheaper and more capable; staff losing it and beating up children is a frequent occurrence where I'm from), but what about the tens of millions of other jobs we all do?

I'm talking about the theoretical case where AGI is in fact achieved. For now - we're not there at all.

reply
slooonz
3 hours ago
[-]
> For now - we're not there at all.

Both OpenAI and Anthropic think it’s a question of a couple of years at most. Gwern, who has a good track record of predictions, thinks so too. We’re pretty much here.

I arrived at the same conclusion independently. Not because I’m a genius like Gwern and Could Work at a Big Lab if I Wanted, but because it is in fact pretty obvious. People focus way too much on the current limitations of current models, and not enough on their strengths. The core strengths of current LLMs are already superhuman (speed, memorization, ability to navigate long context) by a significant margin. Their overall ability is heavily constrained by their weaknesses (mainly planning; hallucinations are a non-story). There are known solutions to this. AlphaZero is "training to plan" and predates GPT-2; you just have to adapt it to the LLM paradigm. What did you think AlphaProof was? An idle experiment just for fun?

The only hope now is that there’s some unidentified and unforeseen weakness, where LLMs currently sit well below human level but it has been obscured by the lack of planning, that significantly limits the capabilities of the planned models in the same way that planning limits current models, and that has no obvious solution. But at that point that’s just Copium.

Or that we collectively fucking wake up, realize that "we’re pretty much there" and "we don’t want this", and do something. But at that point that’s just Copium too: the governing elites got the memo and are okaying this (cue Andreessen in the Trump camp, and the last Biden EO ordering the fast-tracking of AI-scale datacenters in the Democrats' camp).

The big issue is that people who don’t want this (the vast majority of people I believe) think we’re not here, and people who think we’re here want this (the labs, the two big parties).

It does not help that the usual human irrationality kicks in heavily on this topic. "I don’t like AI and its consequences" => "I enjoy disparaging AI and consuming content disparaging AI" (look at those 10 epic fails of ChatGPT!) => "I’m going to inflate their weaknesses and downplay their strengths" (I can’t believe it can’t count the r’s in strawberry!) => "Yes, AGI is Very Bad News but it’s not there, have you looked at where we’re at?". Expect a lot of Pikachu Faces in the following months/years.

2028 was the initial timeline given by, I believe, Shane Legg. He recently said he’s on track. You’d better give some credence to that, or wake up to a very, very nasty surprise very soon.

reply
weatherlite
3 hours ago
[-]
You could be right. I don't have enough knowledge in the field and I'm not sure 'trend predictions' work that well. Why aren't there autonomous vehicles already? Why does Waymo have to deploy them city by city? When I learned to drive, I learned to drive - you didn't have to teach me how to drive in L.A., then San Francisco, then Phoenix, etc. Is it only about Waymo being careful, or are these things not yet good enough at generalizing? We will see soon enough. Personally I've learned to let go - I don't have a dog in this fight. Whatever has to happen will happen, and there are pros and cons for all of us in this (con: losing your job. pro: the best health care you can imagine. the best education for your kids. etc etc).
reply
slooonz
2 hours ago
[-]
> Why aren't there autonomous vehicles already?

There are a lot of reasons. Not in order of importance, just the order in which they come to mind:

1. While Real World Interactions (robotics, autonomous driving, factory automation, …) are somewhat parallelizable with Purely Digital AGI (games, text, videos, programming, …), it is way easier to do AGI first and Real World Interactions second. This is why you see the Big Brains and the Big Money going to Anthropic/DeepMind/OpenAI. If you have AGI you have Waymo. So predictably, OpenAI/DeepMind/Anthropic will go faster than Waymo.

2. The source of the difficulty gap is easy to understand. It is hard to parallelize and scale experiments in the real world. It is trivial in the digital world; it just takes More Money. AlphaZero is an AI engine that played tens of millions of games against itself, eventually reaching super-human capabilities in chess and Go. Good luck doing that with robotics/cars.

3. "I learned to drive faster": It is unknown how many bits of priors evolution has put in the human brain (we don’t even know how genes encode priors; a fascinating question). It is certainly not zero. Evolution did the hard work of parallelizing/scaling the "learning to interact with the world" problem before you were even born. Hell, most of the work on this problem was probably already completed by the start of the mammalian line. No wonder you find this easy and Waymo finds it hard. It is not that the problem is inherently easy and "how bad must AI be to fail at this simple problem?" It is that you are custom-built for it.

4. We have higher standards for AI than humans, and regulation reflects that.

> con: losing your job. pro: the best health care you can imagine. the best education for your kids. etc etc

The con is that humanity is going to lose pretty much any influence on the future. "Losing your job" is a pretty bad way of picturing it.

It is a frustrating topic. Let me try to explain the stakes to you in a few words, and let’s start with this image:

https://en.wikipedia.org/wiki/German_revolution_of_1918%E2%8...

It’s a communist militia in Berlin in the early stages of the Weimar Republic. The specifics don’t matter; you don’t have to judge who was right or wrong. I could have taken a picture of the proto-Nazis, or the SPD, or anyone really. The story is the same.

Why are those humans here? In the cold, in a potentially dangerous situation? What’s going on in their heads?

"This is an important moment. I have to be the Best Person I can be, take the Best Actions I can take. If I am right, and I do this right, my actions will help better my future. It will help my family. My neighbor. My community. The World. My actions and my choices here Matter. I Matter".

Those two words, "I Matter", are, I believe, a fundamental requirement of what it is to be human. To my great surprise, there are people who actually actively disagree with that. "Mattering does not matter very much". Maybe you are one of those; I don’t know, I don’t know you. Those people should indeed accept and welcome the AGI. No human will matter anymore, but who cares? Great healthcare, great education, great entertainment.

But AGI being way better in all cognitive domains: Business/Economy, Policy/Politics/Governance, Science, Arts, … means exactly this: humans will no longer have any place in those domains, and this "I Matter" feeling will be lost forever.

EDIT: I forgot a point:

> and I'm not sure 'trend predictions' work that well

It’s not trend prediction. It’s engineering. Roadblocks have been identified. Solutions to those roadblocks have been identified. Now we are just in the phase of "implement those solutions". Whether those solutions are sufficient to go all the way to AGI is a bit more speculative, but the odds are clearly on the "yes" side.

reply
weatherlite
40 minutes ago
[-]
> Those two words, "I Matter", are, I believe, a fundamental requirement of what it is to be human. To my great surprise, there are people who actually actively disagree with that. "Mattering does not matter very much". Maybe you are one of those; I don’t know, I don’t know you. Those people should indeed accept and welcome the AGI. No human will matter anymore, but who cares? Great healthcare, great education, great entertainment.

Yes, there will be a crisis of meaning; in fact, in most secular societies there already is one (how much meaning can you derive from preparing a balance sheet or handling customer support tickets?). Some societies will deal much better with unemployment - mostly religious societies. If we can create societies of abundance (where you get most services pretty much for free due to AI) I think we will solve the crisis of meaning with family, friends, hobbies and really good next-generation TV and computer games.

In the grand scheme of things most of us understand we don't matter at all (at least I don't think I matter in any significant way, nor does humanity as a whole imo), but we do need a reason to get up in the morning, somewhere to go and interact with society. We matter to our families and close friends if it makes you feel better.

reply
scarface_74
1 hour ago
[-]
> We have higher standards for AI than humans, and regulation reflects that.

This is the most likely reason that we won’t see widespread adoption of autonomous vehicles in the next 30 years.

Around 120 people die in automobile accidents every day and people shrug. But if 12 people died in accidents caused by autonomous vehicles in a year (yes, I changed units), they would quickly be banned.

reply
purplethinking
5 hours ago
[-]
Yes, of course. But it's also inevitable and overall good. If we stagnate then we are headed towards certain doom via climate change, nuclear war, an asteroid, a volcanic eruption leading to cooling and crop failure, or a pandemic. If we are to survive and thrive long term we need to become true masters of our environment, and that means we need to be smarter, stronger and more productive.

I think this is my main disconnect with the pessimists, I don't see "stop AI progress" as a valid option anyway.

reply
slooonz
3 hours ago
[-]
Yes, collective suicide means we no longer have any Climate Change issue. Calling this "a solution" stretches the definition of the word tho.

> If we are to survive and thrive long term we need to become true masters of our environment, and that means we need to be smarter, stronger and more productive.

Yes. I fundamentally agree with that vision. We want this for us. We/Us humans.

If we build AGI we won’t thrive. We won’t be smarter. We won’t be stronger. We won’t be more productive. We won’t be masters of our environment. The AI will be all of this. We’ll just be relegated to passive, helpless spectators.

reply
pydry
5 hours ago
[-]
AI does fuck all about climate change or nuclear war or any other existential threat.

I'm not all that bothered about stopping AI progress, but if capital is overwhelmingly directed towards replacing blue- and white-collar workers with machines while climate change and every other existential threat is starved of capital, then as a species we are doomed.

It's not like this is being done for the benefit of humanity either. Ask those same profit-obsessed oligarchs (or their media outlets) who are effusive about AI's ability to create a better future what will happen if the retirement age is raised in response, or even kept the same. They'll 180 on their techno-jobs-killing optimism so fast you will get whiplash: https://www.forbes.com/sites/dandoonan/2024/01/30/demographi...

reply
denkmoon
5 hours ago
[-]
Do we, so far, have any economic analysis or research showing AI is actually reducing jobs?
reply
raylad
4 hours ago
[-]
Throughout human existence there have been only a few means of distribution of resources.

In hunter/gatherer days within small bands, and also to a large extent within families (but perhaps a bit less now) the method was Generalized Reciprocity[1], where people basically take care of each other and share. This was supported by the extremely fertile and bountiful forests and other resources that were shared between early people.

Later, the large "water monopoly" cultures like ancient Egypt had largely Redistributive economies, where resources flowed to the center (the Pharaoh, etc.) and were distributed again outwards.

Finally we got Market Exchange, where people bargain in the marketplace for goods; this has been able, to a greater or lesser degree, to distribute resources efficiently for hundreds of years. Of course we have some redistributive elements like Social Security and Welfare, but these are limited.

Market Exchange relies now on basically everyone having a Job or another means to acquire money. But with automation this breaks down because jobs will more and more be taken by the AIs and then by robotic AIs.

So only a few possibilities are likely: either we end up with almost everyone a pauper and dole payments increased just up to the point where there's no revolution, or we somehow end up with everyone owning the AI resources and their output, filling the role that the forests and other ancient resources played in hunter-gatherer days, and everyone can eventually be wealthy.

It looks as if, at least in the USA we are going down the path of having a tiny oligarch class with everyone else locked out of ownership of the products of the AI, but this may not be true everywhere, and perhaps somehow won't end up being true here.

[1] Stone Age Economics by Marshall Sahlins https://archive.org/details/stoneageeconomic0000sahl/page/n5...

reply
graemep
4 hours ago
[-]
"It looks as if, at least in the USA we are going down the path of having a tiny oligarch class with everyone else locked out of ownership of the products of the AI, but this may not be true everywhere, and perhaps somehow won't end up being true here."

We are already headed towards that in every major economy, and all the others I know of too. Wealth and power started concentrating before AI, so it's not just AI.

The group that has the wealth and power has the power to block change. They can distract the hoi polloi with scapegoats (e.g. immigrants), culture wars, doomscrolling, day-to-day survival and a lot more.

The current trajectory is towards a global oligarchic class.

reply
postepowanieadm
5 hours ago
[-]
Of course we do, just not our own jobs.
reply
Throw83949489
5 hours ago
[-]
> we aren't advancing humanity with AI; it is focused on automating away our jobs

We are doing both!

You are assuming people like to work, and jobs are something good. Without jobs, people will have less stress and more free time. They will spend more time on leisure activities, with their families, and so on.

Also, jobs contribute to carbon emissions: we have to maintain cities and large offices, and commuting by car produces CO2...

reply
sapphicsnail
5 hours ago
[-]
Companies aren't going to assign fewer work hours and still pay the same wages. A small group of people will become even richer and everyone else will have to work just as hard as before.
reply
karles
5 hours ago
[-]
How will they pay the bills when the 1%-hoarders are keeping all the profits for themselves, and now that the Oligarchy is in full reign in the US, taxes and "big bad socialism" are out of the question?
reply
graemep
4 hours ago
[-]
Dole, servants to the oligarchy, policing and security (which will be more needed with a large discontented class)....
reply
Throw83949489
5 hours ago
[-]
My country had socialism for 40 years. It is highly overrated.
reply
karles
2 hours ago
[-]
Come to Scandinavia and see how it's done right.
reply
usrnm
5 hours ago
[-]
The whole idea behind computers has always been replacing people; it's the foundation of our industry. From the very first scientific calculations that used to be performed by people before computers, replacing human labour is what allowed IT to grow so fast and so big. Most people on this site enjoy their fat paychecks because one person writing code can replace thousands of people who used to do stuff manually decades ago, so drop the shocked Pikachu face. AI is just another tool for doing what we've been doing forever.
reply
JimDabell
4 hours ago
[-]
Not just computers, technology in general. Humanity has spent thousands of years inventing technology to reduce human effort. We always find new jobs to do because we want to achieve bigger and better things. Arguing against AI because it takes jobs is essentially an argument against virtually all technology.
reply
revskill
5 hours ago
[-]
If you take automation and AI to the extreme, the world only needs ONE person to manage the robotic AI that does the rest of production. What else do you need? More people?
reply
wickedsight
5 hours ago
[-]
I don't think we're replacing anything. Just looking at my own workplace, we could have multiple-X increases in productivity and still have enough work to do. In IT, for example, we're building nice software, but some bigger projects have been running for years and are nowhere near done. Accelerating this would mean we can do more projects with the same budget, not that we will slash people and keep doing the same number of projects.

Sure, this won't be true for every role and organization, but for many this will definitely be true.

reply
globular-toast
4 hours ago
[-]
What we need to admit is why we are so scared of this. Do you want to work? If nobody has to work then we are all permanently on holiday. Isn't that good? What are you afraid of here? We just get to spend all day making or finding things fun. Are people afraid because they have no meaning in their lives other than work? Do you need a boss? Are you scared because everyone will be equal and you won't be "special" any more? What is it? I'd really like to know specifically what people are afraid of.

Please just think about it a bit. Everyone seems to be thinking "I'll lose my job!" but what you need to be thinking is "everyone loses their job".

reply
ramblerman
4 hours ago
[-]
> but what you need to be thinking is "everyone loses their job".

People realize the end goal, but it's naive to think that will happen overnight.

The road there could get really messy, with a lot of wealth redistribution while a few still hang on to their jobs. Not to mention the truly dystopian scenario where a few companies hold all the compute (and power).

reply
627467
5 hours ago
[-]
Can we just admit many jobs either have very crappy parts to them or are entirely crappy, and wouldn't it be nice if that could be replaced with technology?
reply
listenallyall
4 hours ago
[-]
Will technology replace your income?
reply
globular-toast
3 hours ago
[-]
My "income"? As in, the food going into my belly? The warmth of my house etc? We are already totally dependent on modern technology for these things. When was the last time you foraged or lit a fire?
reply
concordDance
4 hours ago
[-]
> Can we just be real and say we aren't advancing humanity with AI; it is focused on automating away our jobs?

These are the same thing.

Automation does advance humanity. The reason for the current world prosperity has been lots of automation.

(There is the separate and much more concerning risk of humans going the way of the horse, but that does not seem to be what you are concerned about)

reply
qrsjutsu
6 hours ago
[-]
> the more people can prepare

Nope. People will prepare as much as engineers care. But engineers don't care. Educating the people is tedious. It's easier to "manipulate"/direct them, which is the job of representatives, who are about status and money and being groomed by the industries, who are exclusively about money.

People are fine without AGI until they are not. That's another 15 years at least.

If you want to worry, worry about local solutions to climate change mitigation where you need old school manpower, big machines, shovels and logistics.

reply
cadamsdotcom
5 hours ago
[-]
Let’s chat about lawnmowers.

There was a time when cutting grass was done with scythes, long specially shaped blades. All day back and forth swinging the blade to get the grass just the perfect length. Swing, take a step, swing. It was backbreaking labor in the sun. So lawns were probably quite expensive and having your own was probably pretty flashy, or maybe they were public, provided by the state to the people, or maybe both.

Along comes the string of inventions that led to lawnmowers. Now, anyone can mow grass in an afternoon. Gone are the lawn-scything jobs! Did the sky fall then?

Of course we know how this played out. Some lawn-scythers lamented the loss of their work, but they’re forgotten to history. Other clever lawn-scythers went and trained as lawn-mowers, while a few even cleverer ones went and became lawnmower mechanics, a few became engineers so they could start to design lawnmowers, and some started lawnmower factories and employed plenty of workers making way more lawnmowers than anyone thought possible.

Lawnmowers aren’t free but they’re super cheap and getting cheaper all the time, they’re abundant, and the real cost now is labor, which is going up all the time. So what do you do? You pay a lawnmowing person to take care of your lawn while you work your high paid office job.

Whenever someone says AI is coming for the jobs, ask them which exact AI model is coming for the jobs; which tool built by which startup it is that’s coming for the jobs, and if so why are they hiring?

reply
Havoc
5 hours ago
[-]
Thing is, when manual jobs got replaced we escaped to knowledge work. This time there is no equivalent.

These lawnmower/textile worker analogies fall flat because this time might actually be different

reply
cadamsdotcom
4 hours ago
[-]
Knowledge work is too broad a definition. There’s immense potential to reduce the “creative drudgery” in work - think background art, music for malls, first drafts of company comms…

Some knowledge work has more leverage than other knowledge work. We don’t have a term for this distinction right now. But the low-leverage work will get automated, leaving us with more pleasant, higher-level, more creative jobs. This is just a continuation of the trend that saw most people leave the factory and sit down at air conditioned desks.

Knowledge workers can keep up if they stop thinking of themselves as workers and start thinking of themselves as automators of knowledge work.

reply
Havoc
1 hour ago
[-]
AI will create new jobs. Good jobs too. But not enough to absorb the losses volume-wise, and they won't be accessible to all.

Some office Karen who calls the Google homepage "the internet" and struggles to use the shift key to capitalize (true story) is not going to automate anything whatsoever in a world where even tech people struggle to keep up.

Or put differently: yes, jobs, but at a different level entirely. So we're still going to be stuck with a huge societal problem of displaced people with no place to go.

> But the low-leverage work will get automated

Is this accurate?

In my mind, whether something can/will get automated is not linked to the level of leverage it has.

reply
scarface_74
1 hour ago
[-]
We all didn’t escape to knowledge work. Half the MAGA movement was the result of “rural America” not being able to adjust to the loss of factory work and manual labor.
reply
Havoc
1 hour ago
[-]
Yeah that’s fair. Some got left behind.

My point is there is even less of a pressure relief valve this time.

reply