At the company where I work (one of the FAANGs), there is suddenly a large number of junior IC roles opening up. This despite the trend of the last few years to only hire L5 and above.
My read of the situation:
- junior level jobs were sacrificed as cost cutting measures, to allow larger investment in AI
- some analysts read this as “the junior levels are being automated! Evidence: there is some AI stuff, and there are no junior roles!”
- but it was never true, and now the tide is turning.
I’m not sure I ever heard anybody in my company claim that the dearth of junior openings was due to “we are going to automate the juniors”. I think all of that narrative was external analysts trying to read the tea leaves too hard. And wannabes like Marc Benioff pretending to be tech leaders, though that's a helpful reminder that Benioff is simply “not serious people”.
The expectations for juniors, and how seniors work with them, will certainly change, but it's way too early to be making doomsday predictions.
Of course, that's easy for me to say when I'm not the one who just spent thousands of dollars and 4 years of their life to land in an environment where getting a job is going to be challenging, to say the least.
Maybe there was some idea that if AI actually solved software engineering in a few years, you wouldn't need any more SWEs. The industry is moving away from that idea this year.
But a big part of it to me is looking at the job data[0]. If you look at dev postings over this period, you can see hiring peaked during the pandemic boom in early-to-mid 2022, but postings are currently lower than in any other industry.
Tech loves booms and busts, with hiring and everything else. But more than anything the tech industry loves optics. The market has rewarded the industry for hiring during the pandemic and in the past year it has rewarded them for laying people off "because AI". And as the new year comes around they'll get rewarded for hiring again as they "accelerate development" even more. Our industry is really good at metric hacking and getting those numbers to keep going up. As long as it looks like a good decision then people are excited and the numbers go up.
I think the problem is we've perverted ("over-optimized") the market. You have to constantly have stock growth. The goal is to become the best, but you lose the game by winning. I think a good example of this is from an article I read a few months ago[1]. It paints AWS in a bad light, but if you pull out the real data you'll see AWS had a greater increase in absolute users than GCloud (you can also estimate this easily from the article). But with the stock market it is better to be the underdog with growth than the status quo with constant income[2].
What a weird way to optimize our businesses. You are rewarded for becoming the best, but you are punished for being the best. Feels like only a matter of time before they start tanking on purpose because you can't go up anymore, so you need to make room to go up[3]. I mean we're already trading on speculation. We're beyond tech demos pushing stock up (already speculative) and now our "demos" are not even demonstrations but what we envision tech that hasn't been built to look like. That's much more speculative than something that is in beta! IDK, does anyone else feel like this is insane? How far can we keep pushing this?
[0] Go to "Sector" then add "Software Development" to the chart https://data.indeed.com/#/postings
[1] https://www.reuters.com/business/world-at-work/amazon-target...
[2] Doesn't take a genius to figure out you'd have made more money investing $100 in GCloud than $100 in AWS (in this example). The percentage differential is all that matters, and measuring by percentage growth punishes having a large existing userbase. You have double the percentage growth going from 1 user to 100 than from 10 million to 500 million, yet any person who isn't severely mentally incapacitated would conclude the latter is a better business (see the sketch after these footnotes).
[3] Or at least play a game of hot potato. Sounds like a collusion ring in waiting. e.g. AWS stagnates, lets Azure take a bunch of users, Azure stagnates and users switch to AWS. Gives both the ability to "grow" and I'm sure all the users will be super happy with constantly switching and all the extra costs of doing so...
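To put footnote [2]'s arithmetic in code, here's a minimal sketch (Python; the numbers are the illustrative ones from [2], not real AWS/GCloud figures):

    def pct_growth(start_users: int, end_users: int) -> float:
        """Percentage growth from start_users to end_users."""
        return (end_users - start_users) / start_users * 100

    small = pct_growth(1, 100)                   # 9,900% growth on 99 new users
    large = pct_growth(10_000_000, 500_000_000)  # 4,900% growth on 490M new users

    print(f"1 -> 100 users:    {small:,.0f}% growth")
    print(f"10M -> 500M users: {large:,.0f}% growth")
    # The tiny product shows roughly double the percentage growth despite
    # adding about five million times fewer users.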
"We’ve seen this act before. When companies are financially stressed, a relatively easy solution is to lay off workers and ask those who are not laid off to work harder and be thankful that they still have jobs. AI is just a convenient excuse for this cost-cutting. "
Seems to me the companies are mostly in a holding pattern: sure, if an important project needs more bodies, it's probably okay to hire. I suspect that lots of teams have to make do until further notice.
Are some teams using AI instead of hiring junior engineers? I don't think there's any doubt about that. It's also a trial period to better understand what the value-add is.
Based on what I've heard from engineers on various podcasts, almost all of them describe the current level of AI agents as equivalent to a junior engineer: they're eager and think they know a lot, but they also have a lot to learn. But we're getting closer to the point where a well-thought-out Skill [1] can do a pretty convincing job of replacing a junior engineer.
But at the rate AI is improving, a company that doesn't adopt AI for software engineering will be at a competitive disadvantage compared to its peers.
[1]: https://www.anthropic.com/engineering/equipping-agents-for-t...
Meta (Facebook)
2022: ~11,000 employees (13% of workforce)
2023: ~10,000 employees plus 5,000 open positions eliminated
2024: Multiple smaller rounds totaling ~100-200 employees
2025: ~3,600 employees (5% of workforce, performance-based cuts)
Total: Approximately 24,700-25,000 employees
Amazon
2022: ~10,000 employees
2023: ~17,000 employees (split between multiple rounds)
2024: Smaller targeted cuts
2025: ~14,000 employees announced
Total: Approximately 41,000+ employees
Google (Alphabet)
2023: ~12,000 employees (6% of workforce)
2024: Multiple smaller rounds, hundreds of employees
2025: Several hundred in Cloud division and other areas
Total: Approximately 15,000-20,000 employees
Apple
Apple has been an outlier among FAANG companies:
2022-2023: Minimal layoffs (hiring freeze instead)
2024: ~700+ employees (primarily from canceled Apple Car project and microLED display teams)
2025: Small cuts in sales and other divisions
Total: Approximately 800-1,000 employees (significantly less than peers)
Netflix
2022: ~450 employees across two rounds (150 + 300)
2023: Smaller targeted cuts in animation and drama divisions
2024-2025: Minimal additional cuts
Total: Approximately 500-600 employees
Overall FAANG Totals
Across all five companies over the past 5 years: approximately 81,000-87,000 workers have been laid off, with the vast majority occurring in 2022-2023 during the post-pandemic correction period.

The people who comment as such are either so disconnected from the software development process or so bought in on the hype that they are forgetting what the point of a junior role is in the first place.
If you hire a junior and they're exactly as capable as a junior 3 years later (about as long as this AI wave has lasted), many organizations would consider letting that employee go. The point of hiring a junior is that you get a (relative to the market) cheap investment with a long-term payoff. Within 1-2 years, if they are any good, they will not be very junior any more (depending on domain, of course). There is no such promise or guarantee with AI, and employing an army of junior engineers that can't really "learn" is not a future I want to live in as a mid-career senior-ish person.
Of course, you can say "oh, it'll improve, don't worry", but I live in the present and I simply do not see that. I "employ" a bunch of crappy agents I have to constantly babysit, only to output more work "units" than I could before, at the cost of some quality. If I had spent the money on a junior, I would only have to babysit for the first little while, and then they could be more autonomous. Even if agents can improve beyond this, relying on the moat of "AI" provider companies to make that happen is not exactly comfortable either.
This is only a consideration if you can pay enough to keep the junior for the long term pay off.
Companies that aren't offering Big Tech compensation find it very difficult to compete on this.
The best juniors will get a job paying more than your company can offer in 2 years. The worst juniors will remain stuck at "still haven't progressed beyond what they could do after the first month."
In this situation, unless the company can continue to offer pay increases to match what Big Tech can offer, it is disadvantageous to hire a junior developer.
I think college is useless for the ones out there who already know how to code, collaborate, and do the other things the industry is looking for. Many out there are developing high-level projects on GitHub and other places without having any degree.
Also, most of the stuff you learn in college has absolutely no relation to what you will do in the industry.
Sure I learned lots of stuff I've never used. Like relational algebra. But I also learned lots of stuff I use a lot, and it's unlikely I'd have studied most of that stuff on my own. During my degree I also had time and opportunity to pursue lots of other topics outside the mandated course material, you're not limited to what they force you to learn.
So sure if you have the motivation, discipline and resourcefulness to learn all that stuff on your own go right ahead. Most people aren't even close. Most people are much better off with a degree.
In my experience, those that lack these don't have a chance in tech in the first place, so save yourself a lot of debt.
I don't think one can seriously argue that. This is as much a meme as anything. I know it's popular to rag on devs writing inefficient software, but there are plenty of apps with functions where a user couldn't possibly notice the difference between O(n^2) and O(1). You wouldn't take the time to make everything O(1) for no speedup because someone told you that's what good code is; that's just wasting dev time.
In fact, one of the first things you learn is that O(1) can be slower. Constant time is not good if the constant is big and n is small.
I fixed one where a report took 25 minutes to generate, and after switching out an O(n^2) list lookup for a dict it took less than 5. Still embarrassingly slow, but a lot better.
There are also a lot of cases where it didn't matter when the dev wrote it and they had 400 rows in the db, but 5 years later there are a lot more rows, so now it matters.
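For what it's worth, here's a minimal sketch of the kind of fix being described (hypothetical names and data, not the actual report code): a per-row linear scan over a list is O(n^2) overall, while a dict built once up front makes each lookup O(1) on average.

    def build_report_slow(orders, customers):
        # O(n^2): for every order, scan the entire customers list.
        lines = []
        for order in orders:
            customer = next(c for c in customers if c["id"] == order["customer_id"])
            lines.append(f'{customer["name"]}: {order["total"]}')
        return lines

    def build_report_fast(orders, customers):
        # O(n): index customers by id once; each lookup is then O(1) on average.
        by_id = {c["id"]: c for c in customers}
        lines = []
        for order in orders:
            customer = by_id[order["customer_id"]]
            lines.append(f'{customer["name"]}: {order["total"]}')
        return lines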
Doesn't cost anything to just use a better algorithm. Usually takes exactly the same amount of time, and even if it is marginally slower at small n values who cares? I don't give a shit about saving nanoseconds. I care about the exponential timewaste that happens when you don't consider what happens when the input grows.
For small inputs it doesn't matter what you do. Everything is fast when the input is small. That's why it makes sense to prefer low complexity by default.
Then we blame the other group of students for not going there and picking majors where the jobs aren’t.
We need some kind of apprenticeship program honestly, or AI will solve the thing entirely and let people follow their honest desires and live reasonably in the world.
I always find it hilarious that people treat transformer tech as a public good. Transformers, like any other tech out there, are owned by large tech companies. Short of forcing the few companies who own the top models to abide by your rules, there is no chance OpenAI is going to give itself up to the government. And even if it did, it means nothing if Microsoft/Amazon/Google/etc do not provide you with the facilities to deploy the model.
A much more realistic outcome is that Big Tech will collude with governments to keep a certain autonomy and restrict its use to the elites.
"Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India" https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
Since the workers are hired for cost over quality, they're typically incompetent. Though many have learned to parasitize SME and support staff expertise by asking highly specific questions in an extended sequence. It's a salami-slicing strategy where the majority of the work ends up being performed by those SMEs and support staff while the incompetent workers collect the paychecks and credit. I'm pushing my teams to more aggressively identify and call out this behavior, but it's so systemic that it's an endless battle with every new project coming in the door.
Personal frustrations aside, it's very dangerous from both economic and national security perspectives for India to be building and administering so much of the West's IT infrastructure. Our entire economy depends on it, yet we're voluntarily concentrating that dependency in a single foreign nation. A Pacific conflict alone could sever us from the majority of our IT workforce, regardless of India's intentions.
Companies don't want to pay US salaries, the cost of living in the US is not going down, and engineering talent in India is cheaper: you can hire 2 devs for the cost of 1 US dev. Why would you ever have any US engineering devs?
It won't change organically unless the cost of Indian engineers goes up or the cost of US engineers goes down.
Who has more control over government, the people or the 0.0001%? There is no "US", you are not part of the club.
IMO the best education and credentials come from picking interesting projects you have no idea how to do, then learning everything you need along the way to ship them as open source so potential employers can see your work.
If you can get a degree on a scholarship for free, wonderful, but college should be viewed as more of a hobby or a way to network, rather than a way of obtaining marketable technical skills.
I work in FAANG, none of my colleagues are dropouts.
Many BigTech founders are dropouts, but that’s a separate game altogether.
I know lots of people working at those orgs who brag about how well they get away with doing nothing of value, and we all know these people (but of course not everyone is like that).
No offense, but I do not feel the overwhelming majority of roles at these companies deliver value to humanity beyond shareholders, or are something most people should aspire to in a career; nor do I think most of the skills learned in these orgs are all that useful in the world outside those walls.
Also those same FAANGs are clearly aware of the above at some level and doing mass layoffs, or not replacing people who leave, and those workers are having a really hard time finding a home in the non-FAANG working world where they are expected to be highly motivated generalists.
In fact there are possibly other macro-economic effects at play:
1. The inability to deduct engineering expenses for tax purposes in the year they were spent (see the sketch after this list): "Under the Tax Cuts and Jobs Act (TCJA) from 2017, the law requires companies to amortize (spread out) all domestic R&D expenses, including software development costs, over five years, starting in tax years after December 31, 2021, instead of deducting them immediately. This means if you spend $100,000 on software development in 2023, you can only deduct 1/5th (or $20,000) each year over five years"
2. End of zero-interest rates.
3. Pandemic-era hiring bloat - let's be honest, we hired too many non-technical people; companies are still letting attrition take place (~10%/yr where I am) instead of firing.
4. Strong dollar. My company is moving seats to Canada, Ireland, and India instead of hiring in the US. Getting 1.5-2 engineers in Ireland instead of 1 senior on the US west coast.
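To make item 1 concrete, here's a minimal sketch of the simplified five-year schedule the quote describes (note that the actual Section 174 rules also apply a mid-year convention, so the real first-year deduction is smaller still):

    def amortized_deductions(spend: float, years: int = 5) -> list:
        # Spread the deduction evenly across `years` tax years.
        return [spend / years] * years

    spend = 100_000
    print("Immediate expensing, year-1 deduction:", spend)             # 100000
    print("Amortized deductions by year:", amortized_deductions(spend))
    # -> [20000.0, 20000.0, 20000.0, 20000.0, 20000.0]
    # Year-1 taxable income rises by $80,000 per $100,000 of engineering
    # spend, which makes every engineer look more expensive on the books.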
Otherwise AI is an accelerator to make more money, increase profits and efficiency. Yes it has a high cost, but so does/did Cloud, every SaaS product we've bought/integrated.
AI is sucking up investment and AI hype is making executives stupid. Hundreds of billions of dollars that used to go towards hiring is now going towards data centers. But AI is not doing tech jobs.
These headlines do nothing but increase the hype by pointing towards the wrong cause entirely.
Edit: You cannot square these headlines https://news.ycombinator.com/item?id=46289160
There is almost no reason to delegate the work to a junior anymore, especially low-level grunt work.
People disputing this are either in denial, or lacking the skill set to leverage AI.
One or two more Opus releases from Anthropic and this field is cooked.
Frontend, backend, animations, design, infra, distributed systems engineering, networking.
Imiaslavie (imyaslavie, Russian: Имяславие, lit. 'name-praisingness' or 'name-glorification'), among critics also known as imyabozhie (Russian: Имябожие) or imyabozhnichestvo (Russian: Имябожничество), "deification of the name", and also referred to as onomatodoxy (Greek: ονοματοδοξία), the main position of which was the statement of the indissoluble connection between the name of God as the energy and action of God and God Himself.[1]
Is it possible that your job is simply not that difficult to begin with?
An inexperienced junior engineer delegating all their work to an LLM is an absolute recipe for disaster, both for the coworkers and product. Code reviews take at least 3x as long. They cannot justify their decisions because the decisions aren't theirs. I've seen it first hand.
Really weird.
Now people can just search Stack Overflow quicker for the wrong answer, and even more confidently than ever before.
Enterprise software is a different beast - large fragile [quasi]monoliths; good luck getting [current] AI to make meaningful fixes and/or develop features in it. And even if AI manages to speed up actual development multiple times, the impact would still be small, as actual development takes a relatively small share of overall work in enterprise software. Of course it will come here too, just somewhat later than at places like FAANG.
So in the glorious future, we'll only need senior devs to manage AI. No juniors! Only seniors!
No business cares about that question, just like the Once-ler didn't care how many Truffula trees were left. It's not their problem. Business is business, and business must grow, regardless of crummies in tummies, you know.
(i.e. this cynical complaint is exactly the opposite of the cynical complaint about managers/directors engaging in empire building.)
Since this isn't the 1800s anymore, there won't be any major revolutions, but I expect way more societal violence going forward. If you have no hope for the future, it's not hard to go down very dark paths quickly, usually through no fault of your own, sadly.
Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.
I'm sure they were saying the same thing in the 1800s
what if we all just blame the youth?
I think that might fix the situation
Other than that, I am guessing junior roles will move offshore to supply the body shops where the corporate IT work has been going.
seems to translate to a 6.1% unemployment rate and 16.5% underemployment rate?
https://www.finalroundai.com/blog/computer-science-graduates...
Blame the article for using suboptimal numbers, but the "wiping out" part is definitely justified when talking about jobs for graduates
https://www.newyorkfed.org/research/college-labor-market#--:...
Computer Science is tied for fourth-lowest underemployment and has the 7th-highest unemployment... and also the highest early-career median wage.
That needs to be compared to the underemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... and the unemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... (and make sure to compare that with 2009).
Computer science is not getting wiped out by AI. Entry-level jobs exist, though people may need to reset their expectations (note the median job paying $80k) from getting a $150k job out of college - that was always the exception rather than the average.
There are average jobs out there that people with a "want to be on the coast making $150k" or "must be remote so I don't relocate" mindset are thumbing their noses at.
If you click through to new york fed's website, the unemployment figures are 4.8% for "recent college graduates (aged 22-27)", 2.7% for all college graduates, and 4.0% for all workers. That's elevated, but hardly "wiping out".
https://www.signalfire.com/blog/signalfire-state-of-talent-r...
Starting at 2019 and saying "pre-pandemic levels" might be a bit disingenuous since that was a leap to a boom... and the bust we're seeing now.
https://www.cbre.com/insights/articles/tech-boom-interrupted
At $113B, 2019 was the third-highest year on record for VC deal volume.
2019 had the second-highest volume of “mega rounds” ($100M deals or greater)–mega rounds represented 44% of total annual deal volume.
Revenue grew by an average of 12.2% in 2019 and the total revenues of the tech giants was greater than the GDP of four of the G20 nations.
Yes, tech hiring in 2025 is down from 2019. That's a lot like saying "tech hiring is down from 2000" in 2003.

And while 2019 might have been the third-highest year for investment as of 2020, according to this it's been surpassed in 2021, 2022, and 2024:
https://kpmg.com/xx/en/media/press-releases/2025/01/2024-glo...
So why have graduate hires continued to decline since 2023? It seems funds have been diverted from junior hiring into AI investments.
However, as others have remarked, this might be a case of "AI is not taking your jobs, AI investment is taking your jobs"
Junior hiring might pick up again once the spending spree is over.
It's true that a lot of things which were once junior-contributor tasks are now things I'd rather do myself, but my scarce resource is attention. And humans have a sufficiently large context window and self-agentic behaviour that they're still superior to a machine.
My initial reaction would be that these people, unfortunately, got scammed, and that the scammers-promising-abundant-high-paying-jobs have now found a convenient scapegoat?
AI has done nothing so far to reduce the backlog of junior developer positions from where I can see, but, yeah, that's all in "Europoor" and "EU residency required" territory, so what do I know...
Job openings for graduates are significantly down in at least one developed nation: https://www.theguardian.com/money/2025/jun/25/uk-university-...
Plus, that decline seems specious anyway (as in: just about visible if you only look at the top 5% of the chart), and the UK job market has always been very different from the EU they left behind.
Was AI also responsible for that market? This seems a bit unsupported.
During COVID we were struggling to retain good developers that just couldn't deal with the full-remote situation[1], and afterwards, there was a lull in recent graduates.
Again, this is from a EU perspective.
[1] While others absolutely thrived, and, yeah, we left them alone after the pandemic restrictions ended...
The post-pandemic tech hiring boom was well documented both at the time and retrospectively. Lots of resources on it available with a quick web search.
So, please elaborate?
2. Copilot for Windows Notepad
3. Copilot for Windows 11 Start Menu
Nobody, not even Apple was using the term "Apple Silicon" in 2010.
The first M series Macs shipped November 2020.
I'm bored.
Isn’t the sales pitch that they greatly expand accessibility and reduce the cost of a variety of valuable work? OK, so where’s the output? Where’s the fucking beef? Shit’s looking all-bun at the moment, unless you’re into running scams, astroturfing, spammy blogs, or want to make ELIZA your waifu.
Like the ability for computers to generate images/videos/songs so reliably that we are debating whether it is going to ruin human artists... whether you think that is terrible or good, it would be dumb to say "nothing is happening in tech".
https://www.danshapiro.com/blog/2025/12/i-made-the-xkcd-impo...
The xkcd comic is from 11 years ago (September 2014).
Also what does three prove? Is three supposed to be a benchmark of some kind?
I would wager every year there are dozens, probably hundreds, of novel technologies being successfully commercialized. The rate is exponentially increasing.
New procedural generation methods for designing parking garages.
New manufacturing approaches for fuselage assembly of aircraft.
New cold-rolled steel shaping and folding methods.
New solid state battery assembly methods.
New drug discovery and testing methods.
New mineral refinement processes.
New logistics routing software.
New heat pump designs.
New robotics actuators.
See what I mean?
I would wager we are very far from peak complexity, and as long as complexity keeps increasing there will always be opportunities to do meaningful innovative work.
2. We may be at the peak complexity that our sources of energy will support. (Though the transition to renewables may help with that.)
3. We may be at the peak complexity that humans can stand without too many of them becoming dehumanized by their work. I could see evidence for this one already appearing in society, though I'm not certain that this is the cause.
2. Kardashev? You think we are at peak energy production? Fusion? Do you see energy usage slowing down, speeding up, or staying constant?
3. Is the evidence you're seeing appear in society just evidence you're seeing appear in media? If media is an industry that competes for attention, and the best way to get and keep attention is not telling truth but novel threats + viewpoint validation, could it be that the evidence isn't actually evidence but misinformation? What exactly makes people feel dehumanized? Do you think people felt more or less dehumanized during the great depression and WW2? Do you think the world is more or less complex now than then?
From the points you're making you seem young (maybe early-mid 20s) and I wonder if you feel this way because you're early in your career and haven't experienced what makes work meaningful. In my early career I worked jobs like front-line retail and maintenance. Those jobs were not complex, and they felt dehumanizing. I was not appreciated. The more complex my work has become, the more creative I get to be, the more I'm appreciated for doing it, and the more human I feel. I can't speak for "society" but this has been a strong trend for me. Maybe it's because I work directly for customers and I know the work I do has an impact. Maybe people who are caught up in huge complex companies tossed around doing valueless meaningless work feel dehumanized. That makes sense to me, but I don't think the problem is complexity, I think the problem is getting paid to be performative instead of creating real value for other people. Integrity misalignment. Being paid to act in ways that aren't in alignment with personal values is dehumanizing (literally dissolves our humanity).
I've had meaningful work, and I've enjoyed it. But I'm seeing more and more complexity that doesn't actually add anything, or at least doesn't add enough value to be worth the extra effort to deal with it all. I've seen products get more and more bells and whistles added that fewer and fewer people cared about, even as they made the code more and more complex. I've seen good products with good teams get killed because management didn't think the numbers looked right. (I've seen management mess things up several other ways, too.)
You say "Maybe it's because I work directly for customers and I know the work I do has an impact". And that's real! But see, the more complex things get, the more the work gets fragmented into different specialties, and the (relative) fewer of us work directly with customers, and so the fewer of us get to enjoy that.
Yes many over-rely on LLMs, but new engineers see possibilities we've stopped noticing and ask the questions we've stopped asking. Experience is invaluable, but it can quietly calcify into 'this is just how things are done.'
Bad article. Hope a human didn't write it.
AI is eating the boring tasks juniors used to grind: data cleaning, basic fixes, report drafts. Companies save cash, skip the ramp-up, and wonder why their mid-level pipeline is drying up. Sarcastic bonus: great for margins, sucks for growing actual talent.
Long term though, this forces everyone to level up faster. Juniors who grok AI oversight instead of rote coding will thrive when the real systems engineering kicks in. Short term pain, massive upside if you adapt.
I will include this thread in the https://hackernewsai.com/ newsletter.
It's like if you waited until college to start learning to play piano, and wonder why you can't get a job when you graduate. You need a lot of time at the keyboard (pun intended) to build those skills.
Even agentic computing (i.e. an AI doing anything of its own accord, for tech-savvy users, never mind average users) is new as of this year. I would argue it's still pretty far from widespread. Neither my wife nor my kids, despite my explaining repeatedly, even know what that is, never mind caring.
I'm repeating the mantra from before, and I get that it's not useful. But no, it's not AI wiping out entry-level jobs. It's governments failing to prop up the economy.
On the plus side, this means it can be fixed. However, I very much doubt the current morons in charge are going to ...
I don’t think we’ve seen any net drop in tech jobs on account of LLMs (yet). I actually think spending on projects that use them is countering a drop that was going to happen anyway due to other factors (tightening credit being a huge one; business investment hesitation due to things like batshit crazy and chaotic handling of tariffs; consumer sentiment; etc.)
- very few teams have headcount or are expecting to grow
- the number of interview requests I get has dropped off a cliff
So BigTech is definitely hiring less IMHO.
That said, I am not sure if it's only or even primarily due to replacement by AI. I think there's generally a lot of uncertainty about the future, and the AI investment bubble popping, and hence companies are being extra cautious about costs that repeat (employees) vs costs that can be stopped whenever they want (buying more GPUs).
And in parallel, they are hoping that "agents" will reduce some of the junior hiring need, but this hasn't happened at scale in practice, yet.
I would expect junior SWE hiring to slowly rebound, but likely stabilize at a slower pace than in the pre-layoff years.
I only want to point out that evidence of less hiring is not evidence for AI-anything.
As others have pointed out, here and previously, things like outsourcing to India, or for Europe to Eastern Europe, are also going strong. That's another explanation for fewer jobs "here": they are not gone, they just moved to cheaper places, as has been going on for decades. It just continues unevenly.
https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
> Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India
https://news.microsoft.com/source/asia/2025/12/09/microsoft-...
> Microsoft invests US$17.5 billion in India to drive AI diffusion at population scale
When I went to Japan, it felt like all kinds of people were doing all kinds of jobs many hours into the day, whether managing an arcade, selling tickets at the station, working at a konbini, or whatever small job. Maybe we need to stop giving such lofty ideas to the new generation and stop representing blue-collar jobs as "foreigner" or "failure" jobs.
The U.S. has a national security interest in completely stopping all of it. They don't, because every administration is paid not to.
Regulate tech, ban labor export, ban labor import, protect your countries from the sellout.
It's not a secret that companies do not want to hire Americans. Americans are expensive and demand too many benefits, like fair pay, healthcare, and vacations. They also are (mostly) at-will. H1B solves all these problems. When that doesn't work, there are 400 Infosys-likes available to export that labor cheaply. We have seen this with several industries, the most prominent recent one being auto manufacturing.
All that matters is that the next quarters earnings are more than the last. No one hates the American worker more than Americans. Other countries have far better worker protections than us.
I see no reason H1B couldn't be solved by having a high barrier to entry (a $500k one-time fee) and maintenance ($100k per year), then forcing H1B workers to be paid at the highest bracket in their field. If H1Bs are what their proponents say - necessary for rare talent not found elsewhere - then this fee should be pennies on the value they provide. I also see no reason we can't tax exported labor in a similarly extreme manner. If the labor truly can't be found in America, the high price of that labor in tax and fee terms should be dwarfed by its added value.
If it is not the case that high fees and taxes on H1B and exported labor make sense, then the only conclusion is that the vast majority of H1Bs and exported labor are not "rare talent" and thus aren't necessary. They can come through the normal immigration routes and integrate into the workforce as naturalized Americans.
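Putting rough numbers on the proposal above (a sketch using the comment's own figures; the salary is a made-up placeholder):

    one_time_fee = 500_000        # proposed barrier to entry
    annual_fee = 100_000          # proposed yearly maintenance
    top_bracket_salary = 400_000  # hypothetical "highest bracket" pay
    years = 5

    total = one_time_fee + years * (annual_fee + top_bracket_salary)
    print(f"Total cost over {years} years: ${total:,}")  # $3,000,000
    # If the hire really is rare talent, this should be pennies on the value
    # they provide; if it isn't worth it, they weren't rare talent after all.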