Was having a discussion the other day with someone, and we came to the same conclusion. You used to be able to make yourself useful by doing the easy / annoying tasks that had to be done, but more senior people didn't want to waste time dealing with. In exchange you got on-the-job experience, until you were able to handle more complex tasks and grow your skill set. AI means that those 'easy' tasks can be automated away, so there's less immediate value in hiring a new grad.
I feel the effects of this are going to take a while to be felt (5 years?); mid-level -> senior-level transitions will leave a hole behind that can't be filled internally. It's almost like the aftermath of a war killing off 18-30 year olds leaving a demographic hole, or the effect of covid on education for certain age ranges.
In the past, a junior would write bad code and you'd work with them to make it better. Now I just assume they're taking my feedback and feeding it right back to the LLM. Ends up taking more of my time than if I'd done it myself. The whole mentorship thing breaks down when you're basically collaborating with a model through a proxy.
I think highly motivated juniors who actually want to learn are still valuable. But it's hard to get past "why bother mentoring when I could just use AI directly?"
I don't have answers here. Just thinking maybe we're not seeing the end of software engineering for those of us already in it—but the door might be closing for anyone trying to come up behind us.
This is especially annoying when you get back a response in a PR "Yes, you're right. I have pushed the fixes you suggested."
Part of the challenge (and I don't have an answer either) is there are some juniors who use AI to assist... and some who use it to delegate all of their work to.
It is especially frustrating that the second group doesn't become much more than a proxy for an LLM.
New juniors can progress in software engineering - but they have to take the road of disciplined use of AI and make sure that they're learning the material rather than delegating all their work to it... and that delegating work is very tempting... especially if that's what they did in college.
Hmmm. Is there any way to distinguish between these two categories? Because I agree, if someone is delegating all their work to an LLM or similar tool, cut out the middleman. Same as if someone just copy/pasted from Stackoverflow 5 years ago.
I think it is also important to think about incentives. What incentive does the newer developer have to understand the LLM output? There's the long term incentive, but is there a short term one?
Unfortunately, the use of LLMs has brought about a lot of mistrust in the workplace. Earlier you’d simply assume that a junior making mistakes is simply part of being a junior and can be coached; whereas nowadays said junior may not be willing to take your advice as they see it as sermonizing when an “easy” process to get “acceptable” results exists.
I don't mind if AI is used as a tool, but the output needs to be vetted.
You might be specifically talking about people who copy/paste without understanding, but I think it's still OK-ish to do that, since you can't make an entire [whatever you're coding up] by copy/pasting snippets from SO like you're cutting words out of a magazine for a ransom note. There's still thought involved, so it's more like training wheels that you eventually outgrow as you get more understanding.
It at least forces you to tinker with whatever you copied over.
For software, I can imagine a process where junior developers create a PR and then run through it with another engineer side by side. The short-term incentive would be that they can do it, else they'd get exposed.
That is at least for the people who don't understand what they're doing, the LLM tends to come out with something I can at least turn into something useful.
It might be reversed though for people who know what they're doing. IF they know what they're doing they might theoretically be able to put together some stackoverflow results that make sense, and build something up from that better than what gets generated from LLM (I am not asserting this would happen, and thinking it might be the case)
However I don't know as I've never known anyone who knew what they were doing who also just copy/pasted some stackoverflow or delegated to LLM significantly.
Yes, it should be obvious. At least at the current state of LLMs.
> There's the long term incentive, but is there a short term one?
The short term incentive is keeping their job.
In a previous role I was a principal IC trying to mentor someone who had somehow been promoted up to senior but was still regularly turning in code for review that I wouldn't have expected from an intern— it was an exhausting, mind-numbing process trying to develop some sense of engineering taste in this person, and all of this was before LLMs. This person was definitely not just there for the money; they really looked up to the top-level engineers at our org and aspired to be be there, but everything just came across as extremely shallow, like engineering cosplay: every design review or bit of feedback was soundbites from a how-to-code TED talk or something. Lots of regurgitated phrases about writing code to be "maintainable" or "elegant" but no in-the-bones feeling about what any of that actually meant.
Anyway, I think a person like this is probably maximally susceptible to the fawning ego-strokes that an AI companion delivers alongside its suggestions; I think I ultimately fear that combination more than I fear a straight up mercenary for whom it's a clear transaction of money -> code.
Very odd. It was like he only had ever worked on school projects assigned to him, and had no actual interest in exploring the problems we were working on.
But it can be tricky to evaluate this in the kind of structured, disciplined way that big-company HR departments like to see, where all interviewees get a consistent set of questions and are "scored" on their responses according to a fixed rubric.
This guy claimed to want to get promoted to Senior, but didn't do anything Senior-shaped. If you're going to own a component of a system, I should be able to ask you intelligent questions about how you might evolve it, and you should be able to tell me why someone cares about it.
Just pick the two you like the most.
Everybody else through my 21-year career has almost universally either been helpful or neutral (mostly just busy). If you think code reviews are just for bikeshedding about style minutia, then you're really missing out. I personally have found it extremely rewarding to invest in junior SWEs and see them progress in their careers.
It is not.
It's worth considering how aggressively open the door has been for the last decade. Each new generation of engineers increasingly disappointed me with how much more motivated they were by a big pay check than they were for anything remotely related to engineering. There's nothing wrong with choosing a career for money, but there's also nothing wrong about missing a time when most people chose it because they were interested in it.
However I have noticed a shift: while half the juniors I work with are just churning out AI slop, the other half are really interested in the craft of software engineering and understanding computer science better.
We'll need new senior engineers in a few years, and I suspect they will come from a smaller pool of truly engaged juniors today.
There are still junior engineers out there who have experiments on their githubs, who build weird little things because they can. Those people were the best engineers anyway. The last decade of "money falls from the sky and anyone can learn to code" brought in a bunch of people who were interested in it for the money, and those people were hard to work with anyway. I'd lump the sidehustle "ship 30 projects in 30 days" crowd in here too. I think AI will effectively eliminate junior engineers in the second camp, but absolutely will not those in the first camp. It will certainly make it harder for those junior engineers at the margins between those two extremes.
There's nothing more discouraging than trying to guide a junior engineer who is just typing what you say into cursor. Like clearly you don't want to absorb this, and I can also type stuff into an AI, so why are you here?
The best engineers I've worked with build things because they are truly interested in them, not because they're trying to get rich. This is true of literally all creative pursuits.
If software were "just" a job without any of the gratifying aspects, I wouldn't do nearly as good a job.
I keep hearing this and find it utterly perplexing.
As a junior, desperate to prove that I could hang in this world, I'd comb over my PRs obsessively. I viewed each one as a showcase of my abilities. If a senior had ever pointed at a line of code and asked "what does this do?" If I'd ever answered "I don't know," I would've been mortified.
I don't want to shake my fist at a cloud, but I have to ask genuinely (not rhetorically): do these kids not have any shame at all? Are they not the slightest bit embarrassed to check in a pile of slop? I just want to understand.
I'm approaching 30 years of professional work and still feel this way. I've found some people are like this, and others aren't. Those who aren't tend to not progress as far.
> embarrassed to check in a pile of slop
Part of being a true junior, especially nowadays, is not being able to recognize the differences between a pile of slop from useful and elegant code.AI provides a bar. You need to be at least better than AI at coding to become a professional. It'll take genuine interest in the technology to surpass AI and clear that bar. The next generation of software professionals will be smaller, but unencumbered by incompetents. Their smaller number will be compensated by AI that can take care of the mundane tasks, and with any luck it's capabilities will only increase.
Surely I'm not the only one who's had colleagues with 10+years experience who can't manage to check out a new branch in git? We've been hiring people we shouldn't have hired.
It's clear why people do it (more pay) but it sets up bad incentives for the companies. Why would a company invest money in growing the technical skill set of an employee, just to have them leave as soon as they can get a better offer?
You're falling for the exact same fallacy experienced by failed salesmen. "Why would I bother investing time in this customer when they're just going to take my offer to another dealership for a better deal?"
Answer: you offer a good deal and work with people honestly, because if you don't, you'll never get a customer.
What you say only works if everyone is doing it. But if you're spending resources on juniors and raises, you can easily be outcompeted and outpoached by companies using that saved money to poach your best employees.
I've started viewing developers that have never maintained an existing piece of software for over 3 years with skepticism. Obviously, with allowances for people who have very good reasons to be in that situation (just entered the market, bad luck with employers, etc).
There's a subculture of adulation for developers that "get things done fast" which, more often than not, has meant that they wrote stuff that wasn't well thought out, threw it over the wall, and moved on to their next gig. They always had a knack of moving on before management could connect the dots that all the operational problems were related to the person who originally wrote it and not the very-competent people fixing the thing. Your average manager doesn't seem to have the capability to really understand tech debt and how it impacts ability to deliver over time; and in many cases they'll talk about the "rock star" developer that got away with a glimmer in their eye.
Saw a post of someone on Hacker News the other day talking about how they were creating things faster than n-person teams, and then letting the "normies" (their words not mine) maintain it while moving on to the next thing. Thats exactly the kind of person I'd like to weed out.
Some genius MBA determined that people feel more rewarded by recognition and autonomy than pay, which is actually true. But it means that all the recognition and autonomy in the world won't make you stay if you can make 50% more somewhere else.
Th power structure that makes up a typical owners-vs-employees company demands that every employee be replacable. Denying raises & paying the cost of churn are vital to maintaining this rule. Ignoring this rule often results in e.g. one longer-tenured engineer becoming irreplacable enough to be able to act insubordinately with impunity.
A bit bleak but that's capitalism for you. Unionization, working at a smaller companies, or at employee-owned cooperatives are all alternatives to this dynamic.
But that means there's no need for entry-level glassblowers, and everyone in the field with any significant experience is super old. The pipeline has been dead for a while now.
Not disagreeing that this is happening in the industry but it still feels like a missed opportunity to not hire juniors. Not only do you have the upcoming skill gap as you mention, but someone needs to instruct AI to do these menial/easy tasks. Perhaps it's only my opinion but I think it would be prudent to instead see this as just having junior engineers who can get more menial tasks done, instead of expecting to add it to the senior dev workflow at zero cost to output.
Basically this type of maintenance work for any sufficiently complex codebase. (Over 20k LOC)
When I was an QA intern / Software Dev Intern. I did all of that junk.
That said, you hit on something I've been feeling, the thing these models are best at by far is stuff that wasn't worth doing before.
I've also written a lot of python 2 in my career, and writing python 3 still isn't quite native-level for me - and the AI tools let me make up for my lack of knowledge of modern Python.
Everything turned out fine. Turns out you don't really need to be able to perform long division by hand. Sure, you should still understand the algorithm at some level, esp. if you work in STEM, but otherwise, not so much.
There were losses. I recall my AP physics professors was one of the old school types (retired from industry to teach). He could find the answer to essentially any problem to about 1-2 digits of precision in his head nearly instantly. Sometimes he'd have to reach for his slide rule for harder things or to get a few more digits. Ain't no one that can do that now (for reasonable values of "no one"). And, it is a loss, in that he could catch errors nearly instantly. Good skill to have. A better skill is to be able to set up a problem for finite element analysis, write kernels for operations, find an analytic solution using Mathematica (we don't need to do integrals by hand anymore for the mot part), unleash R to validate your statistics, and so on. The latter are more valuable than the former, and so we willingly pay the cost. Our ability to crank out integrals isn't what it was, but our ability to crank out better jet engines, efficient cars, computer vision models has exploded. Worth the trade off.
Recently watched an Alan Guth interview, and he made a throwaway comment, paraphrased: "I proved X in this book, well, Mathematica proved...". The point being that the proof was multiple pages per step, and while he could keep track of all the sub/superscripts and perform the Einstein sums on all the tensors correctly, why??? I'd rather he use his brain to think up new solutions to problems, not manipulate GR equations by hand.
I'm ignoring AGI/singularity type events, just opining about the current tooling.
Yah, the transition will be bumpy. But we will learn the skills we need for the new tools, and the old skills just won't matter as much. When they do, yah, it'll be a bit more painful, but so what, we gained so much efficiency we can afford the losses.
So, there are two parts to this:
The first is that a lot of those tasks are non-trivial for someone who isn't a digital native (and occasionally trivial for people who are). That is to say that I often found myself doing tasks that my bosses couldn't do in a reasonable time span; they were tasks which they had ALWAYS delegated, which is another way of saying that they were tasks in which proficiency was not necessary at their level.
This leads into the second part, which is that performing these tasks did not help me advance in relevant experience at all. They were not related to higher-level duties, nor did they endear me to the people who could have introduced me to such duties. My seniors had no interest in our growth as workers; anyone who wanted to see that growth had to take it into their own hands, at which point "junior-level" jobs are only worth the paycheck.
I don't know if it's a senior problem generally, or something specific to this cohort of Boomer/Gen-X seniors. Gun-to-my-head, I would wager the latter. They give enough examples in other arenas of public life to lend credence to the notion that that they simply don't care what happens to their juniors, or to their companies after they leave, particularly if there is added hassle in caring. This is an accusation often lobbed at my own generation, to which I say, it's one of the few things our forebears actually did teach us.
Yet again, AI is just a cover for mismanagement.
We had code school grads asking for $110-$130. Meanwhile, I can hire an actual senior engineer for $200 and he/she will be easily 4x as productive and useful, while also not taking a ton of mentorship time.
Since even that $110 costs $140, it's tough to understand how companies aren't taking a bath on $700/day.
Who knows if we'll even need senior devs in 5 years. We'll see what happens. I think the role of software development will change so much those years of technical experience as a senior won't be so relevant but that's just my 5 cents.
While the work seems to take similar amounts of time, I spend drastically less time fixing bugs, bugs that take me days or God forbid weeks, solved in minutes usually, sometimes maybe an hour if its obscure enough. You just have to feed the model enough context, full stack trace, every time.
Man, I wish this was true. I've given the same feedback on a colleague's clearly LLM-generated PRs. Initially I put effort into explaining why I was flagging the issues, now I just tag them with a sadface and my colleague replies "oh, cursor forgot." Clearly he isn't reading the PRs before they make it to me; so long as it's past lint and our test suite he just sends the PR.
I'd worry less if the LLMs weren't prone to modifying the preconditions of the test whenever they fail such that the tests get neutered, rather than correctly resolving the logic issues.
The bad product managers have become 10x worse because they just generate AI garbage to spray at the engineering team. We are now writing AI review process for our user stories to counter the AI generation of the product team. I'd much rather spend my time building things than having AI wars between teams.
Which stands to reason you'll need less of them. I'm really hoping this somehow leads to an explosion of new companies being built and hiring workers , otherwise - not good for us.
Depends on how much demand there would be for somewhat-cheaper software. Human hours taken could well remain the same.
Also depends on whether this approach leads to a whole lot of badly-fucked projects that companies can’t do without and have to hire human teams to fix…
I've found Opus 4.5 as a big upgrade compared to any of the other models. Big step up and the minor issues that were annoying and I needed to watch out for with Sonnet and GPT5.1.
It's to the point where I'm on the side of, if the models are offline or I run out of tokens for the 5 hour window or the week (with what I'm paying now), there's kind of no use of doing work. I can use other models to do planning or some review, but then wait until I'm back with Opus 4.5 to do the code.
It still absolutely requires review from me and planning before writing the code, and this is why there can be some slop that goes by, but it's the same as if you have a junior and they put in weak PRs. Difference is much quicker planning which the models help with, better implementation with basic conventions compared to juniors, and much easier to tell a model to make changes compared to a human.
I guess it depends on the project type, in some cases like you're saying way faster. I definitely recognize I've shaved weeks off a project, and I get really nuanced and Claude just updates and adjusts.
which means either devs will take over architectural roles (which already exist and are filled) or architects will take over dev roles. same goes for testing/QA - these are already positions within the industry in addition to being hats that we sometimes put on out of necessity or personal interest.
This is mostly a good thing provided you have a clear separation between solution exploration and actually shipping software - as the extra work put into productionizing a solution may not be obvious or familiar to someone who can use AI to identify a bugfix candidate, but might not know how we go about doing pre-release verification.
Going to throw out another anecdote here. At a company that a number of my friends work for (a fortune 50), they are currently making record profits that they loudly brag about during employee townhalls. They also are in the process of gutting multiple departments as fast as possible with little regard for the long term consequences. This is not the only company that I know of acting in this way (acting like they're about to go bankrupt when in fact they are seeing record profits).
To me the societal risk is that an entire generation of employees becomes extremely jaded and unmotivated, and fairly so. We used to work under the assumption that if our company is successful, then the employees would be successful. Record profits == raises for all, bonuses for all. And while we know that that connection was never that strong, it was strong enough to let us at least pretend that it was a law of universe.
That fundamental social contract is now at its breaking point for so many workers. Who can really blame people for putting in minimal effort when they have so much evidence that it will not be rewarded?
Today, a CEO can turn in a few quarters of really solid earnings growth, they can earn enough to retire to a life a private jets. Back when CxO pay was lower, the only way to make that kind of bank was to claw your way into the top job and stay there for a decade or more.
The current situation strongly incentivizes short-term thinking.
With today's very high, option-heavy compensation a CEO making long-term investments in the company rather than cutting staff and doing stock buybacks is taking money out of his own pocket.
It's a perverse incentive.
That's all over now; the growth spurt of a young software industry has given way to maturity. We'll be navigating an employment environment much like what the norm is in other technical professions with tougher standards and fiercer competition for good jobs.
But we only care about short term metrics now, so no one cares. They don't even care to develop the tools to understand it. It might as well not exist. Blame the young people and move on.
This entire discussion sounds crazy to me. If you want socialism, vote for socialism. If you want raw unfiltered capitalism, vote for the billionaire. You can't vote for the billionaire and expect safety nets. That's madness.
You are not wrong, but the contract is/was metaphorical. For a long time people were able to make a living for themselves by studying hard (usually STEM) and end up with a career which payed off. That was the invisible "contract". Hell I went to university for things which seem like academic navel gazing, but I still got a good tech job on the other side. That's not the reality for a lot of graduates nowdays who take more practical degrees at masters and phd levels.
Again even if the literal statement is clearly false, it is the sentiment which matters, and this sentiment does not just apply to graduates. I think many just feel like working hard does not work anymore, especially in the face of housing, cost of living, job competition and social media flaunting the wealth of others.
I get the idea from my younger siblings, "Why try if you are already a looser."
Recessions like the GFC, the Dot Bomb, the early 90s, the Asian Financial Crisis, the early 80s, Stagflation, and others show otherwise.
The extended bull run that SWEs had from the early 2010s to 2022 was an outlier, and the whiplash being felt today is comparable to what law and finance grads faced in the 2010s, accounting majors in the 2000s, and Aerospace/MechE majors in the 1990s.
It doesn’t set the legal standard that profits must be maximized which is impossible.
Socialism has a specific meaning, it's not just a label we get to put on behaviors that we - or rather, specifically you in this case - don't like.
Or more to the point, productivity has consistently outpaced pay for most of the US workforce since the mid-1970s. That's ~50 years that companies have been ripping you off. It's only now you notice, because rent/mortgage/school/medical have finally become so much larger than pay.
Well now you get to live through the Great Depression and study it up close.
The alternate way of looking at it is that the 50s to mid 70s era saw a period of unprecedented prosperity and now we are just seeing a reversion to the mean.
http://web.archive.org/web/20200428221848/https://www.nytime...
A social contract is an implicit agreement that everyone more or less accepts without anything being necessarily legally binding.
For example, the courtesy of two weeks notice in the US is a social contract: there’s nothing legally requiring it, but there are _social_ consequences (ie: your reference might be less positive) if you don’t follow it.
Everything that’s kind of in an employee’s favor is not socialism. You don’t have to like the idea of “work hard, help the company do well, get rewarded,” but that isn’t socialism. It’s just a thing you don’t like.
The top 10% of income earners in the US account for 50% of consumer spending. LMK if you think that's part of the contract. https://www.marketplace.org/story/2025/02/24/higher-income-a...
I also think though that individual experiences of this kind are more about specific companies maturing than a widespread culture shift. A lot of people on these forums worked in tech companies that are relatively young and have changed a lot over the past two decades.
The boom-bust recession cycle is roughly every 10 years. You can't say that AI is impacting hiring when your data just looks like the typical 10 year cycle. Your data needs to go back further.
That being said, what's more likely going on:
1: There are always periods where it's hard for recent college grads to get jobs. I graduated into one. Ignoring AI, how different is it now from 10, 20, and 30 years ago?
2: There are a lot of recent college grads who, to be quite frank, don't work out and end up leaving the field. (Many comments in this thread point out how many junior developers just shouldn't be hired.) Perhaps we're just seeing many companies realize it's easier to be stricter about who they hire?
When I was starting, you were checked for potential as a trainee. In my case, options trading. They checked over that you could do some mental arithmetic, and that you had a superficial idea of what trading was about. Along with a degree from a fancy university, that was all that was needed. I didn't know much about coding, and I didn't know much about stochastic differential equations.
A couple of weeks ago, a young guy contacted me about his interview with an options trading firm. This guy had spent half a year learning every stat/prob trick question ever. All those game theory questions about monks with stickers on their foreheads, all the questions about which card do you need to turn over, the lot. The guy could code, and had learned a bunch of ML to go with it. He prepared for their trading game with some really great questions to me about bet sizing.
I was convinced he was simply overly nervous about his prospects, because I'd never met someone so well prepared.
Didn't get the job.
Now I can assure you, he could have done the job. But apparently, firms want to hire people who are nearly fully developed on their own dime.
When they get their analyst class, I guess there is going to be nobody who can't write async python. Everyone will know how to train an ML on a massive dataset, everyone will already know how to cut latency in the system.
All things that I managed to learn while being paid.
You gotta ask yourself whether we really want a society where people have to already know the job before they get their first job. Where everyone is like a doctor: already decided at age 16 that this was the path they wanted to follow, choosing classes towards that goal, and sticking with it until well into adulthood. And they have to essentially pay to get this job, because it comes at at cost of exploring other things (as well as actual money to live).
If you attend a well-known college that bigco's hire from frequently, there's a lot of knowledge floating around about interview prep, hiring schedules, which companies pay the best, etc. Clubs host "interview prep workshops" where they'd teach the subject matter of interviews, host events(hackathons, case competitions, etc.) to help you bolster your resume for applying to these bigco's. So just by attending a better/fancier school, you'd have pretty decent odds of eventually getting a job at one of these prestigious places.
If you were to attend a less prestigious school, regardless of your aptitude or capability, the information asymmetry is so bad that you'll never learn of the prerequisites for even being considered for some of these roles. Not many upperclassmen will have interned at fancy employers, so they won't be there to help you drill dynamic programming/black-scholes/lbo models, and won't tell you that you need to have your applications prepped by a certain date, and won't tell you that you should be working on side projects/clubs, etc.
I suppose that the apprenticeship model biases towards people that already have connections, so perhaps inequality was already bad, whereas now we just have an information asymmetry that's more easily solvable.
With the way higher-ed works in the US, and the way certain schools opportunity hoard to an insane degree, that is effectively already the case for whole industries and has been so for decades at this point. It's practically an open secret that getting into some schools is the golden ticket rather than the grades you earn while there. Many top schools are just networking and finishing schools for whole "elite" industries.
A smaller size company, perhaps in a lower COL city, might have a more "human" side to them, simply because they can't afford all the nonsense.
You don't need a fancy school to get into a top firm anymore. You have to master the hell out of the interview.
I'm sure that's true in some areas, but our last hire I was shocked at the ridiculous lengths the applications would go to to avoid putting in even a minimum effort to apply for the job. Like the Van Halen brown M&M test, we put a line in the middle of the job advert saying "If you've read this, put your favorite color in at the top of your job application message. We had low double digits % of people who would do that.
Honestly, on our next hiring round, I think I'm going to make people fill out a google form to apply, and have any of our job posts say "Apply at <URL>" and completely ignoring any apps we get through Indeed or the like. We had a team of 3 people reviewing applications for an hour or two a day for a month and most of the responses were just human slop.
ChatGPT was pretty useless when it first released. It was neat that you could talk to it but I don't think it actually became a tool you could depend on (and even then, in a very limited way) until sometime in 2024.
Basically:
- the junior hiring slowdown started in 2022.
- but LLM's have only really been useful in a work context starting around 2024.
As for this point:
> According to very recent research from Stanford’s Digital Economy Lab, published in August of this year, companies that adopt AI at higher rates are hiring juniors 13% less
The same point stands. The junior hiring slowdown existed before the AI spend.
The AI wave didn't start yet. Will hit in 26/27
Default "people have value because human attention solves problems", has become default "existing org structure has value because existing revenue streams are stable."
The idea of a company used to contain an implied optimism. "If we get capable people together, we can accomplish great things!" Now that optimism has been offloaded to the individual, to prove their worth before they can take part.
Let's say you hire your great new engineer. Ok, great! Now their value is going to escalate RAPIDLY over the next 2-3 years. And by rapidly, it could be 50-100%. Because someone else will pay that to NOT train a person fresh out of college!
What company hands out raises aggressively enough to stay ahead of that truth? None of them, maybe a MANGA or some other thing. But most don't.
So, managers figure out fresh out of college == training employees for other people, so why bother? The company may not even break even!
That is the REAL catch 22. Not AI. It is how the value of people changes early in their career.
Sadly this is not as common as it should be - but I've also mentored folks at FAANGs who got promoted after 1y at the new-hire level because they were so clearly excelling. The first promotion is usually not very hard to attain if you're in the top quartile.
I have a friend of a friend in his mid 20s who finished a masters degree in data science focused on AI. There isnt a job for him and I think hes given up.
In Letters to a Young Poet Rilke responded to a young aspiring poet who asked how a person knows whether the artistic path is truly their calling:
> “There is only one thing you should do. Go into yourself. Find out the reason that commands you to write; see whether it has spread its roots into the very depths of your heart; confess to yourself whether you would have to die if you were forbidden to write. This most of all: ask yourself in the most silent hour of your night: must I write? Dig into yourself for a deep answer. And if this answer rings out in assent, if you meet this solemn question with a strong, simple "I must," then build your life in accordance with this necessity; your whole life, even into its humblest and most indifferent hour, must become a sign and witness to this impulse.”
How do I respond to this friend of a friend? Is data science or coding in general the path for you only if you would rather die than stop merging pull requests into main every day even when nobody is paying you?
Is coding the new poetry?
What do I tell this guy?
And no, coding is not the new poetry. I wish people would stop spamming this website with doomer nonsense like this.
The other place you will meet struggling artists is sports. Train several times a week, neglect your social life, your studies, just learn how to chase after a ball.
Only people who are crazy driven will actually do this. The ones who don't make it, they try to climb up from lower league clubs. They go on and on, carving out a career.
But most kids do not have a burning passion for anything. They are curious, they're smart, they want to explore the world. But they haven't found a calling. If they try to go through the eye of the needle, they find it's quite hard, because those paths are taken by guys with a mental lock on a certain career.
What to tell the guy? He's picked the subject that is the most useful for learning about the world. Go around and look at things. There's so much that a person who can code and can deal with statistics can apply himself do.
1. The industry cannot define the terms junior or senior.
2. Most seniors today are the prior generation’s juniors with almost no increase of capabilities, just more years on a resume.
The article asks about what happens when today’s seniors retire in the future. I would argue we are at that critical juncture now.
And although it hasn't discouraged me, I have to admit that I've been burned by juniors when caught in the middle between them and senior leadership on output expectations or strategy because frankly it's much more challenging to mentor how to navigate company politics than it is to mentor professional coding acumen. I want to be humble here. I don't think that's the junior's fault.
It feels like these problems go a lot deeper than AI. Most shops want software teams that are either silently embedded black boxes that you insert rough instructions into and get working software as output or an outsourced team. We've all experienced this. It seems silly to deny that it's directly related to why it's so hard to mentor or hire juniors.
I think you succeeded overall at your goal! Thanks for replying. You encouraged me to go back and read your article more closely.
It is insane how much screwed over we are. I am about to turn 30 soon with 5 YoE, PhD in ML which supposedly is the cutting edge stuff. Yet I have no prospects to even buy a tiny flat and start “normal life”. AI eats its own tail, I have no idea what I should do and what to learn to have any sensible prospects in life.
This is kind of like saying “Get your flight hours in on Microsoft Flight Simulator and then Delta Airlines will hire you.”
All unpaid and in your spare time between your two minimum wage jobs, of course.
Its a double edged sword too. I see it in my biz -- its easier to spend 40 hours training a model how to do things the way we like rather than hire someone junior and spend a month+ on onboarding. We are noticing hitting a wall to a certain point with clients still wanting to talk to a real person, but I can see that changing in the next ~5 years. Zero idea what happens to those junior folks that used to get trained (me being one that sat through a 3mo onboarding program!).
There is a fair bit of anecdotal evidence that junior hiring--at least in the software space--is fairly difficult currently. Via internships at good schools etc. may be better but I have to believe that off the street from bootcamps and the like is pretty tough.
lots of "seniors" via title inflation dont have fundamentals anyways - hence a lot of broken software in the wild & also perverse incentives like Resume driven development. A.I is built on badly written open source code.
because once you have the fundamentals, built a few things - you would've battle scars which makes someone a senior
not the 'senior' we see in big corps or places cosplaying where promos are based on playing politics.
I am an older gen-z and launching my career has felt nigh on impossible. At my first job, the allergy toward mentorship this article mentions was incredibly palpable. None of my several managers had management experience, and one of them openly told me they didn't want to be managing me. The one annual review I got was from someone who worked alongside me for a week.
Follow that experience up with a layoff and a literally futile job search, and its hard to be optimistic about building much of a career.
https://metr.org/blog/2025-03-19-measuring-ai-ability-to-com...
And then we have others claiming that AI is already having such a significant impact on hiring that the effects are clearly visible in the statistics.
AI companies could never make any money (statement about the future, and about AI companies, and finances). And AI could be having a visible effect on hiring today (statement about now, and about non-AI companies, and about employment).
They don't have to both be true, but they do not inherently contradict each other.
This is because "management" includes a bunch of BS that few engineers want to actually deal with. Performance discussions, 1:1s, being hauled into mandatory upper-level meetings, not actually building things anymore, etc. If it was simply pairing with juniors from time to time to hack on things and show them cool stuff, it would be wonderful.
> The most common answer from students when asked what they needed was a mentor who had just been in their shoes a few years ago, a surprising and heartening answer.
Mentoring is difficult; especially in today's world, where we are taught to despise older folks, and encouraged to treat everyone that we work with, as competitors.
For myself, I'm happily retired from the Rodent Rally, and find that LLMs have been a huge help, when learning new stuff.
This kid would not accept seniority, would constantly and publicly try to divert from the stack we worked with, he would not take any input on his work without actively fighting the process and will crowd the conversation at team meetings with never-ending Reddit-tier takes that contributed to nothing other than fill his ego.
In the end I managed to convince my boss to get him out, and he now works in Cyber, which will probably cause even more damage in the long run, but at least I can now say "not my problem".
You should have stopped to think about why such a person was hired in the first place, while there are an endless supply of very talented, hard working, and honest young people who would never be given a chance at all.
But if I guess right, hiring is not seen as the responsibility of your company. And that's the core of the problem.
What world is this? This not match my experiences at all. Is this a common sentiment among your peers?
The people who will give you credit where it's due and lift you in my experience are more rare than not, and almost always an older member, which perhaps is because they don't feel the need to prove themselves as much anymore.
Despising older folks has been a thing a long time, made famous by Zuck starting out. Now that he's older, I wonder if he still feels the same way...
and before that is was hippies with "Don't trust anyone over 30" which became deeply ingrained in at least American culture.
The difference, this time, is the CEO is now a younger person, when they used to always be someone in at least their forties (more often fifties or sixties).
9 times out of 10 it goes the other way around. Most young people have only had very negative interactions with their seniors, which has been wholly on the part of the senior. The current young generation is very respectful towards older people.
This has not been my experience.
I worked for a company that prized seniority, and I regularly dealt with folks older than me, more experienced than me, more capable than me, and willing to help me out. I worked there for almost 27 years, and it was awesome.
In my experience, I'm usually written off as an "OK Boomer," before I've even had a chance to open my mouth to prove it (or not).
My fave, is when we have a really promising text-only relationship, then, the minute they see me, it goes south.
That opportunity is now lost. In a few years we will lack senior engineers because right now we lack junior engineers.
All is not lost however. Some companies are hiring junior engineers and giving them AI, and telling them to learn how to use AI to do their job. These will be our seniors of the future.
But my bigger concern is that every year the AI models become more capable, so as the "lost ladder" moves up, the AI models will keep filling in the gaps, until they can do the work of a Senior supervised by a Staff, then the work of a Staff supervised by a Principal, and so on.
The good news is that this is a good antidote to the other problem in our industry -- a lot of people got into software engineering for the money in the last few decades, not for the joy of programming. These are the folks that will be replaced first, leaving only those who truly love solving the hardest problems.
Single-Payer health care would help our industry immensely if it came to pass.
Imagine having no fear any more.
It actually might help.
This is the model used in Eastern Europe and India - the vast majority of new grads are hired by mass recruiters like EPAM, WITCH, Deloitte, and Accenture at low base salaries but also the expectation that they self train and learn how to become productive SWEs, or they just stagnate at the low rungs. Japan, Korea, and China use a similar model as well.
But honestly, even FTE isn't much of a headache if I can hire a junior SWE for $60k-80k, invest in training them, and then bumping salaries to market rate after they have matured. This is what a number of traditional F500s like Danaher [0], AbbVie [1], and Capital One [2] do via Leadership and Trainee Development Programs, and honestly, it's much easier to make a case to hire someone if they have a couple of years of real world work experience.
[0] - https://jobsblog.danaher.com/blog/leadership-development-pro...
[1] - https://www.abbvie.com/join-us/student-programs.html
[2] - https://www.capitalonecareers.com/get-ahead-with-early-caree...
Has anyone ever seen a manager mentoring ICs? I haven't. This is a senior/staff/principal responsibility.
We have an intern that is finishing a four year computer science degree that has no clue what git is, never used a log and all he presents is AI garbage.
I find it profoundly depressing to try and teach someone who has no interest in the craft.
80% of the candidate I interview pass (leetcode style coding interview, as mandated by the company). This is actually annoying because I'll probably have to raise the bar and start rejecting very good candidates.
I'm sorry but to me this part reads like a humorous phrase that's popular in some circles in my region which goes:
"Maybe <list of negative things, usually correct characterizations of the speaker>, but at least <something even worse>"
The companies I worked for used automated coding quizzes like Codility to weed out the worst applicants, but I suspect you're already doing that.
How is them knowing when binary search is useful relevant to what they'll be doing at work should they get hired?
Because of our work is changing, faster than ever, not day to day but over time. You need a foundation to handle that change. My 2X years experience showed me that the people who has strong foundation handle the transition well. If I'm going to hire and invest and mentor, I want that person to be successful.
If I were to graduate today, I'd be royally screwed.
But looking back on my 30 years of working (including in high school), every job I've ever had I got through personal referrals or recruiter reach-outs. I've gotten to interviews before but never actually taken a job without a personal connection.
Will say what's gotten me hired are my projects eg. robotics or getting published online for hardware stuff, I work in the web-cloud space primarily though, hardware would be cool but hard to make that jump
I feel that too. I am a self-taught dev. Got a degree, but not in CS. I don't know if I could get hired today.
Not sure how to fix it; feels like the entire industry is eating the seed corn.
Wages for your typical engineer stopped going up 5+ years ago. The joke of senior FAANG engineers making $400k has been a meme for over 5 years. Yet, inflation has done over 20% in 5 years? Look at new offers for people joining the majority of positions available at public tech companies. You're not seeing $500k offers regularly. Maybe at Jane Street or Anthropic or some other companies that are barely hiring - all of which barely employ anyone compared to FAANG. You're mostly seeing the same $350-400k/yr meme.
The reason we're not employing new grads is the same reason as the standards getting much more aggressive. Oversupply and senior talent has always been valued more.
Not true for Western Europe. Getting more than 60k euros yearly as a software engineer was hard in 2019, it's now basically impossible to get less than that.
They don't have to hire in any given country.
Given the current state of affairs in the US, I'd be moving the balance elsewhere too.
I started in tech in the late 70s. I can say this break happened during the Reagan Years with a bit of help from the Nixon Years.
It is also something which is likely to be quite harmful, since it selects for people who are great at networking over people who have good technical skills. Obviously interpersonal communication is important, but how well a 20 year old in University performs at it should not doom or make their career.
And even people with bad social skills deserve to exist and should be allowed into their chosen career. Being someone who does good work and is respectful, but not overly social, should be good enough.
I have been unable to get a tech job for months so I’ve looked into retraining in a new field and every single one has some up front large cost via either paying for schooling or situations like mechanics needing to bring their own tools.
The standard US company has completely shed all training costs and put the expectations on laborers to train themselves. And you’re shit out of luck if their requirements change during your training as so many college graduates who picked comp sci are currently learning
The economics of providing every new grad a $150k TC offer just doesn't work in a world with the dual pressures of AI and async induced offshoring.
Heck, once you factor in YoE, salaries and TCs outside the new grad range have largely risen because having experienced developers really does matter and provides positive business outcomes.
State and local governments needs to play the same white collar subsidy game that the rest of the world is playing in order to help fix the economics of junior hiring for white collar roles. This is why Hollywood shifted to the UK, VFX shifted to Vancouver, Pharma shifted to Switzerland, and Software to India.
It was always a weird US thing driven by huge companies and VCs. In other western, developed countries ~$50k equivalent would be normal. Even adjusting for other provided social benefits, there's still a long way down...
Building a GCC ends up costing around $60k-$100k per head in operating costs without subsidizes, and deploying vibe coding tools to fully replace an entire dev team end up in a similar price range (but conversely they could arguably enhance productivity for new grads and hires eg. Glean Search).
If that were to actually happen, we'd wind up excluding many of our greatest technical performers while drowning in a sea of would-be middle managers. People skills matter, but so do many other strengths that don't always overlap with being naturally good at navigating interpersonal dynamics.
But some of the best "people" people that I've seen in my career have been the most technical, also. They were really good at being able to communicate the value of their solution, the problems it solves, and risks and rewards. They could get buy-in from stakeholders and other teams. They could listen empathetically when faced with issues and blockers. And they did so with authenticity and genuine care because they were passionate about software engineering.
I believe those are skills that can be learned and practiced and that you don't have to be necessarily "social" to grow in that area.
They forgot to add in "Aging billionaires spend a trillion dollars on longevity research" which results in "110 year old Senior engineers still working"
apologise for inflicting this era on them and teach them to be entrepreneurial, teach them how to build, teach them rust on the backend, teach them postgres, teach them about assets maintaining value while money loses its
tell them to never under any circumstances take on a mortgage, especially not the 50 year variety. tell them to stay at home for as long as possible and save as much as possible and put it into assets: gold, silver, bitcoin, monero
they must escape the permanent underclass, nothing else matters
That is some hard stereotyping being generalised on a platform with worldwide reach. You may wish to rethink what led you to that statement.
Despite everything, I like it that humanity exists. I want humanity to continue to exist. I reject any notion or attitude that would, taken to its logical conclusion, result in the extinction of humanity. And, even more so, that would result in the extinction of my family and lineage. For your sake, I hope that this is just edgy horseshit that you will soon grow out of.
The continued reliance on say, COBOL, and the complete lack of those developers comes to mind.
Even before LLMs, there were periods recently where multiple companies had "senior only" hiring policies. That just inflated what "senior" was until it was basically 5 years of experience.
This time seems a bit different, however. There are both supply and demand side problems. The supply of students it tainted with AI "learning" now. Colleges haven't realized that they absolutely have to effectively crack down on AI, or the signal of their degrees will wither to nothing. The demand side is also low, of course, since the candidates aren't good, and AI seems to be a good substitute for a newly graduated hire, especially if that hire is just going to use the AI badly.
So the irony here is that LLMs are actually going to be decent at COBOL by default. And other uncommon/esoteric codebases. For example I vibe-ported some Apple ii assembly to modern C/SDL and... it works. It's stuff that I just wouldn't even attempt at manual development speed. It may be actually an easier path than training someone to do things, as long as you have a large enough test suite or detailed enough requirements.
There is an unbounded amount of opportunity available for those who want to grab hold of it.
If you want to rely on school and get the approval of the corporate machine, you are subject to the whims of their circumstance.
Or, you can go home, put in the work, learn the tech, become the expert, and punch your own ticket. The information is freely available. Your time is your own.
Put. In. The. Work.
Because those senior people will NOT be around forever. And they have killed their talent development and knowledge transfer pipelines.
Either direction you take it, this feels like a lose-lose situation for everyone.
People don't think in terms of shared commons and that if all companies are doing the same thing then there won't be much of a "senior" market left to hire.
We're not hiring a lot of rotary phone makers these days.
Who is hiring their own shoe-smith? It's been 30-ish years since my carpenter father last had work boots resoled.
It's almost as if... technology and economy evolve over time.
For all the arguments software people make about freedom to use their property as they see fit, they ignore non-programmers use of personal technology property is coupled to the opinions of programmers. Programmers ignore how they are middlemen of a sort they often deride as taking away the programmer's freedom! A very hypocritical group, them programmers.
What's so high tech about configuration of machines with lexical constructs as was the norm 60+ years ago? Seems a bit old fashioned.
Programmers are biology and biology has a tendency to be nostalgic, clingy, and self selecting. Which is all programmers are engaged in when they complain others won't need their skills.
Firstly, we've been here before, specifically in 2008. This was the real impact of the GFC. The junior hiring pipeline got decimated in many industries and never returned. This has created problems for an entire generation (ie the millenials) who went to college and accumulated massive amounts of debt for careers that never eventuated. Many of those careers existed before 2008.
The long-term consequences of this are still playing out. It's delaying life milestones like finding a partner, buying a house, having a family and generally just having security of any kind.
Secondly, there is a whole host of other industries this has affected that the author couldn't pointed to. The most obvious is the entertainment industry.
You may have asked "why do we need to wait 3 years between seasons of 8 episodes now when we used to put out 22 episodes a year?" It's a good question and the answer is this exact same kind of cost-cutting. Writers rooms got smaller and typically now the entire season is written and then it's produced when the writers are no longer there with the exception of the showrunner, who is the head writer.
So writers are rarely on set now. This was the training ground for future showrunners. Also, writers were employed for 9 months or more for the 22 episode run and now they're employed for maybe 3 months so need multiple jobs a year. Getting jobs in this industry is hard and time-consuming and the timing just may not work out.
Plus the real cost of streaming is how it destroyed residuals because Netflix (etc) are paying far fewer residuals (because they're showing their own origianl content) and those residuals sustained workers in the entertainment industry so they could have long-term careers and that experience wouldn't be lost. The LA entertainmen tindustry is in a dire state for these reasons and also because a lot of it is being offshored to further reduce costs.
Bear in mind that the old system produced cultural touchstones and absolute cash cows eg Seinfeld, Friends, ER.
Circling back, the entire goal of AI Is to displace workers and cut costs. That's it. It's no more compolicated than that. And yes, junior workers and less-skilled workers will suffer first and the most. But those junior engineers would otherwise be future senior engineers.
What I would like for people to understand that all of this is about short-term decisions to cut costs. It's no more complicated than that.
For example, the death of optical media has had a massive impact on the entertainment industry, particularly movies. Matt Damon has spoken about this, on Hot Wings of all places [1].
Streaming began as a alternate path for monetizing old content other than cable TV syndication. And it was excelelnt for this in the early years. At that time it was bonus income.
But streaming also ushered in a golden age for watching serialized content so it's a mixed bag.
Loss of writers is just one factor. Filming fewer episodes, moving production out of the US and loss os residuals all contribute to killing this ecosystem.
Furthermore, this is why the humanities matter: because human relationships matter.
where do you network? what do you network with these other humans on?
I do think I could get a job from my network because I’ve worked in the industry for years and done good work; I’m a little skeptical of advice to network to junior/new grads. I at least ignore those LinkedIn requests
Unfortunately, if you network to get a job, you're already months behind.
As I talk to college kids, I try to get them to find opportunities to network while they're in school, before they're desperate to get that first internship or job. They want to come at their search from a place of confidence, not anxiety.
There are so many meetups at universities (at least at the one near me) that they can mingle with the working world, and they stand out because they're there when it's mostly professionals.
Student or not, networking works best in-person when possible (conferences, meetups, professional events) where you get to know people and get truly curious about them. But after that, it involves following up and keeping the relationships warm, showing that you are interested in people professionally and can possibly help them with their problems, and that's no trivial investment.
If you do that enough, then you will build trust and rapport to create some opportunities, but it's admittedly a long game. It also has to be genuine or else people end up feeling used.
I think that there is a blocker that a lot of people have against networking in general because it feels gross and insincere. We've all seen people do it poorly, and so we avoid it, but it can be really fulfilling if done well.
I have had so many people reach out to me out of the blue when they're looking for job, after literally leaving me on read in LinkedIn DMs. And giving them the benefit of the doubt, I meet with them and try to help them out, and then I never hear from them again after they find a job. It doesn't feel great, which is why I always suggest being intentional about nurturing your close professional relationships. It doesn't have to be anything grand; just being kind and courteous goes a long way.
This is terrible advice. Apply, cold call, create projects, job fairs, get co-op opportunties and ambush are better ways. Hackathons, github projects or small businesses can help. 9/10 CEOs will ignore your cold outreach but some won't.
Getting too busy making friends at the Greek houses will land you a marketing role if you are lucky. People need to associate you wish your craft. If they know you as a social guy you will get social roles. Any developer too social is suspect for many and ends up at best a pm.
When I was coming up people went into hardware/certifications to bridge the gap but moving from hardware to software was a gap too big for many as they became typecast.
However, it takes time.
If you need a job right now, it won't happen via ordinary networking, by which I mean networking with people whose job isn't recruitment.
If you think of networking as a pleasant way to keep some interesting ideas flowing and making some friends, circulation will get you things that you never even thought of.
(The best professional recruiters actually stir the pot for years and years before getting a return. Constantly keeping up with what various people are doing, just in case the time is right for someone to move on.)
I'm actually a bit surprised, because as a young guy I didn't do any networking beyond connecting with colleagues, which certainly helped. But I'm finding lots of young guys will reach out to me for advice. It's a good habit, but one I suspect more than half the population doesn't practice.
New grads (myself included, back then), tend to discount Tier 2, because in their head the hiring process is looking for the single applicant with the best technical skills. When in reality, it's a lot more of a "who can we get quickly, who won't have a negative impact on team output or morale". Parents, Parent's friends, friends, and friend's parents all can fall into Tier 2, and absolutely should asked about whether their workplaces are hiring, and if so, if they could provide a recommendation.
Tier 3 is mostly useful for finding out about positions that don't necessarily get publicized, but depending on mutual connection to the shared acquaintance, might be willing to offer a recommendation.
With regards to where to network, that comes down to engaging with social gatherings that bring together a spread of people that aren't exclusively your direct peers. That's the stumbling block a lot of new grads find themselves in, which is that all their social time is spent with other new grads (or worse still, nobody at all). Clubs, parties thrown by friends' parents, university alumni events, hell, join the Oddfellows (YMMV, some lodges stopped recruiting after Vietnam). Conferences, whether technical or not. Hell, a step I recommend for everyone is going to bars and talking to strangers. Not highest density networking opportunity (except some gay bars in SF), but it's a pretty good environment to practice casual communication with people you have approximately nothing in common with, with very low stakes.
- go to events/conventions/join clubs related to programming (need to be located near a large city for this)
- talk to other students/self-learners and wait for them to get to the next step
I’ve been unemployed a long time and have been thinking of improving at networking. These are what I came up with.
If you're a senior, maintain relations with last year's graduating class (and with your placement services people).
If you get an internship, keep in touch with people there.
If you are a new grad: go to alumni events. Go to alumni events! GO TO ALUMNI EVENTS.
If you are still in school: talk to your alumni and career office; they will be able to connect you better.
If you are in High School: consider a university with a co-op program.
The value of fact-to-face connection should not be underestimated.
Again: this may be uncomfortable for some people, but it is the way of the world.
new grads will be fed to the meat grinder with no regards, its a closed shop unless you know someone
Good luck with causation/correlation vs the rise of LLM.
The article is self-serving in identifying the solutions ("do things related to the service we offer, and if that doesn't work, buy our service to help you do them better"), but it is a subject worth talking about, so I will offer my refutation of their analysis and solution.
The first point I'd like to make is that while the hiring market is shrinking, I believe it was long overdue and that the root cause is not "LLMs are takin' our jerbs", but rather the fact that for probably the better part of two decades, the software development field has been plagued by especially unproductive workers. There are a great deal of college graduates who entered the field because they were promised it was the easiest path to a highly lucrative career, who never once wrote a line of code outside of their coursework, who then entered a workforce that values credentialism over merit, who then dragged their teams down by knowing virtually nothing about programming. Productive software engineers are typically compensated within a range of at most a few hundred thousand dollars, but productive software engineers generally create millions in value for their companies, leading to a lot of excess income, some of which can be wasted on inefficient hiring practices without being felt. This was bound for a correction eventually, and LLMs just happened to be the excuse needed for layoffs and reduced hiring of unproductive employees[1].
Therefore, I believe the premise that you need to focus entirely on doing things an LLM can't -- networking with humans -- is deeply faulty. This implies that it is no longer possible to compete with LLMs on engineering merit, and I could not possibly disagree more. Rather than following their path forward, which emphasises only networking, my actual suggestion to prospective junior engineers is: build things. Gain experience on your own. Make a portfolio that will wow someone. Programming is a field that doesn't require apprenticeship. There is not a single other discipline that has as much learning material available as software development, and you can learn by doing, seeing the pain points that crop up in your own code and then finding solutions for them.
Yes, this entails programming as a hobby, doing countless hours of unpaid programming for neither school nor job. If you can't do that much, you will never develop the skills to be a genuinely good programmer -- that applied just as much before this supposed crisis, because the kind of junior engineer who never codes on their own time was not being given the mentorship to turn into a good engineer, but rather was given the guidance to turn them into a gear that was minimally useful and only capable of following rote instructions, often poorly. It is true that the path of the career-only programmer who goes through life without spending their own time doing coding is being closed off. But it was never sustainable anyways. If you don't love programming for its own sake, this field is not likely to reward you going forward. University courses do not teach nearly effectively enough to make even a hireable junior engineer, so you must take your education into your own hands.
[1] Of course, layoff processes are often handled just as incompetently as hiring processes, leading to some productive engineers getting in the crossfire of decisions that should mostly hurt unproductive engineers. I'm sympathetic to people who have struggled with this, but I do believe productive engineers still have a huge edge over unproductive engineers and are highly likely to find success despite the flaws in human resource management.
Thanks for giving it some thought and for your perspectives, they really help.
I have been seeing an uptick of articles on HN where someone identifies a problem, then amps it up a bit more and then tells you that they are the right ones to solve it for a fee.
These things should not be taken seriously and upvoted.
It's just an app, not a service, that my husband and I built (and quit our jobs for) that has a generous free trial. (Technically, right now it's completely free because it's in early access, so if you never upgrade, you could use it for free forever.)
The CTA at the end was just in an effort to talk to more people (for free) and see how we can help and make our software better. I come from the DevOps world, and they always say you have to first know how to do something really well manually before you can automate it, and that's what we're trying to do by talking to people (for free).
So, from a individual's perspective, figuring out how to meet people who will help you sidestep the "unwashed masses" pile of applications is probably the next most important thing after technical competence (and yeah, ranking above technical excellence).
That's exactly what the portfolio is for. Having an actual body of work people can look at and within a couple of minutes of looking think "wow, this person will definitely be able to contribute something valuable to our project" will immediately set you apart from every applicant who has vague, unreliable credentials that are only extremely loosely correlated with competence, like university trivia. You do need to get as far as a human looking at your portfolio, which isn't a guarantee on any given application, but once you get that far your odds will skyrocket next to University Graduate #130128154 who may have happened to get human eyes on their application but has nothing else to set them apart.
The general population is being rapidly sacked as a 'necessary' expense of criminal elites.
No one should be happy about this.
It wasn't too long ago that it was common to read threads on HN and other tech fora about universities graduating software engineers seriously lacking coding skills. This was evidenced by often-torturous interview processes that would herd dozens to hundreds of applicants through filters to, among other things, rank them based on their ability to, well, understand and write software.
This process is inefficient, slow and expensive. Companies would much rather be able to trust that a CS degree carries with it a level of competence commensurate with what the degree implies. Sadly, they cannot, still, today, they cannot.
And so, the root cause of the issue isn't AI or LLM's, it's universities churning people through programs and granting degrees that often times mean very little other than "spent at least four years pretending to learn something".
If you are thinking that certain CS-degree-granting universities could be classified as scams, you might be right.
And so, anyone with half a braincell, will, today, look at the availability of LLM tools for coding as a way to stop (or reduce) the insanity and be able to get on with business without having to deal with as much of the nonsense.
Nobody here makes a product or offers a service (hardware, software, anything) for the love of the art. We make things to solve problems for people and services. That's why you exists. Not to look after a social contract (as a comment suggested). Sorry, that's nonsense. The company making spark plugs makes spark plugs, they are not on this planet to support some imaginary public good. Solving the problem is how they contribute.
And, in order to solve problems, you need people who are capable of deploying the skills necessary to do so. If universities are graduating people who can barely make a contribution to the mission at hand, companies are going to always look for ways to mitigate that blocking element. Today, LLM's are starting to provide that solution.
So it isn't about greed or some other nonsense idealistic view of the universe. If I can't hire capable people, I will gladly give senior engineers more tools to support the work they have to do.
As is often the case, the solution to so many problems today --including this one-- is found in education. Our universities need to be setup to succeed or fail based on the quality of the education they deliver. This has almost never been the case. Which means you have large scale farming operations granting degrees that can easily be dwarfed by an LLM.
And don't think that this is only a problem a the entry level. I recently worked with a CTO who, to someone with experience, was so utterly unqualified for the job it was just astounding that he had been give the position in the first place. It was clearly a case of him not knowing just how much he didn't know. It didn't take much to make the case for replacing him with a qualified individual or risk damage to the company's products and reputation going forward.
A knowledgeable entry-level professional who also has solid AI-as-a-tool skills is invaluable. Note that first they have to come out of university with real skills. They cannot acquire those after the fact. Not any more.
NOTE: To the inevitable naive socialist/communist-leaning folks in our mix. Love your enthusiasm and innocence, but, no, companies do not exist to make a profit. Try starting one for once in your naive life with that specific mission as your guiding principle and see how far you'll get.
Companies succeed by solving problems for people and other companies. Their clients and customers exchange currency for the value they deliver. The amount they are willing to pay is proportionate to the value of the problem being solved as perceived by the customer --and only the customer.
Company management has to charge more than the mere raw cost of the product or service for a massive range of reasons that I cannot possibly list here. A simple case might be having to spend millions of dollars and devote years (=cost) to creating such solutions. And, responsible companies, will charge enough to be able to support ongoing work, R&D, operations, etc. and have enough funds on hand to survive the inevitable market downturns. Without this, they would have to let half the employees go every M.N years just because of natural business cycles.
So, yeah, before you go off talking about businesses like you've never started or ran a non-trivial anything (believe me, it is blatantly obvious when reading your comments), you might want to make an attempt to understand that your stupid Marxists professors or sources had absolutely no clue, were talking out of their asses, never started or ran a business, and everything they pounded into your brains fails the most basic tests with objective, on-the-ground, skin-in-the-game reality.