AI is very useful, but so far it doesn't write the kind of code I can trust. So I use it, but I carefully review everything it does.
I 100% agree with this in an individual-person sense, but in a humanity sense someone does understand Linux very deeply and is very intentional about how they change it, which to me is how I gain trust in it.
Does trust change when the entire SDLC is AI?
There are systems still in use from the 1960s (maybe before) - the original authors are at least retired and likely dead. I question how well the replacements understand all that. Sure they have had to dig in and understand some parts, but what about the parts that just keep working and don't need new features?
Perfect summary.
It's like we invented a world where you can finally, _finally_ speedrun an enormous legacy codebase and all patted ourselves on the back like that was a good thing.
If we could conjure pickaxes and electric power plants in a single day, that would be speedrunning.
He told it to calm down and just use PHP; it gave him 100 lines with no dependencies that worked the first time.
The Pieter Levels stack :)
Of course, this is ideal for a solo entrepreneur. If you are employed, then you cannot finish it in 100 lines. How will you get paid to maintain it for the next ten years, and hire all your friends to help you?
I think this difference in incentives explains most of what we've been complaining about for the last twenty years.
ASM was machine-specific. C was portable but required expert programmers. C++ was even more user-friendly, but still very hard for normal people. Today, most anyone can write a program in Go.
The more we abstract, the less knowledge/expertise is needed. So yes, programs are being built by people who don't really understand what they are doing. That is intended.
Very few understand deeply what's happening within the computer between the CPU and the bridges and the rest.
The FDIV bug in 1994 took us all by surprise because we assumed a bug couldn't exist in hardware: it either works or it doesn't.
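The classic reproduction of that bug is Thomas Nicely's single division, which on a flawed Pentium reportedly came back wrong from the fourth decimal digit onward. A quick sketch of the check (the specific erroneous values are as widely reported, not something I can re-verify on original hardware):

```python
# The division that exposed the Pentium FDIV bug (Nicely, 1994).
# A correct FPU returns ~1.333820449; the flawed Pentium reportedly
# returned ~1.333739068.
numerator = 4195835.0
denominator = 3145727.0

result = numerator / denominator
print(f"{result:.9f}")  # on correct hardware: 1.333820449

# The widely quoted self-test: this should be (nearly) zero,
# but on a flawed Pentium it reportedly came out around 256.
residual = numerator - (numerator / denominator) * denominator
print(residual)
```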
When I'm using Firebase or AWS, I don't know the underlying system; I don't know why some resources can be created with an underscore while others can't start with a number.
Yet it works.
We're working in layers, and usually we only touch the last one. Yes, understanding the others is great for debugging.
I'm even wondering whether we need tests when they are written by the same LLM that wrote the code.
Of course we do. Otherwise we start trying to water crops with Brawndo.
> Very few understand deeply what's happening within the computer between the CPU and the bridges and the rest
But it's very, very important that those people deeply understand it. We cannot replace their actual knowledge with LLM approximations of it.
Not saying that's how it should be, that's just the world I predict in the not too distant future.
I love thinking, but most people I know seem to experience it as a form of physical pain.
By managing the risk of failure and technical debt with a financial instrument, I have a lot more freedom to move fast, break things, and scale aggressively.
But if you just buy deep out-of-the-money put options every year, say at a 50% price drop, in a 10b5-1 structure, it's OK. You will get sued a bit more than usual.
It's not much different than buying life insurance, except that if your company crashes like monday.com, some schmuck who sold you the puts will have to pay you a tonne of cash. Then you can do a dilutive rights issue, or just use the cash to buy a boat in Miami and start something else.
Most of you won't be able to answer that. And you already know it.
That's the conversation this industry needs to have. Not tomorrow. Now.
I hope you don't take this the wrong way and do continue writing. I enjoyed this piece, just wanted to give some constructive feedback.

There are reports of industries trying to use these tools to generate as much as possible, sure. There are also people generating bad art and unpleasant prose, and using LLMs to generate nonsense they don't pay attention to.
I don’t see why that implies that you or I lost interest in tinkering with toys we build. If I want to spend four weeks understanding OAuth a little better by implementing a client, I still can and I still do.
Automating our builds absolutely didn’t create a cathedral of complexity while nobody noticed. It did mean I can open a Free Software project, read the build file, and understand how to build the thing. That’s the opposite of generating complexity.
I worry about our future generations as much as the next person, but this low effort pabulum doesn’t represent the thoughtful industry and hobby that I love.
For most businesses, software is just a means to an end; they don't really care how high-quality and thoughtful the systems they use are (e.g. look at any piece of "enterprise" software).
What LLMs have done is made it much, much easier for orgs to launch new features and services, both internally and externally, without necessarily understanding the complexity.
For me, that's what this post tapped into. Many orgs already have more complexity than they can reasonably handle. Massively accelerating development is not going to make that problem better :)
Don't get me wrong. I think the future is very bright for software. I have friends who are scientists and biomedical professionals, and I am excited to see what they are able to do with the powers of software when they don't need to care about syntax and can just lead with their intentions.
The rage-bait part is mainly my frustrations manifesting. As an SRE, my annoyance comes out a little when it comes to how fast developers are shipping versus how fast our rails can keep things maintained.
But I don't see it. Where is this glut of software?
Before AI assistance this simply wasn't in my possibility space. Not because I couldn't think through the problems, but because the gap between "I know what I want to build" and "I can actually build it" required years of skill acquisition I didn't have while raising my son and being a good husband (after some rough years, ay).
The glut isn't visible yet because most of us aren't shipping to HN. We're shipping to our tiny audiences, our friends and family, our niche communities. The software exists... it's just not venture-scale, so nobody's writing about it.
I suppose if one is simultaneously ignorant when it comes to software and an expert at agentic workflows, then yeah, sure, maybe. At the cost of how many tokens, though? But logically it seems that the former would preclude the latter.
Also, the "get it done in a weekend" seems to be a gross exaggeration.
Over the last couple of months I've seen a load of new "product launches" in my niche, but when you look at them they're largely vibe-coded and don't show deep understanding or sustainability, so it's pretty likely you'll never see them become successful businesses.
Looking at some of the related places like /r/sideproject/ there are a lot of releases, and I'd be willing to suggest that most of them are using LLMs.
There seems to be a lot of hype, and has been for years, but I’m not seeing it materialize as actual economic output. Surely by now there should be lots of businesses springing up to capture all of this value created by vibecoded software.
- CLI voice changer with cloned Overwatch voices on ElevenLabs.
- Brother P-Touch label maker using HTML/CSS. Their app is absolutely atrocious.
- Converted a FileMaker CRM into a Next.js/Supabase app.
- Dozens of drag-n-drop or 1-click/CLI tools. Think flattening a folder or a zip file.
- Dozens of Chrome Extensions and TamperMonkey user scripts. Think blocking ads with very targeted XPath.
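The "flatten a folder" item above really is a screenful of code. A minimal sketch in Python (the function name and the `__`-prefix collision-handling scheme are my own assumptions, not the commenter's actual tool):

```python
import shutil
from pathlib import Path

def flatten(src: Path, dest: Path) -> None:
    """Copy every file under src into dest, prefixing each name with its
    relative sub-path so files from different folders cannot collide."""
    dest.mkdir(parents=True, exist_ok=True)
    for path in src.rglob("*"):
        if path.is_file():
            # a/b/c.txt  ->  a__b__c.txt
            flat_name = "__".join(path.relative_to(src).parts)
            shutil.copy2(path, dest / flat_name)

# Example (hypothetical paths):
# flatten(Path("~/Downloads/deep_tree").expanduser(), Path("/tmp/flat"))
```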
But when I think about sharing them, it feels like: what's the point, since anyone can make them themselves?
No one understanding what's going on inside complex systems, in financially constrained environments, built and maintained by average engineers at best, is the norm, and it's what keeps the world running.
None of that is a symptom of AI. The only change AI brings is that even first person developers don't know anymore what the fuck they just deployed.
I wanted to touch on this point but then this post started getting WAY too long.
>the only change AI brings is that even first person developers don't know anymore what the fuck they just deployed.
This, to me, is going to make your first point 100x worse in every damn way.