Fast forward to now. I'm a Buffett nerd — big believer in compound interest as a mental model for life. I run compound interest calculations constantly. Not because I need to, but because watching numbers grow over 30-40 years keeps me patient when markets get wild. It's basically meditation for long-term investors.
The problem? Every compound interest calculator online is terrible. Ugly interfaces, ads covering half the screen, can't customize compounding frequency properly, no year-by-year breakdowns. I've tried so many. They all suck.
When vibe coding started blowing up, something clicked. Maybe I could actually build the calculators I wanted? I don't have to be a "real developer" anymore — I just need to describe what I want clearly.
So I tried it.
Two weeks and ~$100 (Opus 4.5 thinking model) in API costs later: I somehow have 60+ calculators. Started with compound interest, naturally. Then thought "well, while I'm here..." and added mortgage, loan amortization, savings goals, retirement projections. Then it spiraled — BMI calculator, timezone converter, regex tester. Oops.
The AI (I'm using Claude via Windsurf) handled the grunt work beautifully. I'd describe exactly what I wanted — "compound interest calculator with monthly/quarterly/yearly options, year-by-year breakdown table, recurring contribution support" — and it delivered. With validation, nice components, even tests.
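Just for flavor, the core of that first calculator is simple enough to sketch. Here's roughly the shape of the math in TypeScript (a minimal sketch with made-up names, assuming contributions land at the end of each compounding period; not the site's actual code):

    // Minimal sketch of the kind of math such a calculator runs.
    // Assumes one contribution per compounding period; names are illustrative.
    type Frequency = "monthly" | "quarterly" | "yearly";

    interface YearRow {
      year: number;
      contributed: number;     // total paid in so far
      interestEarned: number;  // interest earned during this year
      endBalance: number;      // balance at the end of this year
    }

    function compoundSchedule(
      principal: number,
      annualRate: number,        // e.g. 0.07 for 7%
      years: number,
      frequency: Frequency,
      contributionPerPeriod = 0, // added at the end of each compounding period
    ): YearRow[] {
      const periodsPerYear = { monthly: 12, quarterly: 4, yearly: 1 }[frequency];
      const periodRate = annualRate / periodsPerYear;

      const rows: YearRow[] = [];
      let balance = principal;
      let contributed = principal;

      for (let year = 1; year <= years; year++) {
        let interestEarned = 0;
        for (let p = 0; p < periodsPerYear; p++) {
          const interest = balance * periodRate;
          interestEarned += interest;
          balance += interest + contributionPerPeriod;
          contributed += contributionPerPeriod;
        }
        rows.push({ year, contributed, interestEarned, endBalance: balance });
      }
      return rows;
    }

    // Example: $10,000 at 7% for 30 years, compounded monthly, $200/month added.
    console.table(compoundSchedule(10_000, 0.07, 30, "monthly", 200));

Getting that year-by-year table rendered with proper validation and nice inputs is the part the AI did for me.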
What I realized: my years away from coding weren't wasted. I still understood architecture, I still knew what good UX looked like, I still had domain expertise (financial math). I just couldn't type it all out efficiently. AI filled that gap perfectly.
Vibe coding didn't make me a 10x engineer. But it gave me permission to build again. Ideas I've had for years suddenly feel achievable. That's honestly the bigger win for me.
Stack: Next.js, React, TailwindCSS, shadcn/ui, four languages (EN/DE/FR/JA). The AI picked most of this when I said "modern and clean."
Site's live at https://calquio.com. The compound interest calculator is still my favorite page — finally exactly what I wanted.
Curious if others have similar stories. Anyone else come back to building after stepping away?
Since when do "average" people have time to set up a CI pipeline, agents, MCPs, and all the rest needed to get vibe coded apps to work, and when did that become the "simple" way for non-programmers to use computers to mush some data together for their small businesses and neighbors and stuff?
Did spreadsheets, embedded databases, and visual form builders stop working, or are they lacking in some way?
Or are posts like this astroturfed LLM posts from companies selling rented app-building tools to non-tech folks?
Again, apologies for sounding cynical, but it's so hard to tell what is genuine these days, and I'm genuinely curious how farmers found the time to set up this stuff instead of just using a spreadsheet and a few macros.
If LLMs are covering a gap here maybe there's an opportunity for better, local, lower-tech tooling that doesn't require such a huge tech stack (and subscriptions/rent) to solve simple, tractable problems?
I was a software engineer up till about 8 years ago. I still dabbled in scripts here and there for things I needed since then. LLMs have proved hugely useful for me to do a wide variety of things that wouldn't have been worth bothering with before. The biggest barrier that LLMs overcome for me is being able to quickly find and adapt to different tools, libraries, languages, etc. But it does help immensely to understand how software works to some degree for being able to approach the problem in the first place. I think the two factors multiply together.
I imagine if I want to I could get back into real software engineering much easier and faster than I could have a few years ago, because I still understand how things work fundamentally, I'm just out of date on what's changed in libraries and systems and languages in the last 8 years.
It's also useful for working with spreadsheets and databases.
Anyway I don't mean to shill for LLMs, I hate where this all is taking civilization in general but I'll still use it where it helps me accomplish things I do value.
For example, I use LLMs for one specific thing, making plugins for an app I use (which need to be written in javascript/typescript). No code tools wouldn't be of any use to me here.
No code tools put you in a box that limits what you can create, whereas LLMs allow you to code pretty much anything (though of course how far you can get does depend on having at least some technical ability/knowledge).
I see this with every new technology stack. Way back, we had folks putting out browser "applets" to do the same things that could be done in Excel. And then we had these apps built in the cloud, on mobile, on iOS/Android, in React, on Raspberry Pi, on a GPU, etc., i.e., simple apps reinvented with some new tooling. It is almost the equivalent of 'printf("hello world")' when you are learning a new language. This is not to undermine the OP's efforts, but I see it in the spirit of "learning" rather than that of solving a hard problem.
Since OP is a "Buffett nerd": Michael Burry recently equated AI to "escalators" at department stores. Escalators required capital investment but did not provide any competitive advantage to the one deploying them, because everyone was deploying them. For many businesses, AI is a cost but not a competitive advantage.
It's not that programmers should've made tools with training wheels, but that the regular programmer tools exploded in complexity. Microservices, Kubernetes, etc. Not saying those don't have their places, but they've made programming less approachable.
Not really? To someone who doesn't care about software, software is a means to an end of actually doing something, and everything between idea <> execution of value is basically overhead. This has always been true and the overhead is getting carved further and further down over time.
> Since when do "average" people have time to set up a CI pipeline, agents, MCPs, and all the rest needed to get vibe coded apps to work, and when did that become the "simple" way for non-programmers to use computers to mush some data together for their small businesses and neighbors and stuff?
You don't need all of this. You can basically just download Cursor, the Claude app, Claude code, opencode, whatever today and run something locally. I do think "deployment and productionization" is a bit of a gap but stuff like Replit or even Vercel + Supabase is pretty far along towards agents just being able to do most of infra for you for anything small scale, or at least tell you the buttons to press to hook things up.
> Did spreadsheets, embedded databases, and visual form builders stop working, or are they lacking in some way?
Pretty much all the LLM/agent products are obviously way ahead of form builders at this point. Take Retool for example, you could spend minutes to hours plugging together "programming-lite" concepts. A single prompt and a few minutes, and maybe 1-2 back and forths can basically get you to the same place with probably less overall jank in a lot of situations. Form-builder stuff is totally dead outside of maybe being an escape-hatch for some LLM situations, or letting users do higher-level scaffolding, but even then I think stuff like Cursor's "select the part of the app you want to change and prompt" is going to be a better UX.
> maybe there's an opportunity for better, local, lower-tech tooling that doesn't require such a huge tech stack
I think you are viewing this from the "tech" angle rather than the deliver value to the end user angle. The tech stack can be arbitrarily complex as long as it works to reduce end user friction and provide value with as much ease as possible. This might as well be the core idea of all consumer tech.
I think your core theses are basically "people care about the underlying tech" and "people want to learn programming or programming-adjacent" and those are both wrong for the vast vast majority of people.
There actually still isn't any money in LLMs, but we're in the "cheap Ubers" era where everything is subsidized by capital that has congealed thanks to economic deregulation in the 80s. Yay, capitalism.
Now I build all sorts of apps for my farm and organizations I volunteer for. I can pound out an app for tracking sample locations for our forage association's soil sample truck, another for moisture monitoring, or a fleet task/calendar/maintenance app in hours, and iterate on them when I think of features.
And git was brand new when I left the industry, so I only started using it recently to any extent, and holy hell, is it ever awesome!
I'm finally able to build all the ideas I come up with when I'm sitting in a tractor and the GPS is steering.
Seriously exciting. I have a hard time getting enough sleep because I hammer away on new ideas I can't tear myself away from.
I'm teaching my kid what I consider the AI dev stack: AI IDE (Antigravity for us), database (Supabase for us with a nice MCP server), and deployment (Github and Vercel for us). You can make wonderful little integrated apps with this in hours.
Honestly, I have so many features in it now it's hard to describe it, shared work calendar, parts shopping list, recurring maintenance, blah blah blah. It's very bespoke and I doubt anyone else would want to use it the way we do.
Did you take over a farm?
The only thing Deere "locks down" is that some of the parts have a CANbus address that you need to get a tech over to program the controller to recognize the part, and do the same if you replace a controller.
It's not some nefarious anti-farmer thing, it's because of the way the controller network works. In fact, I've used a CANbus sniffer on the bus and everything on there is in the clear, they don't even encrypt the messages.
The only things I've sent to town to get fixed was because I didn't have time to diagnose it, or it was an insurance claim and I wanted warranty. Blowing $80,000 worth of innards out the back of a combine wasn't a job I wanted to tackle right then (but I probably should have, I wasn't happy with the attention to detail in the repair).
So the upshot is, don't believe every terrible story about Deere you hear. Just the one where they charge too goddamn much for parts.
There is an incredible irony in your typing that out on a device so advanced that it was beyond science fiction when I was growing up 40 years ago.
If you look at Soviet or Chinese Communism, they also stifled innovation, and they also destroyed entire ecosystems. They also had extreme concentrations of power, which allowed psychopathic leaders to commit atrocities.
If we want to come up with real long-term solutions, maybe we need to be honest about underlying human traits, and address those via systematic controls. Otherwise, it feels like we are going to keep bouncing from extreme to extreme. That tendency towards extremes seems like another easily exploited human trait that needs to be identified and addressed.
I guess my point here is that maybe it's not entirely specific systems at fault here, as much as it is universal human traits and group dynamics.
Disclaimer: I thought we had already found the beginnings of an answer, and it was Social Democracy with a regulated market economy. However, this system appears not to be extreme enough for many people to get excited about it.
I'm not sure how you can claim this on the footer of every page when you're vibe coding these calculators.
How confident is the OP that every single one of these 60 calculators work all the time, with all edge cases? Because if someone is on your website using your calculator, they are putting trust in you. If it's wrong, it could have downstream impacts on them. I hope every single one has a comprehensive set of tests with good edge cases. But realistically will they?
I'm actually pretty pro-AI development. But if you're going to use AI to help develop a website, at least focus on quality rather than quantity. AI makes quantity easy, but quality is still hard.
As an aside, the website doesn't even work for me. My clicks don't do anything.
The value isn’t the interface, it’s the trust that its calculations are accurate. I can’t tell you how many meetings I had with accountants and finance people to validate all the calculations.
The compound interest calculator, which is their 'favorite page', already shows an incorrect value in the graph. So my faith in the other calculators isn't great. I also kinda doubt OP's story of them using that page all the time, since it took me about 20 seconds to find this issue.
I have seen that before, so I am not going to touch that.
do you understand how bad it is when you search for software and you cannot trust it to do what you ask of it? it's bad!
The key shift for me: I used to spend hours stuck on syntax, fighting with build systems, or searching StackOverflow for obscure errors. Now that friction is mostly gone. The actual thinking - what to build, how the pieces fit together, what edge cases matter - is still entirely human. But the translation from "I know what I want this to do" to "working code" is dramatically faster.
The compound interest calculator is a good example of something that would've felt like a weekend project a few years ago but probably took you a couple of hours. That's the unlock - not that AI writes code for you, but that the tedious parts stop blocking the interesting parts.
What surprised me most was how much architectural intuition I'd retained even after years away. The fundamentals don't decay as fast as the syntax knowledge.
I’m not an AI hater but I do see this as evidence of LLMs being susceptible to chasing trends as much as people.
Next.js with server rendered React is not a stack that an experienced web developer would have recommended for a “clean” solution to a collection of financial calculators. It’s the answer you’d get if you asked for the stack that’s trending the most lately.
These are all quite reliable well-understood components, and far from "chasing trends" IMO.
While you can't do anything about (other people's) interfaces, you can absolutely do something about ads. You can install an ad-blocker on your browser. This is not just for you, OP, it's for everyone: get an ad blocker. Your experience of the internet will be radically changed.
I am reminded of this anytime I sit at someone else's computer who doesn't have an ad blocker, or whenever I see internet conversations complaining about ads; I wonder "what ads"? Then I remember: the ads I'm blocking.
So do yourself a big, warm, fuzzy favour and make the internet better for you. Block ads today.
Choose your own ad blocker, obviously.
What, you thought this was an ad for a specific ad blocker, didn't you? Nah, any one will do. Just block bloody ads.
The enshittification of the internet is largely driven by people ad blocking, as it incentivizes more clickbait, more ads, and sloppier cheap content.
For engineering/software related content, the impact is immense since the audience is largely people ad blocking. I won't name names, because they fear backlash from their "ad block is awesome" audience, but some well known youtubers in the hard nerdy tech space report 40-50% of views they receive no compensation for.
So you can evangelize how great it is to not have to compensate for content, but don't think it's some kind of everyone wins victory. It's just a cost shift onto someone else, which largely manifests as bad content being needed to cover costs.
The correct approach is paying for what you use, and avoiding ad-supported content to send the message that you want a paid option.
This is unfairly putting the blame on only one rational actor in a prisoner's dilemma.
Content providers are free to put their content behind a paywall with no ads, but they choose not to.
They choose not to because people don't pay for content when they can get it from other providers who don't use a paywall.
Consumers then are left without the option to pay for an ad-free experience.
But ads are run on hardware the consumer owns, consuming their resources and harvesting personal information on the consumer, which is a security concern.
So even if they want to support content creators by viewing the ads they run, they need to also accept the security trade-off, which many reasonably do not.
Ok, so maybe they are subscribing to Patreon? Maybe Nebula?
Well those two have conversion rates around (on a good day) 1%.
You can swim in the waters of cognitive dissonance because ads really do suck and ad block is a great way to stop the pain while still getting what you want.
Understand though, the statistics are so damning against the ad-block crowd, that you come off like the people screeching about human generated CO2 being totally fine for the environment (It helps plants grow!) because they cannot imagine having to give up commuting in their diesel monster pick-up truck everyday. (Ad block does no damage because I cannot imagine having to see ads...)
As an aside, ironically, security nightmare ads are really only served to people with tracking blockers, because those people are the lowest value visitors and only scammers/bottom feeders really bid on their views. Regular tech illiterate people get ads for Tide and Toyota. The more you know.
The internet is shitty in many ways and ads are one reason. You can pay for ad-free streaming but still get low bitrate although you paid enough to cover traffic costs for higher bitrate. You can pay to have ad-free instagram but still see all this shitty AI-generated crap and bot posts. You can pay for Youtube Premium but Google will still massively invade your privacy.
Do you really think that if everybody turned off their ad blockers and paid for premium services, the internet would become better? The way I see it, corporate greed would milk consumers even more.
Instead of surrendering to ads, we should promote directly donating to (or supporting) YouTubers or websites that provide value to us.
I have a similar experience, but it's more that AI lets me build the side projects I only have time to research, not the time or energy to actually code. I get to review the code and have Claude inspect it (most people, I feel, don't have Claude do code audits) and tell me where there are bugs, security issues, etc. I do this routinely enough.
The years "away" gave me an unusually clear picture of what problems actually need solving vs what's technically interesting to build. Most devs early in their careers build solutions looking for problems. Coming back after working in a specific domain, I had the opposite - years of watching people struggle with the same friction points, knowing exactly what the output needed to look like.
What I'd add to the "two camps" discussion below: I think there's a third camp that's been locked out until now. People who understand problems deeply but couldn't justify the time investment to become fluent enough to ship. Domain experts who'd be great product people if they could prototype. AI tools lower the floor enough that this group can participate again.
The $100 spent on Opus to build 60 calculators is genuinely good ROI compared to what that would have cost in dev hours, even for someone proficient. That's not about AI replacing developers - it's about unlocking latent capability in people who already understand the problem space.
Feel like forums have turned into a grand Turing Test.
> Domain experts who'd be great product people if they could prototype. AI tools lower the floor enough that this group can participate again.
True, and it's a threat to PMs. Product management can't vibe its way out of a lack of domain expertise. The software paradigm is changing.
People don't need a calculator website anymore. They can just prompt their own AI account to generate whatever calculator they need in the moment. I already have a few pinned in my favorites that I use often.
That is the real promise of AI driven software. Bespoke tiny apps available to anyone whenever they simply just ask for it.
For the foreseeable future, until maybe we have systems that can predict what someone will need/want from an app at any given time (a prospect as horrifying as it is awesome imo), there'll be plenty of people, maybe even a majority, that don't know what they want or need until it's shown to them.
There will be many more niche applications vibe-coded by people with lots of knowledge and no coding experience/desire that people will use rather than thinking of an app themselves to create.
Then there will be people like you, me, OP and 99% of the other HN community that have a million ideas they want to create, use, and sometimes share.
There are a lot of things I don’t know about and even more I don’t know I don’t know about and in those cases, there’s still a wide open door for people to create applications and experiences that share their knowledge/vision.
I could ask Claude Code or some other future platform to build me a financial calculator every time I need it, but why would I do that when someone with the benefit of prior knowledge and experience has already done that for me?
They probably included calculators I didn’t even know I needed.
For myself, I've always enjoyed "getting my hands dirty" with code, and the advent of LLMs has been a boon. I'm retired from 34 years of coding (and managing), and never skipped a beat. I've released a few apps since retiring. I'm currently working on the first app that incorporates a significant amount of LLM assistance. It's a backend admin tool, but I'll probably consider using the same methodology for more public-facing stuff in the future.
I am not one to just let an LLM write a whole app or server, unsupervised (I have control issues), but have allowed them to write whole functions, and help me to find the causes of bugs.
What LLMs have given me, is a decreased hesitance to trying new things. I’ve been learning new stuff at a furious rate. My experience makes learning very fast. Having a place to ask questions, and get [mostly] good answers (experience helps me to evaluate the answers), is a game-changer.
> “A ship in harbor is safe, but that is not what ships are built for.” –John A. Shedd
[0] https://littlegreenviper.com/miscellany/thats-not-what-ships...
The author even insists that AI was used because of their poor English, which is the standard excuse on Reddit as well. But clearly, this is not a translation:
> Curious if others have similar stories. Anyone else come back to building after stepping away?
This is bog-standard AI slop to increase engagement.
Look at the blog on their linked site as well. AI-generated posts.
This has been posted here for SEO. This is a business venture.
It's times like this when I think HN needs a post downvote button. Flagging might not be quite appropriate here, but I hate to see this content cluttering up the front page.
Thankfully LLMs are still very stupid. Especially when it comes to security engineering, my specialty, so looks like I have a while yet.
It’s like using a poorly made steel adjustable wrench. People who know how to use it will low-key hate it (but still maybe find it better than no tool at all), whereas people just using it to smash things because heavy will think it’s pretty much the perfect tool.
I don't think it will be; a vibe coder using Gas Town will easily spit out 300k LoC for an MVP TODO application. Can you imagine what it will spit out for anything non-trivial?
How do you even begin to approach remedying that? The only recourse for humans is to offer to rebuild it all using the existing features as a functional spec.
There are cases where that will be the appropriate decision. That may not be every case, but it'll be enough cases that there's money to be made.
There will be other cases where just untangling the clusterfuck and coming up with any sense of direction at all, to be implemented however, will be the key deliverable.
I have had several projects that look like this already in the VoIP world, and it's been very gainful. However, my industry probably does not compare fairly to the common denominator of CRUD apps in common tech stacks; some of it is specialised enough that the LLMs drop to GPT-2 type levels of utility (and hallucination! -- that's been particularly lucrative).
Anyway, the problem to be solved in vibe coding remediation often has little to do with the code itself, which we can all agree can be generated in essentially infinite amounts at a pace that is, for all intents and purposes, almost instantaneous. If you are in need of vibe coding disaster remediation consulting, it's not because you need to refactor 300,000 lines of slop real quick. That's not going to happen.
The general business problem to be solved is how to make this consumable to the business as a whole, which still moves at the speed of human. I am fond of a metaphor I heard somewhere: you can't just plug a firehose into your house's plumbing and expect a fire hydrant's worth of water pressure out of your kitchen faucet.
In the same way, removing the barriers to writing 300,000 lines isn't the same as removing the barriers to operationalising, adopting and owning 300,000 lines in a way that can be a realistic input into a real-world product or service. I'm not talking about the really airy-fairy appeals to maintainability or reliability one sometimes hears (although, those are very real concerns), but rather, how to get one's arms around the 300,000 lines from a product direction perspective, except by prompting one's way into even more slop.
I think that's where the challenges will be, and if you understand that challenge, especially in industry- and domain-specific ways (always critical for moats), I think there's a brisk livelihood to be made here in the foreseeable future. I make a living from adding deep specialist knowledge to projects executed by people who have no idea what they're doing, and LLMs haven't materially altered that reality in any way. Giving people who have no idea what they're doing a way to express that cluelessness in tremendous amounts of code, quickly, doesn't really solve the problem, although it certainly alters the texture of the problem.
Lastly, it's probably not a great time to be a very middling pure CRUD web app developer. However, has it ever been, outside of SV and certain very select, fortunate corners of the economy? The lack of moat around it was a problem long before LLMs. I, for example, can't imagine making a comfortable living in it outside of SV engineer inflation; it just doesn't pay remotely enough in most other places. Like everything else worth doing, deep specialisation is valuable and, to some extent, insulating. Underappreciated specialist personalities will certainly see a return in a flight-to-quality environment.
Businesses don't pay for CRUD apps; businesses pay for apps that solve problems, which often involves CRUD to persist their valuable data. This is often wrapped in the sometimes very strange and difficult-to-understand business logic that varies greatly from one business to another. That is what "CRUD app developers" actually do, so dismissing them as though there is zero business logic and only CRUD does them, and us, a disservice.
Why, I do plenty of what you describe myself...
Like 80% of jobs outside the USA are either local or outsourced CRUD web applications. Many people live quite well thanks to exchange rates. I wonder what's gonna happen if/when those jobs disappear.
> If you are in need vibe coding disaster remediation consulting, it's not because you need to refactor 300,000 lines of slop real quick. That's not going to happen.
My experience as a consultant to business is that they only ever bring in consultants when they need a fix and are in a hurry. No client of mine ever phoned me up to say "Hey, there, have you any timeslots next week to advise on the best way to do $FOO?", it's always "Hey there, we need to get out an urgent fix to this crashing/broken system/process - can we chat during your next free slot?".
> Like everything else worth doing, deep specialisation is valuable and, to some extent, insulating.
I dunno about this - depends on the specialisation.
They want a deep specialist in K8? Sure, they'll hire a consultant. Someone very specialist in React? They'll hire a consultant. C++ experts? Consultants again.
Someone with deep knowledge of the insurance industry? Nope - they'll look for a f/timer. Someone with deep knowledge of payment processing? No consultant, they'll get a f/timer.
No, that's fair, and I think you're right about that. But refactoring 300,000 lines 'real quick' isn't going to happen, regardless of that. :)
> They want a deep specialist in K8? Sure, they'll hire a consultant. Someone very specialist in React? They'll hire a consultant. C++ experts? Consultants again.
I implicitly had narrow technical specialisations in mind, albeit including ones that intersect with things like "insurance industry workflows".
That's my worry. Might be put off a few years, but still...
For what I am vibing my normal work process is: build a feature until it works, have decent test coverage, then ask Claude to offer a code critique and propose refactoring ideas. I'd review them and decide which to implement. It is token-heavy but produces good, elegant codebases at scales I am working on for my side projects. I do this for every feature that is completed, and have it maintain design docs that document the software architecture choices made so far. It largely ignores them when vibing very interactively on a new feature, but it does help with the regular refactoring.
In my experience, it doubles the token costs per feature but otherwise it works fine.
I have been programming since I was 7 - 40 years ago. Across all tech stacks, from barebones assembly through enterprise architecture for a large enterprise. I thought I was a decently good coder, programmer, and architect. Now I find the code Claude/Opus 4.5 generates for me to be, in general, of higher quality than anything I ever made myself.
Mainly because it does things I'd be too tired to do, or never bother with, because why expend energy on refactoring something that is perfectly working and not to be further developed.
Btw, it's a good teaching tool. Load a codebase or build one, and then have it describe the current software architecture, propose changes, explain their impact, and so on.
I have about the same experience as you do, and the same experience using Opus 4.5.
If this is true, you weren’t a very good programmer. There’s much more to code quality than refactoring working code.
Yup, my conclusion exactly.
With that said, most code I have seen in the private sector is almost objectively horrible (and certainly subjectively). Code manufactured with the current best tools such as Claude compares favourably. Companies rarely have the patience to pay for well-manicured, elegant code. If it sort of works, it ships.
A good engineer will tell you how to spend 25% of effort to get to 90% of the result you want. With maintainable code, and importantly with less code that touches fewer systems.
A bad engineer will deliver exactly what product asked for without asking questions, generate 4x the code, and touch every piece of the system.
Companies are just setup in a way that incentivizes building organizations that create bad code. Most places would rather hire 100 bad engineers who can be easily replaced than 5 good engineers.
This is quite true, and it is this -- really, a special case of "the market can remain irrational longer than you can stay solvent" -- that has me worried about the implications for the labour economy more than anything else.
This is a possibility in very well-trodden areas of tech, where the stack and the application are both banal to the point of being infinitely well-represented in the training.
As far as anything with any kind of moat whatsoever? Here, I'm not too concerned.
The way they do, which is? I've skimmed the comments and a lot of them are hate and hostility towards OP's project and towards coders "without skill" in general, plus denial, because there's no way anything vibe-coded could have worked. At best, there is strong tribalism on both ends.
You might see more opposing views in this thread, but if you browse this site often you'll see both sides.
Those embracing it heavily do not see the nuances of carefully creating maintainable solutions, planning for and recognizing tech debt, and knowing where it's acceptable short term. They are also missing the theory-building behind what is being created. Sure, AI models might get even better and could solve everything. But I think it's naive to think that will be generally good for 90% of the population, including people not in tech.
Using these models (text or image) devalues the work of everyone in more than one way. It is harmful for creative work and human expression.
This tech, and a lot of tech, especially tech built by large corporations for profit extraction and human exploitation, is very unlikely to improve lives at a population level long term. The same can be said for a lot of tech (e.g. social media = powerful propaganda). The goal of the people creating these models is to not need humans for their work. At which point I don't know what would happen; kill the peasants?
I agree what we do requires a lot of creative thinking. When AI supporters attempt to use an argument comparing to factory workers being freed from dull laborious work by robots, the analogy falls flat on two fronts. First, there's nothing creative about that sort of work and second, because robots are highly accurate; while AI can often be just high.
Also, we are still designing systems and have to be able to define the problem properly. At least in my company, when we look at the velocity of delivering projects, it is barely up since AI, because the bottlenecks are elsewhere.
Do you truly believe it won't get better, maybe even better at whole system design and implementation than people?
What are you calling "growth"? Adoption, or LLM progress? LLM progress has objectively slowed down, and for rather obvious reasons. The leaps from GPT-2 to GPT-4 can't be reprised forever.
Literally yesterday I remarked to my tech friends how fun coding with CoPilot is. I actually make forward progress now, and I understand all that the agent is doing.
For me, coding is an enjoyable means to an end. I do enjoy the process, but I enjoy the results more.
You're right though about it not choosing some different path, I might or I might not know that.
They've even got their own slogan: "you're probably just not prompting it properly"
That's the same energy as telling other professions to "just learn to code, bro" once they are displaced by AI.
But I guess it doesn't feel nice once the shoe is on the other foot, though. If nobody values the quality of human art, why should anybody value the quality of human code?
It's the exact same neoliberal elites who told everyone to code one year and told them they'd all be automated of a job the next year.
I dunno who exactly you think you're being condescending towards.
Just like SEO experts, marketing experts, trade bots, and crypto experts, the vibe coders will be weeded out.
It's a miracle. Simply wouldn't have been done before. I think we'll see an explosion of software in small and midsize companies.
I admit it may be crappy software, but as long as the scope is small - who cares? It certainly is better than the janky manual paper processes, excel sheets, or just stuff in someone's head!
Most developers are too full of themselves, in fact, most of us are a bunch of pretentious pricks. It is no wonder people are happy to be able to get what they want without our smugness and pretentiousness. Too bad some us are not like that and will end up getting unemployed anyway in the next few years.
Funnily enough, Excel is the quintessential example of a fourth generation language, IDE, and database and it's the only one aside from SQL which actually succeeded from its time period. It's software, just like what you're building now, and just like what you're building now there are good points and bad points about it. The tradeoffs are different between the JS / Python code you're likely spinning up now vs. the Excel code that was being spun up before, but they rhyme.
And to be honest, even the tiny apps I'm doing I wouldn't have been able to do without some background in how frontend / backend should work, what a relational database is, etc. (I was an unskilled technical PM in the dotcom boom in the 2000s so at least know my way around a database a little. I know what these parts of tech CAN do, but I didn't have the skills to make them do it myself.)
For me, that is nightmare fuel. We already have too much software! And it's all one framework or host app version update away from failure.
1. Invoice billing review. Automated 80% of what was a manual process by providing AI suggestions automatically. Saved 3 hours per day of a manager's time. Increased topline by 10%. Dev time: 1 day.
2. Data dashboards. We use a janky SaaS that does not have APIs. Built a scraper to log in, download the reports daily, parse them, and upload to a database, then built a dashboard on top. It used to take my associate 3 hours per week to do this in a crappy spreadsheet. Now I have it in a proper database, much more frequently. Dev time: 4 hours.
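For the curious, that pipeline is roughly this shape (a minimal TypeScript sketch; the login endpoint, CSV columns, and table name are made up for illustration, and the real thing depends on whatever the SaaS actually exposes):

    // Daily report sync: log in, pull a CSV export, push rows into Postgres.
    // Every URL, field, and table name here is hypothetical.
    import { Client } from "pg";

    async function runDailySync(): Promise<void> {
      // 1. Log in and keep the session cookie (hypothetical endpoint).
      const login = await fetch("https://example-saas.test/login", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          user: process.env.SAAS_USER,
          password: process.env.SAAS_PASSWORD,
        }),
      });
      const cookie = login.headers.get("set-cookie") ?? "";

      // 2. Download yesterday's report as CSV (hypothetical export URL).
      const csv = await (
        await fetch("https://example-saas.test/reports/daily.csv", {
          headers: { cookie },
        })
      ).text();

      // 3. Parse the CSV (naive split; a real version would use a CSV library).
      const rows = csv
        .trim()
        .split("\n")
        .slice(1) // drop the header row
        .map((line) => line.split(","));

      // 4. Upload to Postgres so the dashboard can query it.
      const db = new Client({ connectionString: process.env.DATABASE_URL });
      await db.connect();
      for (const [reportDate, metric, value] of rows) {
        await db.query(
          "INSERT INTO daily_metrics (report_date, metric, value) VALUES ($1, $2, $3)",
          [reportDate, metric, Number(value)],
        );
      }
      await db.end();
    }

    runDailySync().catch((err) => {
      console.error("sync failed:", err);
      process.exit(1);
    });

Wire that to a daily cron job and the spreadsheet step disappears.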
We are attacking little problems all across the business now.
A MIRACLE!!!!
I wouldn't want to hassle customers who have fully paid-up accounts.
I think you also need to compare it to what was already there. No QA on the humans. Done off the side of their desk with no oversight, process, or checking. Huge amounts of manual errors.
The new solution just needs to be better than the old one, it doesn't need to be perfect.
(But I 100% agree that I wouldn't let AI live against customers. It is helping us build automations faster, and doing a "little" thinking on recommendation rules that would be very hard to implement without something highly structured, which would be frankly impossible in our environment.)
No. The bar is "miracle" and can cure cancer etc and can replace all developers etc. The bar is much higher than existing manual processes. It absolutely needs to be perfection to match the lofty claims
I guess vibe coding cleanup firms and offensive security researchers are waiting to find the bugs that will cost firms millions of dollars, or the one that creates a dreadful data breach.
is there a term for that?
AI at our fingertips, accessible and useful, that's just a tool, that's not redefining us as an industry and denying people's jobs – that's an asset. (I used an em dash to prove I am not AI, as apparently double dash is now a sign of AI text!)*
(*) case in point, the situation is _TIRING_.
I'm in this field and my system was heavily built with Claude, though not via vibe coding; more like a junior supporting me. I do not see any person connecting a vibe coded bot to a real account soon, since if it's about real money, people will hesitate. And if you have blown up one account with your vibe coded bot while you are not a professional dev, you will lose interest very quickly. Such systems do not contain "just a few thousand lines of code". Sure, you could speed up development massively and "hit the rock sooner rather than later" when going vibe coded here :-D
"Comment NEAT to receive the link, and don't forget to connect so I can email you" -- this is the most infuriating line ever.
There are people here saying "I can finally get all my ideas done!" Sure, if they are really important enough, I guess. But high technology is much, much less important to me than it is to my employer or probably to others here on HN. I can only be concerned with the paycheck at this point. And at this point, they are happy that I can read documentation, write code, read documentation, write code, and don't care how it gets done. (For what I am working in though, I'd just skip the AI training step.)
With that in mind, I like to use PLs as tools to clarify thinking. There are others that think using PLs and their accompanying tools as friction to their goals, but my friction is understanding the problems I am trying to solve. So, while taking the adventure into automated tooling might be interesting, it doesn't replace the friction (just the feeling I have to read more potential garbage code.)
In comms, they have something like a 1:4 ratio of design to validation engineers. Defence is slightly different, as it depends on the company, but generally the tolerance for bugs is zero. Let's not get started on the HF trading folks and their risk appetite!
There's a lot of room for software engineers. Most FPGAs are SoC devices now, running some form of embedded Linux doing high-level task management and networking. Provided you know enough Verilog to know your way around, you'll be fine. You're also in a space where most engineers I know are preparing to retire in the next 5-10 years, so there will be a panic which will ripple across industries.
FPGA basics: https://nandland.com/fpga-101/
Verilog basics: https://hdlbits.01xz.net/wiki/Main_Page
Projects: https://www.hackster.io/fpga/projects
Marvel Champions in particular is a lot of fun, although may be a bit overwhelming at first if you don't play a lot of board games already.
I also got into Legendary deckbuilding games recently, and those are a bit more approachable, although not all of them play solo unless you manage two hands of cards (which isn't a big deal for me, but I've played hundreds of different board games).
They have those based on various IPs (Game of Thrones, James Bond, X-Files, Matrix, Alien movies, Buffy, Marvel, and in a few months DC comics) and play somewhat similarly, so if you learn one it would be easy to learn another one.
I also picked up a solitaire variant called Hoki just last week and really enjoyed it. You upgrade your cards over multiple games (that are each about five minutes to play), and then once you've completely upgraded all the cards you can play the game daily and then consult a book that will give you a fortune based on the final state of your game.
It took me 53 games to unlock the final state, and I did all of them in just a couple of days, I enjoyed it so much. Now I'm playing a game or two a day to see what the fortune is and then writing a journal to reflect on what that could mean, for fun.
Slowly getting back into my creative hobbies this year (which include board game design and writing), although coding I still feel is hard to do in my off time (even when it's making games, which I've historically really enjoyed doing).
I've messed around with A.I. agent coding a bit, and I'm a bit more impressed with it than I anticipated, but I'm not sure how deep down that rabbit hole I want to go and not code myself. But I really don't feel like I have much energy left in the tank for coding more after doing it for my day job lately.
I am of the same age. I have some good ideas on where to go, but dread the grind to get things moving. When I was in my teens and 20s the grind that got me to where am now was fun, but doing it again looks far less appealing now.
Security engineers will have jobs until software is perfectly secure... and that is going to be a while.
I do not use LLMs at all to do my job, and it is unlikely I ever would. Clients pay me -after- they had all their favorite LLMs take a pass.
And indeed the vibe coders will just create a lot more security issues
Not as long as you think.
https://cybernews.com/security/standord-artemis-system-beats...
Might be never, or only if the software is not used at all.
The only perfectly secure software is no software at all.
Well, at least not connected to the internet?
A lot of work was tedious, painstaking grind, but the reward at the end was considerable.
AI has completely annihilated all of the joy I got out of the process, and everything that attracted me to it with such abandon as an adolescent and a teenager. If someone had told me it was mostly slop curation, I would have stayed in school, stuck to my philosophy major, and who knows -- anything but this. I'm sure I'd have got reasonably far in law, too, despite the unpropitious time to be a JD.
I'm still working on my own small closed source projects, building them the way I want to, like a gameboy emulator - and I've gotten a lot of joy from those.
No matter how 'senior' you are, when you lose touch with the code, you will, slowly, lose the ability to audit what LLMs spit out, while the world moves on. You got the ability to do that by banging your head against code the hard, "pre-AI" way, perhaps for decades, and if you don't do the reps, the muscle will atrophy. People who think this doesn't matter anymore, and you can just forget the code and "embrace exponentials" or whatever, are smoking the good crack; it _is_ about the code, which is exactly why LLMs' ability to write it is the object of such close examination and contestation.
Folks who realise this will show to advantage in the longer run. I don't mean that one shouldn't use LLMs as an accelerant -- that ship has sailed, I think. However, there is a really good case to be made for writing a lot by hand.
One day I might start a consultancy business that only does artisanal code. You can hire me and my future apprentices to replace AI code with handcrafted code. I will use my company to teach the younger generation how to write code without AI tooling.
That's an interesting perspective. I guess it depends on what you want and how low the stakes are. Artisanal coffee, sure. Artisanal clothing, why not? Would you want an artisanal MRI machine? Not sure. I wouldn't really want it "hand crafted"; I just want it to do its job.
These existed before but the culture surrounding AI delivered a double dose of both.
I have no problems with LLMs themselves or even how they are used but it has developed its own religion filled with dogma, faith based reasoning and priests which is utterly toxic.
The tools are shoved down our throats (thanks to the priesthood, AI use is now a job performance criteria) and when they fail we are not met with curiosity and a desire to understand but with hostility and gaslighting.
If this were about grammar, it would be appropriate to translate something you wrote, not use generative AI to create it.
This whole thing is an ad. All the post's sentiments that people are engaging with ("imposter syndrome" etc.) were spit out by a clanker.
What a disheartening start to my morning.
Creating a polished, usable app is just so much work, and so much of it isn't fun at all (to me). There are a few key parts that are fun, but building an intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
I'm bewildered when I read posts by the naysayers, because I'm sitting here building polished apps in a fraction of the time, and they work. At least much better than what I was able to build over a couple of weekends. They provide real value to me. And I'm still having fun building them.
I have now vibe coded three apps, two of them web apps, in Rust, and I couldn't write a "Hello World" in Rust if you held a gun to my head. They look beautiful, are snappy, and it being Rust gives me a lot of confidence in its correctness (feel free to disagree here).
Of course I wouldn't vibe code in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
> in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
That isn't going to cut it. You need to understand the problem domain, have deep design taste to weigh current and future demands, form a conceptually coherent solution, formalize it in code, then feed back from the beginning. There is no prompt giving your AI those capabilities. You end up with mediocre solutions if you settle for understanding every line it spits out. To be fair, many programmers don't have those capabilities either, so it is also a question of quality expectations. I believe you can use LLMs as advanced search and as a generator for boilerplate. People liking it easy are also being easy with quality attributes, so anyone should be self-aware of where they are on that spectrum.
Then don’t do it. No one is forcing you. Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you? Not everything needs to be or should be dumbed down to appeal to lowest common denominator.
Alternatively, go work at a company where you’re part of a team and other people do what you do not enjoy.
> I'm sitting here building polished apps in a fraction of the time
No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
The airplane company won't let you vibe code their systems anyway, and rightly so. The rest of us can just do whatever we like.
No one is also keeping me from doing what I want to spend my time with on my days off.
> Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you?
No, because this isn't remotely comparable to weekend hobby projects. What a weird question.
> No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
I guess we have different definitions of "polished" then.
I agree. But those also don’t need:
> intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
Some of that, sure, but not all of it. Either it’s a weekend hobby project or it’s not, and your description is conflating both. A hobby is something done for fun.
> Of course I wouldn't vibe code in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
So you value your ability to churn out insignificant dreck over the ability of others to use the internet? Because that's the choice you're making. All of the sites that churn your browser for a few seconds because they're trying to block AI DDoS bots, that's worth your convenience on meaningless projects? The increased blast radius of Cloudflare outages, that's a cost worth foisting onto the rest of the internet for your convenience? Thanks.
That's why it was valuable.
All things worth doing are hard.
I always have a hard time taking this complaint seriously, because the solution is absolutely trivial. Write a snippet. Have you really been out there, year after year, rewriting the same shit from scratch over and over? Just make a snippet. Make it good and generic and save it. Whenever you need to do something repeated on a new project, copy it (or auto-expand if you use it that often) and adapt. Snippet managers are a thing.
I loaded the lowest-level piece of software I wrote in the last 15 years: a memory spoofing aimbot PoC exploiting architectural issues in x86 (things like memory breakpoints set on logical memory, not hw addresses, allowing it to read memory without tripping kernel-level detection tools, the ability to trigger PFs on pages where the PoC was hiding to escape detection, low-level gnarly stuff like this). I asked it to clean up the code base and propose why it would not work under the current version of Windows. It did that pretty well.
Lower-level stuff does of course exist, but not a whole lot IMHO. I would not assume Claude will struggle with kernel-level stuff at all. If anything, this is better documented than the over-abstracted mainstream stuff.
The cost of hallucinations though: you potentially have a stronger point there. It wouldn't surprise me if that fails to sway some decision makers, but it does give the average dev a bit more ground to work with.
Like, get real!
I feel like I'm rapidly going insane. It wasn't that long ago when many people in this forum would boldly exclaim that their software development skills were their capital and take pride in their ability to build stuff. It also wasn't that long ago when the "ideas guys" were a meme here.
We're ceding almost all of our bargaining power because programming "was never valuable." And we're doing it with smiles from ear to ear.
It's not so great for the one or two but fantastic for everybody else.
I can now do more with AI tooling so I enjoy that more.
I know lots of you enjoy coding for its own sake, more power to you. But it's no surprise to me that many (most?) view it as a means to an end.
With AI, you are no longer a developer, you're a product manager, analyst, or architect. What's neat about this, from a business perspective, is that you can in effect cut out all your developers and have a far smaller development workforce consisting of only product managers, analysts, and architects whom you call "developers" and pay developer salaries to. So you save money twice: once on dev workforce downsizing, and again on the pay grade demotion.
I'm currently exploring domain-specific languages aimed at writing web applications. I've been particularly interested in, much like bash, data flowing through pipelines. I have spent quite a bit of time and I'm definitely not vibe coding, but I've probably only written 1-2% of the code in these projects.
It is so much work to build out a new language with a surrounding ecosystem of tooling. Not even five years ago this would have necessarily been a full time multi-year endeavor or at least required a team of researchers. Now I can tinker away in my off hours.
This is what I am exploring:
https://williamcotton.com/articles/the-evolution-of-a-dsl
Did I not craft the syntax and semantics of these languages?
What about the phone in your hand, did you design that?
HN loves to believe they are the noble few - men and women of math and science, driven by nothing but the pure joy of their craft
But this whole AI thing has been super revealing. Almost everyone here is just the same old same old, only that now that the change is hitting close to home, you’re clutching your pearls and lamenting the days when devs were devs
The younger generation born into the AI world is going to leave you in the dust because they aren’t scared of it
My math teacher used to say that people felt this way about… calculators, imagine that
If that's the tradeoff a business wants to make, that's their call. But AI-assisted development really is just outsourcing with a bigger carbon footprint.
There are people who would code whether it was their career or not; I'm not one of those people. I fell into software development in order to make money, and if the money stopped then I would stop. I love building and selling products; if I can't do that then I have no interest in programming. I'm interested in machines, CPUs, etc. I'm interested in products, liaising with customers, delivering solutions, improving things for users, etc. You think there is no distinction there? Again, there are people who code for fun, I'm simply not one of them...
maybe "MBA news" would be better suited?
Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term. The reverse is dangerous too, but can be offset to a certain extent with good product management.
This is a solved problem with any large, existing, older code base. The original writers are gone and new people come on all the time. AI has actually helped me get up to speed in new code bases.
Is this also true of all third-party code used by their solution? Should they make all libraries and APIs they use their own, exactly in the form they need to be, according to their deep expertise? If not, why not?
If so, does this extend to the rest of the stack? Interpreters, OSes, drivers? If not, why not?
This isn't a trick question, BTW. It's a genuine attempt to get to the rationale behind your (and the GP's) stance on this.
In particular, the GP said:
> Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term.
That hints I think at their rationale, that their stance is based on placing importance on the parts of software development that they enjoy, rather than any logical basis.
This happens, but very rarely compared to changes in your own code base. If a library breaks, you can usually find an alternative, but even in that case you need to know how to modify your own code.
The difference with generated code is that you are tasked to maintain the generated code.
I don't think this is true, but say we accept it.
> The difference with generated code is that you are tasked to maintain the generated code.
Is this a task that LLMs are incapable of performing?
That's what people tend to report, yes.
I think for a lot of minor things, having AI generate stuff is okay, but it’s rather astounding how verbose and sometimes bizarre the code is. It mostly works, but it can be hard to read. What I’m reading from a lot of people is that they’re enjoying coding again because they don’t have to deal with the stuff they don’t want to do, which...I mean, that’s just it isn’t it? Everyone wants to work on what they enjoy, but that’s not how most things work.
Another problem is that if you just let the AI do a lot of the foundational stuff and only focus on the stuff that you’re interested in, you sometimes just miss giant pieces of important context. I’ve tried reading AI-driven code; sometimes it makes sense, sometimes it’s just unextendable nonsense that superficially works.
This isn’t tech that should replace anything, and it needs to be monitored judiciously. It can have value, but what I suspect is going to happen is that we are going to have a field day with people fixing and dealing with ridiculous security holes for the next decade after this irrational exuberance goes away. It should be used in the same way that any other ML technique should be: judiciously, and in a specific use case.
Said another way, if these models are the future of general programming, where are the apps already? We’re years into this and where are they? We have no actual case studies, just a bunch of marketing copy and personal anecdotes. I went hunting for some business case studies a while ago and I found a Deloitte “case study” which was just pages of “AI may help” without any actual concrete cases. Where are the actual academic studies showing that this works?
People claiming AI makes them code faster remind me of how Apple years ago demonstrated in multiple human interaction studies that the mouse is faster, while test subjects all thought keyboard shortcuts were faster [1]. Sometimes objective data doesn’t matter, but it’s amusing that the whole pitch for agentic AI is that it is faster, while the evidence for this is murky at best.
More likely that step is just skipped and replaced with thoughts and prayers.
This is such marketing speak. The words mean nothing, they’re just a vague amalgamation of feelings. “Vibes”, if you will.
If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
> The happy consumer and the polished product
More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection. Letting an LLM spit out code you just accept is not it.
The word you’re looking for is “shiny”, meaning that it looks good at a glance but may or may not be worth anything.
I get the argument. Sometimes I really enjoyed the actual act of finally figuring out a way to solve a problem in code, but most of the time it was a means to an end, and I'm achieving that end far more often now via AI tooling.
I’m not fussed about the exact term, as long as it points to something real and stands on equal semantic footing with the alternative.
Note how they described two areas of focus (what you “love”): “coding” and “delivering value/solutions”.
You can be a “coder” or a “programmer”, no one is a “deliverer of value/solutions”.
“Coding” is explicit; it’s an activity you can point at. “Delivering value/solutions” is vague; it’s corporate speak meant to sound positive without committing to anything. It doesn’t represent anything specific or tangible. It doesn’t even reference software, even though that’s what it’s about, which makes it sound broader than it is. You could say “using and releasing apps”, for example, though proponents may feel that’s reductive (but then again, so is “coding”).
Again, what’s in contention here isn’t the exact term, but making sure it’s one that actually means something to humans, instead of marketing speak.
I actually think this reveals more about you than you might realise. A _lot_ of people enjoy being able to help people resolve problems with their skills. Delivering value is marketing speak, but it specifically means helping people in ways that are valuable.
A lot of people who work in software are internally motivated by this. The act of producing code may (or may not be) also enjoyable, but the ultimate internal motivation is to hand over something that helps others (and the external motivation is obviously dollars and cents).
There is also a subset of people who enjoy the process of writing code for its own sake, but it's a minority of developers (and dropping all the time as tooling - including LLMs - opens development to more people).
> If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection.
You can say the same thing about libraries, interpreters, OSes, compilers, microcode, assembly. If you're not flipping bits directly in CPU registers, you're not poring over every little detail to ensure perfection. The only difference between you and the vibe coder who's never written a single LoC is the level of abstraction you're working at.
Edit:
> If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
I think this maybe says a lot about you as well, as many people also donate their time and efforts to others. It may be worth some self-reflection to see whether your cynicism has become nihilism.
I did use to volunteer at a food bank, but I used that example only because it’s quick and simple, no shade on anyone who doesn’t. I stopped for logistical reasons when COVID hit.
I have used the set of skills I’m good at to help several people with their goals (most were friends, some were acquaintances) who later told me I changed their lives for the better. A few I no longer speak to, and that’s OK.
Oh, and before I became a developer, I worked in an area which was very close to marketing. Which was the reason I stopped.
So yeah, I know pretty well what I’m talking about. Helping others is an explicit goal of mine that I derive satisfaction from. I’d never describe it as “delivering value/solutions” and neither would any of the people I ever helped, because that’s vague corporate soulless speech.
How do you feel about the fact that OpenAI et al. have slurped up all your code and are now regurgitating it for $20/month?
I also don’t think “but it wouldn’t be viable otherwise” is a valid defence.
I don’t see what that has to do with the conversation, though. If your point is about the free/$20, that doesn’t really factor into my answer.
While I commend your voluntary efforts, I don't think it lends any more weight to your original comment. In fact, I think this comment highlights a deep cynicism and I think a profound misunderstanding of the internal motivations of others and why "delivering value" resonates with others, but rings hollow to you.
In the end, this debate is less about LLMs, and more about how different developers identify. If you consider software to be a craft, then mastery of the skillset, discipline, and authorship of the code is key to you.
If you consider software to be a means to an end, then the importance lies in the impact the software has on others, irrespective of how it's produced.
While you are clearly in the former camp, it is undeniable that impact is determined entirely by what the software enables for others, not by how it was produced. Most users never see the code, never care how it was written, and judge it only by whether it solves their problem.
A street sweeper “delivers value” in the form of a clean street. A lunch lady at a school “delivers solutions” in the form of reducing hunger in children.
There’s nothing wrong with wanting to do something for others, the criticism is of the vague terminology. The marketing speak. I’ve said that so many times, I’d hope that’d been clear.
> While you are clearly in the former camp
You’re starting from wrong assumptions. No, I’m not “in the former camp”, I find the whole premise to be a false dichotomy to begin with. Reality is a spectrum, not a binary choice. It’s perfectly congruent to believe a great product for customers is the goal, and that the way to achieve it is through care and deliberate attention to the things you do.
This isn’t a critique of language - it’s a category error. You’re confusing the mechanism with the purpose.
In your examples, a street sweeper or lunch lady (Google says this is an antiquated US term for canteen worker?) do indeed deliver value: clean streets and nourished students. That's the value they're paid to provide. Those are the outcomes we care about, and whether the sweeper uses a broom or a Bucher CityCat is only of interest in that one allows the sweeper to provide more value at lower cost, e.g. more metres of clean road per dollar.
The same is true of the canteen worker, who may use Rational ovens and bains-marie to serve more hot meals at lower cost than cooking each meal individually.
> You don’t “deliver solutions”, you write software (or have it written for you).
Saying you "write software", not "deliver solutions", actually indicates that you don't understand the profession you're in. It mistakes the process for the outcome. Writing code is one means among many for achieving an outcome, and if the same outcome could be achieved by the business without software, the software would be dropped instantly. Not because care doesn’t matter, but because the purpose was never the code itself.
> It’s perfectly congruent to believe a great product for customers is the goal, and that the way to achieve it is through care and deliberate attention to the things you do.
But according to you, care and deliberate attention (software as craft) are the only way. An absolutist position. But most software that matters is imperfect, built over time, touched by many hands, and full of compromises. Yet it still delivers enormous value. That’s evidence that outcomes, not purity of process, are what deliver value and define success in the real world.
Early in my career I was called in a number of times to write software for some business process. Many times after talking to the users and understanding the process, I would recommend against any software. It wasn't needed, or the time could be spent in better ways (AI is likely changing that calculation though). IIRC, my title was even 'Solutions Provider' or some such. I love writing software, but it's always been a means to an end for me.
No! That is not what I’m saying! How can you argue my position is an absolute when I just explicitly described it as a spectrum?!
However, I do believe you’re arguing in good faith, I just don’t think we’re on the same page. I wish we were, as while I think we might still disagree, I also believe we’d have an interesting conversation. Probably more so in person.
Unfortunately, I have to go get some work done so I’m unable to continue as of now. Still, instead of leaving you hanging, I wanted to thank you for the respectful conversation as well as your patience and I believe genuine effort in trying to understand my position.
I'm extremely diligent about vetting all code in my repos. Everything is thoroughly tested and follows the same standards that were in my codebase before the invention of LLMs. I'm not "vibe coding". You're making assumptions because of your negative emotional reaction to LLMs.
Do you see why that’s marketing speak? You’re using vague terms which can be applied to anything. It avoids commitment and makes whatever you do seem grandiose. That’s marketing.
A few years ago, every app developer and designer was a “story teller”.
You don’t “deliver solutions”, you write software (or have it written for you).
Yes, it's exactly the same. Is your problem the fact that this gets you off the high horse?
> More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”.
This doesn’t make any sense. Polished to whom? The end user? You can absolutely use AI to polish the user experience. Whether coding by hand or with AI, the most important aspect of polish is having someone who cares.
Can't the customer now just skip you and generate a product for himself via AI?
edit: to stay on the larger topic, I haven't been swayed much one way or the other. ~90% of the code I need existed a decade ago in the form of reusable modules. Anything new is closer to pseudo-code; an amplifier or sandbox isn't something I'm that interested in.
I think AI-augmented development will lead to faster and vastly improved software over the years. This isn't a space that's being disrupted only on the maker/creator side of developing software. And from a maker's/creator's point of view, you wouldn't even need to keep up with the latest trends in things like performance; AI should just know which libraries are the best to use to develop your solutions.
I like using my software engineering skills to solve people's problems. I don't do coding for its own sake - there's always a thing I'm trying to implement for someone.
Perhaps if we didn’t have deep layer cakes of frameworks and libraries, people would feel like they could code with or without AI. It feels like AI is going to hinder any efforts to address complexity and justify us living with unnecessary complexity, simply because a machine can write the complex, hard-to-understand, brittle code for us.
No, it didn't; in fact, your job shifted from code writer to code fixer.
As others have pointed out, I'm looking at a career shift now. I'm essentially burning out on doing the whole LLM-assisted coding stuff while I still can, earning money on contracts, and then going to step away from the field. I'm lucky that I'm in a position to do so, but I really don't know what the rest of my career looks like.
I miss that level of mastery. I feel that in the LLM-assisted coding age, that's now gone. You can read every section of code that an LLM generates, but there's no comparison to writing it by hand to me in terms of really internalizing and mastering a codebase.
On the positive side, I have some old personal projects I couldn't complete because it was too much work for me alone. I think LLMs will help for menial tasks, while I can still work on improving the design and adding features.
Everyone else is using LLMs to assist their development, which makes it a lot harder to work without them, especially in just building enterprise apps. It doesn't feel like I'm creating something anymore; rather, it feels like a fuzzy amalgamation of all the developers in the training data is doing the creating. Working with LLMs sometimes feels like information overload: seeing so much code scroll past as the agent makes its changes, and having to read that massive volume of code, is exhausting. I don't like that the new "power tools" of software engineering mean that my career, our career, is now monetizable. I liked feeling like a craftsman, and that is lost.
I'm lucky to live in the Research Triangle area of the United States, so I've got really good options for schooling around me. My sister graduated with an aerospace engineering degree, and I've always been interested in space. Thinking about hardware as a possible path as well.
But in a complete twist, I've also always wanted to be an educator. A high school math or computer science teacher would fit me well. I remember a lot of my male teachers very fondly in terms of the impact they had on my life, and I'd love to give that back.
I've been considering becoming an electrician but it is also quite a career shift.
You improve over time. I've been programming for 6 years and I still feel like I'm nowhere near others. That's a completely fine and valid thing to feel.
That's creating a new inefficient, socially destructive, environmentally damaging hammer because solving the real problem doesn't sell well.
I'll be happy when we solve THAT problem.
Can definitely understand the reluctance people feel around it. Especially when they’ve invested years into it and have their livelihood on the line
I’m also quite reluctant to publish any of it. Doesn’t feel right to push code I don’t fully understand so mostly personal projects for now
Nit: it seems like the graph for the compound interest calculator should start at year 0 rather than year 1.
Also, it might be nice to have a way to change the starting year to the actual year you want to start (such as the current year).
I have always found management to be just a silly exercise in a day full of meetings. I like to make things. I could retrain, but the salary drop would be very hard. I hope to find one last gig and have enough to retire. I still get that spark of joy when all the tests pass.
Many who are considering a career shift away from software due to 'AI disgust' devoted their lives to developing software because they loved the craft. But with AI churning out cheap, ugly, but passable code, it's clear that businesses never appreciated the craft. I hope these folks find an area outside of SWE that they love just as much.
But once these folks find this area, it would be naive to think they won't use software to scratch their itch. In the same way that people who didn't pursue a career in SWE (because they felt under-qualified) are using AI to solve their problems, these folks will now find their own problems to solve with software, even if at first that is not their intention. They probably won't use AI to write the code, but ultimately, AI is forcing everyone to become a product manager.
But what if the business is soulless? As in, what if the business you're working on is just milking value out of people through negative patterns, which... is... well, a lot of tech businesses these days. Maybe the busywork let engineers stay distracted from the actual impact of their work, which is what makes people demotivated.
I'm also now dealing with things that previously would have taken me too long to deal with. For example, I'm actually making a dent in the amount of technical debt I have to deal with. The type of things where previously I maybe wouldn't have taken a week out of my schedule to deal with something that was annoying me. A lot of tedious things that would take me hours/days now can get done in a few prompts. With my bigger projects, I still do most stuff manually. But that's probably going to change over the next months/year.
I'm mainly using Codex. I know a lot of people seem to prefer Claude Code. But I've been a happy ChatGPT Plus user for a while, and Codex is included with that and seems to do the job. Amazing value for $20/month. I've had to buy extra credits once so far.
The flip side of all this is that waiting for AI to do its thing isn't fun. It's slow enough that it slows me down and fast enough that I can't really multitask. It's like dealing with a very slow build that you have to run over and over again. A necessary evil. But not necessarily fun. I can see why a lot of developers feel like the joy is being sucked out of their lives.
Dealing with this pain is urgent. Part of that is investing in robust and fast builds. Build time competes with model inference in the time stuff takes. And another part is working on the UX of this. Being able to fork multiple tasks at once is hugely empowering. And switching between editing code and generating code needs to get more seamless. It feels too much like I'm sitting on my hands sometimes.
AI is eroding the entry barrier, the cognitive overload, and the hyper-specialization of software development. Once you step away from a black-and-white perspective, what remains is: tools, tools, tools. Feels great to me.
Cool project!
Otherwise it feels deceptive. Which is surprising given we should judge off intentions and not augmentation (like come on guys this is HN FFS).
This guy's not running any ads on the site, hasn't spammed with multiple posts that I've seen. I still think investment funds/modern stock exchanges are needless parasites upon society but that's just my opinion.
I'll figure out a better way. Thanks for calling it out.
Things are definitely changing around HN compared to when it first started.
It's impossible to tell if this is AI or not. Another version of Poe's law. The only thing to do is assume everything is AI, just like you must assume all posts have ulterior (generally profit-driven) motives, all posters have a conflict of interest, etc.
Maybe the only thing to do is stop trying to understand posters' motivations, stop reading things charitably, stop responding, just look for things that are interesting (and be sure to check sources).
Every spammer and scammer, even a bot, is ultimately controlled by a real person in some sense. That doesn't mean we want their content here.
Anyone who disagrees with the above is just hurt that their manual hyping has been replaced with machines.
OP made a site with a bunch of calculators. Their critics didn’t make that!
It's cool that ChatGPT can stitch these toys together for people who aren't programmers, but 99% of software engineers aren't working on toys in the first place, so we're hardly threatened by this. I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
"Software engineering" doesn't matter to anyone except to software engineers. What matters is executing that idea that's been gathering dust for ages, or scratching that pain point that keeps popping up in a daily basis.
My response is perhaps a bit raw, but so is the quote above.
Stop with the gatekeeping. I studied CS to understand coding, not to take some sort of pride in building "real software". Knowledge is a tool, nothing more, nothing less.
There are plenty of developers whose whole job is to edit one button per week and not much more. And yes, there are also plenty of developers who actually apply their CS skills.
> but 99% of software engineers aren't working on toys in the first place
Go outside of your bubble. It's way more nuanced than that.
> I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
Moving goal posts. Always has been.
It's not that I fully disagree with you either. And I'm excited about your accomplishments. But just the way it reads... man...
I guess it hits me because I used to be disheartened by comments like this. It just feels so snarky as if I am never good enough.
The vibe is just "BUH BUH BUH and that's it." That's how it comes across.
And I've matured enough to realize I shouldn't feel disheartened. I've taken enough classes at VUSEC, with all their Rowhammer variations and x86-64 assignments, to have had a taste of what deep tech can be. And the thing is, it's just another skill. It doesn't matter if someone works on a web app or a deep game programming problem.
What matters (to me at least) is that you feel the flow of it and that you're going somewhere, touching an audience. Maybe his particular calculator app has a better UX for some people. If that's the case, then his app is a win. If your game touches people, then that's a win. If you feel alive because you're doing complex stuff, then that's a win (in the style of "A Mathematician's Apology"). If you're doing complex stuff and you feel it's rough and you're reaching no one with it, it's neutral at best in my book (positive: you're building a skill, negative: no one is touched, not even you).
Who cares what the underlying technology is. What's important is usability.
Feel free to point out where I moved goal posts. To say that I moved goal posts would imply that at one point I stated that creating a trivial website was software engineering. If you're comparing my statement to what some other person said, who made arguments I did not make, then we cannot have any kind of constructive dialogue. At that point you are not talking to me, but talking to an imaginary projection of me meant to make yourself feel better about your argument.
> Stop with the gate keeping.
I'm not gatekeeping anything. You can disagree with my descriptive terms if you want, but the core point I'm trying to get across is: what people are doing with Claude can not replace what I do. I would know, I've tried extensively. Development is a lot of hard work and I would love it if my job were easier! I use LLMs almost every day, mostly for trivial tasks like reformatting text or writing advanced regex because I can't be bothered to remember the syntax and it's faster than looking it up. I also routinely pose SOTA models problems I'm working on to have them try to solve them, and I am routinely disappointed by how bad the output is.
So, in a thread where people were asserting that critics are merely critics because they're afraid of being replaced I pointed out that this is not factually correct, that no, we're not actually afraid of being replaced, because those of us who do "real" engineering (feel free to suggest a different term to substitute for "real" if the terminology is what bothers you) know that we cannot be replaced. People without experience start thinking they can replace us, that the exhilarating taste of coding they got from an LLM is the full extent to the depth of the software engineering world, but in fact it is not even close.
I do think that LLMs fill a useful gap, for projects where the time investment would be too large to learn to code and too unimportant to justify paying anyone to program, but which are simple enough that a non-engineer can have an LLM build something neat for themselves. There is nothing wrong with toys. Toys are a great thing to have in the world, and it's nice that more people can make them[1]. But there is a difference between a toy and what I do, and LLMs cannot do the thing I do. If you're taking "toy" in a derogatory manner, feel free to come up with another term.
[1] To some extent. While accessibility is generally a great thing, I have some misgivings. Software is dangerous. The web is arguably already too accessible, with frameworks enabling people who have no idea what they're doing to make professional-looking websites. These badly-made websites then go on to have massive security breaches that affect millions of users. I wish there was a way to make basic website development accessible, whether through frameworks or LLMs, in a way that did not give people using them the misplaced self-confidence to take on things way above their skill level at the cost of other people's security.
What’s even the point of writing out that first paragraph otherwise?
I was correcting your misguided statement:
> Their critics didn’t make that!
by pointing out that we, among other things, build the libraries that you/Claude are copy-and-pasting from. When you make an assertion that is factually incorrect, and someone corrects you, that does not mean they are threatened.
If you did, did you put yourself in a clean room and forget about every existing library you’ve ever seen?
Have you made sure your code doesn’t repeat anything you’ve seen in a CS101 textbook? Is your hello world completely unique and non-identical to the one in the book?
When you write a song do you avoid using any chord progression that has been used by someone else?
LLMs are just doing a dumbed down version of human information processing. You can use one to make an app and tell it not to use any libraries. In fact, I’d argue that using an LLM negates the need for many libraries that mostly serve to save humans from repetitive hand-writing.
You can even tell AI to build a new library which essentially defeats your entire argument here. Are you trying to imply that LLMs can’t work at an assembly language level? I’m pretty sure they can because they’ve read every CS textbook you have and then some.
Will it be quality work? The answer to that question changes every day.
But the fact remains that you are indeed acting threatened. You’re not “correcting” me at all, because I didn’t claim that AI-assisted developers are doing anything in some kind of “pure” way.
My claim is that they’re seeing something they want to exist and they’re making it exist and putting it out there, while the vast majority of haters aren’t exactly out there contributing to much of anything in terms of “real software engineering.”
Imitation is a form of flattery. When something “copies” you and makes it better/cheaper/more customized, that’s a net gain. If AI is just a fancy copy machine, that functionality alone is a net benefit.
I'll keep learning and try to make this less of a toy over time. And hopefully I can bring what I've learned from years in investing into my next product to actually help people. Thanks for the perspective.
And if you are thinking enterprise, it would take 2-3 developers, 2 analysts, 2 testers, 1 lead, and 1 manager 2-3 months to push something like this. (Otherwise, why would leading banks spend billions and billions on IT development every year? What tangible difference do you see in their websites/services?)
5000 calculators may look excessive, but in this case it magnifies the AI capabilities in the future - both in terms of quality and quantity.
Well, I don't think all those people are spending their time making simple calculators.
What can I say... If you used a calculator to get an answer for sqrt(2), are you back to doing mathematics? It's simpler and more fun than using Newton's method. But it's debatable whether you are actually working on mathematics problems.
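For what it's worth, the contrast in the analogy is concrete: pressing the √ key versus iterating yourself. Here is a minimal TypeScript sketch of Newton's method for sqrt(2), purely for illustration and not code from anyone's project in this thread:

    // Newton's method for sqrt(a): repeatedly average x with a/x.
    // Each iteration roughly doubles the number of correct digits.
    function newtonSqrt(a: number, guess = 1, tolerance = 1e-12): number {
      let x = guess;
      while (Math.abs(x * x - a) > tolerance) {
        x = (x + a / x) / 2;
      }
      return x;
    }

    console.log(newtonSqrt(2));  // ~1.4142135623730951
    console.log(Math.sqrt(2));   // the "calculator" answer, same result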
For the same reason things like Image Playground/etc seem magical/appealing to non-artists (myself included): we don't know how to do it ourselves, so it feels empowering.
Or closer to home: it's the same reason that developers are so in love with clicking some buttons in the <insert cloud mega provider> dashboard in spite of the costs, lock-in, more costs, yet more costs, and of course the extra costs.
As with those choosing "cloud" services they don't need, here too there will no doubt be a lucrative market to fix the shit once people realise that there's a reason experts charge the way they do.
$100 seems like a lot. I guess if you think about it compared to dev salaries, it's nothing. But for $10 per month copilot you can get some pretty great results too.
Have you tried this? https://www.investor.gov/financial-tools-calculators/calcula...
Every other day I see ads from companies saying "use our AI and become a millionaire". This kind of marketing from agentic IDEs implies there's no need for developers who know their craft, which, as said above, isn't the case.
If that’s the bar, there are likely a ton of businesses that should shut down…
This by definition filters out all non-devs, and even many junior devs, as you need to understand deeply whether those tests are correct and cover all the important edge cases, etc.
Plus, when you deploy it, you need to know it was properly deployed and that your DB creds are not on the frontend.
But mostly no one cares, as there are no consequences for leaking your users' personal data or whatnot.
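A hypothetical sketch of the failure mode being described (the names and endpoints are made up, not taken from any project in this thread): anything bundled into client-side JavaScript is readable by every visitor, so credentials have to stay server-side.

    // Anti-pattern: a secret baked into the client bundle is public the
    // moment the page is served; anyone can read it in the dev tools.
    const DB_PASSWORD = "s3cr3t"; // hypothetical leaked credential
    fetch(`https://db.example.com/query?pw=${DB_PASSWORD}`);

    // Safer shape: the browser only calls your own backend endpoint,
    // and the backend reads the credential from its own environment.
    fetch("/api/stats"); // no secret ever reaches the client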
If you just want to build a little web app, or a couple of screens for your phone, you'll probably be fine. (Unless there's money or personal data involved.) It's empowering! Have fun.
But if you're trying to build something that has a whole bunch of moving parts and which isn't allowed to be a trash fire? Someone needs to be paying attention.
Edit: I appreciate the quick turnaround. Apologies.
Did fucking AI also write your article?
But these are the kinds of things that pretty much general purpose AI can just oneshot in a single prompt now.
For example, the other day I wanted to know how much caffeine I was taking in based on my coffee intake. So I asked Claude to just build me an app that would show the current caffeine "load" in my system, increase it when I pushed a button with the volume of the coffee, and even model real-time decay of the amount of caffeine in my system. One shot.
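The "real-time decay" part is just first-order elimination. A hedged sketch of the kind of math such a one-shot app would need, assuming a roughly five-hour caffeine half-life (the commenter's actual app and numbers aren't shown):

    // Hypothetical sketch: current caffeine "load" as the sum of each dose
    // decayed exponentially, assuming a ~5 hour half-life.
    const HALF_LIFE_HOURS = 5;

    interface Dose {
      mg: number;      // caffeine in the drink
      takenAt: number; // epoch milliseconds
    }

    function currentLoadMg(doses: Dose[], now = Date.now()): number {
      return doses.reduce((total, dose) => {
        const hoursElapsed = (now - dose.takenAt) / 3_600_000;
        return total + dose.mg * Math.pow(0.5, hoursElapsed / HALF_LIFE_HOURS);
      }, 0);
    }

    // e.g. a 95 mg coffee three hours ago plus a 63 mg espresso just now
    console.log(currentLoadMg([
      { mg: 95, takenAt: Date.now() - 3 * 3_600_000 },
      { mg: 63, takenAt: Date.now() },
    ]));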
Anyone can just get these kinds of things made for themselves on-demand. We don't need nice apps anymore, because now software is completely disposable and customized per person. So what is the point of even building these kinds of "fun" tools anymore? Feels like we are essentially doomed to only churn out AI orchestration platforms and fast fashion throwaway B2B SaaS apps for our corporate overlords now. Lifestyle/small business software companies are basically going to go extinct long term. Just give Sam Altman money and GPT will make whatever you want, and who cares if it's actually good or not because you'll just throw it away when you're done. Fast Fashion Software.
AI has taken everything I liked about developing software out of the equation and handed it over to a bot. Now I'm just doing the things that I find mostly annoying (code review, reviewing specs, triaging bugs) and not the things I actually enjoy - writing code and solving problems.
I guess this is what separates some people. But I always explicitly tell it to use only HTML/JS/CSS without any libraries, and I vet it all myself. Generating code means you don't have to deal with nearly as much of it by hand anymore.
Cool to hear nonetheless. Can we now also stop stigmatizing AI generated music and art? Looking at you Steam disclosures.
This is a revolution, welcome back to coding :)