The beginning of programming as we'll know it?
57 points
by zdw
1 day ago
| 12 comments
| bitsplitting.org
| HN
adamredwoods
2 hours ago
[-]
>> The computers will come for all of our jobs eventually, but those of us who refuse or decline to embrace the most powerful creative tools we’ve ever been given will be the first to fall.

It's being mandated by almost all companies. We're forced to use it, whether it produces good results or not. I can change one line of code faster than Claude Code can, as long as I understand the code. Someday, I'll lose understanding of the code, because I didn't write it. What am I embracing?

reply
uduni
1 hour ago
[-]
It's a skill set, just like coding. You can embrace an elevated workflow where you forget about specific syntax and focus on architecture and integration. It takes time to intuit exactly what the models are bad at, so you can foresee hallucinations and prevent them from happening in the first place. Yes, you can write 1 line faster than Claude, but what about 10 lines? 100? 1000?
reply
photios
1 hour ago
[-]
> Someday, I'll lose understanding of the code, because I didn't write it.

I've been wading through vast corporate codebases I never wrote and yet had to understand for the past 20 years. This isn't any different, and AI tools help with that understanding. A lot!

The tools and techniques are out there.

reply
theonething
1 hour ago
[-]
> I'll lose understanding of the code, because I didn't write it.

What about code that other (human) engineers write? Do you not understand that code too because you didn't write it?

reply
galaxyLogic
1 hour ago
[-]
Interesting observation. There is a difference, however. Pre-AI, each human programmer generally understood the code they wrote, so there were many humans who each understood some part of the code. Post-AI, presumably, there will be no humans who understand any of the code. Sure, we will understand its syntax, but the overall architecture of applications may be so complicated that no human can understand it in practice.
reply
martin-t
1 hour ago
[-]
Have you read Coding machines[0]?

BTW, that guy received an Oscar for coding. Oh, how far we have fallen since those days, and how far we have yet to go...

[0]: https://www.teamten.com/lawrence/writings/coding-machines/

reply
martin-t
1 hour ago
[-]
You're not embracing it, you're forced to accept it because the nature of employer-employee relationships has a fundamental power differential which makes it exploitative.

You helped build the company, you should own a proportional part of it.

The issue with the current system is that only the people who provide money, not the people who provide work, get to own the result. Work (and natural resources) is where value comes from. Their money came from work as well, but not only their work; they were in a position of power which allowed them to take a larger cut than they deserved. Ownership should, by law, be distributed according to the amount and skill level of work.

Then people wouldn't worry about losing their jobs to automation - because they'd keep receiving dividends from the value of their previous work.

If their previous work allowed the company to buy a robot to replace them, great, they now get the revenue from the robot's work while being free to pursue other things in their now free time.

If their previous work allowed an LLM to be trained or rented to replace them, great, they get the revenue from the LLM's work...

reply
auggierose
22 minutes ago
[-]
What about the people who didn't work for the one company that became the only company?

You still don't make sense, mate.

Yes, a better system would be great. Half-baked ideas only stand in its way.

reply
hyperhello
5 hours ago
[-]
My feeling is that AI is not real coding; it is coding-adjacent. Project Management, Sales, Marketing, Writing Books About KanBan, AI Programming, User Interface Design, Installing Routers are coding-adjacent. AI is not real coding any more than The Sims is homemaking. You can use AI and hang with the tech guys and get your check but you are going to be treading water and trying to be liked personally to stay where you are. No question it's a job, but no, it's not coding.
reply
mikkupikku
5 hours ago
[-]
My thinking is that high level languages like C aren't real coding. If you don't even know what ISA the software will be run on, then you need to get the fuck off my lawn!

Attitude as old as time itself.

reply
Chinjut
2 hours ago
[-]
Yes, "An LLM is just a new higher level programming language", sure. A new programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.

(Yes, one can write C programs with undefined behavior, but C does also have many well-defined properties which allow people to reason about well-defined C programs logically reliably.)

reply
bitwize
2 hours ago
[-]
(And the behavior of any given C implementation is completely defined.)
reply
hyperhello
5 hours ago
[-]
You mock, but not very persuasively. You seem to be relying on a silly idea you don't even believe in: that someone, once, made fun of C programming.
reply
napsec
4 hours ago
[-]
I can't speak personally to what it was like to be a C developer in the early days of that language, but when I started out as a Ruby on Rails developer over a decade ago I was definitely told by some people that it didn't count as 'real programming' because of how much was abstracted away by the framework.
reply
SpaceNoodled
4 hours ago
[-]
I have some bad news for you
reply
operatingthetan
4 hours ago
[-]
>AI is not real coding any more than The Sims is homemaking.

Your analogy is bad. The programmer and the AI both produce working code. The other poster's response was correct.

reply
bigstrat2003
1 hour ago
[-]
The AI does not, in fact, produce working code.
reply
mikkupikku
4 hours ago
[-]
Mocking? I'm quoting exactly the sort of thing that used to be said in earnest in the 80s and 90s. What you're doing now is exactly the same thing; there's no difference at all. It's the same reaction, borne of the same old-man instinct to bitch about the kids going soft. Yawn.
reply
skydhash
4 hours ago
[-]
Both Algol and Lisp were around by the 60s. I think programmers and computer scientists were already acquainted enough with high-level programming languages not to equate using C with going soft.

Also, software was always about domain knowledge and formal reasoning. Coding is just notation. Someone may like pen and paper, and someone else may like a typewriter, but ultimately it's the writing that matters. The correctness of a program does not depend on the language (and the CPU only manipulates electrical signals).

I argue against AI because most of its users don’t care about the correctness of their code. They just want to produce lots of it (hence the resurgence of the flawed LoC metric as a badge of honor).

reply
signatoremo
3 hours ago
[-]
> I argue against AI because most of its users don’t care about the correctness of their code.

This is remarkably sloppy for someone who codes. No facts, just opinion, claimed with confidence.

reply
skydhash
3 hours ago
[-]
> No facts, just opinion, claimed with confidence

Strong opinions, loosely held.

What I’ve seen seems to confirm that opinion, so I’m still holding on to it.

reply
dtagames
2 hours ago
[-]
Right? Some of us used to read hex digits off printed paper dumps to debug mainframe memory (like me), but we can be excited about AI and embrace it, too.

From my perspective, knowing how it gets down to machine code makes it more useful and easier to control, but that doesn't mean I want to stop writing English now that we can.

reply
qsera
3 hours ago
[-]
I think it should be called dice-coding, not vibe-coding. You roll the LLM dice, and sometimes they come up with the right-looking program on top.

Since the dice are heavily loaded, this happens quite often, which makes people think the dice can program.

reply
an0malous
3 hours ago
[-]
> If you interpret these examples to mean that any person can write down any list of requirements along with any user interface specs, and the AI will consistently produce a satisfactory product, then I’d agree programmers are toast.

I think the road to this is pretty clear now. It's all about building the harness, such that the AI can write something and get feedback from type checks, automated tests, runtime errors, logs, and other observability tools. The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.
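The feedback loop such a harness runs can be sketched roughly like this (a minimal sketch; all names here are hypothetical, not from any specific tool):

```python
def harness_loop(generate_patch, apply_patch, run_checks, max_iters=5):
    """Feed check failures (test output, type errors, logs) back to the
    model until the checks pass or we give up."""
    feedback = ""
    for attempt in range(1, max_iters + 1):
        patch = generate_patch(feedback)  # LLM call, stubbed for illustration
        apply_patch(patch)                # write the change into the repo
        ok, feedback = run_checks()       # e.g. tests + type checker, as text
        if ok:
            return attempt                # how many iterations it took
    return None                           # model never converged
```

In practice `run_checks` would shell out to the project's test runner and type checker and return their combined output as context for the next model call; the whole value of the harness is in how much signal that feedback carries.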

reply
kami23
2 hours ago
[-]
I am also of the opinion that a lot of this can be solved in time with a harness. And I wholeheartedly agree that there is a class of webapp that has been trivialized, one that can take a mom-and-pop shop up to 'enterprise' (80% of our architecture seems to center around the same pattern at my $DAYJOB) and run just fine if they accept some of the vibes.

This type of work seems to be happening in the orchestrator projects that pop up here every once in a while, and I've just been waiting for one with a pipeline 'language' malleable enough to work for me that I can then make generic enough for a big class of solutions. I feel doomed to make my own, but I feel like I should do my due diligence first.

reply
skydhash
1 hour ago
[-]
> The majority of software is fairly standardized UI forms running CRUD operations against some backend or data store and interacting with APIs.

Have you ever looked at Debian’s package list?

Most CRUD apps are just a step above forms and reports. But there’s a lot of specific processing that requires careful design. The whole system may be tricky to get right as well. But CRUD UI coding was never an issue.

DDD and various other architecture books and talks are not about CRUD.

reply
phyzix5761
3 hours ago
[-]
> any person can write down any list of requirements along with any user interface specs

Isn't this just a new programming language? A higher level language which will require new experts to know how to get the best results out of it? I've seen non-technical people struggle with AI generated code because they don't understand all the little nuances that go into building a simple web app.

reply
Chinjut
2 hours ago
[-]
A programming language with no tractable guarantees about the behavior of any particular program, including in practice no guarantees that the same source code (in this new programming language) will reliably produce the same behavior. This is very different from traditional programming languages and how we can reason about them.
reply
ma2kx
4 hours ago
[-]
In chess, engines have long been stronger than humans, but for a long time a (super) grandmaster with an engine was still better than an engine alone.
reply
NooneAtAll3
2 hours ago
[-]
> a (super) grandmaster with an engine was still better than an engine alone.

any examples? I'd love to watch how that went

reply
theteapot
3 hours ago
[-]
Roughly 20 years, from Deep Blue to AlphaZero. I don't think that is comparable, though. The use of deep neural networks is what made the machines dominant again, starting with AlphaZero. I.e., we're already in the new paradigm.
reply
martin-t
7 minutes ago
[-]
> Speaking of goodness, I share the majority opinion that AI is generally good

I would like to know how the author concluded that this is the majority opinion.

reply
markus_zhang
2 hours ago
[-]
For anyone who relies on their knowledge of the business, of taking requirements: know that eventually your customers will be at least as good as you at this skill.

After all, they are the ones asking the questions. And they are not dumbasses who don’t learn. They are also motivated to learn to adapt to AI.

IMO the best value of humans right now is to provide skills, as fuel for the future. Once we burn up, the new age shall come.

reply
galaxyLogic
1 hour ago
[-]
I think customers will say: "I don't want to try to come up with (correct) requirements. I'd rather hire this SW firm that specializes in that skill."

What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially.

In the early days of computers, "System Analyst" was a very cool job-description. I think it will make a comeback with AI.

reply
skydhash
1 hour ago
[-]
> What is often overlooked is that we are not trying to just produce programs, we are trying to produce "systems", which means systems where computers and humans interact beneficially

People do overlook that. Software is written to solve a problem. And it interacts with other software and data from the real world. Everything is in flux. Which is why you need both technical expertise (how things work) and domain knowledge (the purpose of things) to react properly to changes.

Creating such systems is costly (even in the AI edge), which is why businesses delegate parts that are not their core business to someone else.

reply
satisfice
4 hours ago
[-]
"There is a confirmation bias at work here: every developer who has experienced such a remarkable outcome is delighted to share it. It helps to contribute to a mass (human) hallucination that computers really are capable of anything, and really are taking over the world."

This is survivorship bias, a form of sample bias.

Confirmation bias is a form of motivated reasoning where you search for evidence that confirms your existing beliefs.

reply
11217mackem
1 hour ago
[-]
logic, rigor, judgement and taste
reply
fraywing
4 hours ago
[-]
I'm observing a kind of status quo bias being surfaced nearly uniformly across the programming community right now.

I myself have feelings like this, as a software engineer by trade.

"We will forever be useful!" as a rallying cry against radical transformation. I hope that's the case, but some of these pieces just seem like copium.

reply
georgemcbay
1 hour ago
[-]
> "We will forever be useful!" as a rallying cry against radical transformation. I hope that's the case, but some of these pieces just seem like copium.

Yeah, no shade to the author, it was a well-written piece, but these arguments increasingly seem like a form of self-soothing to me. On current trajectories I don't see how AI won't swallow up the majority of the software development field in ways that completely devalue not just software engineers, but everyone up the stack in software-focused companies, eventually removing the need for most of the companies to even exist.

(Why have a software company employing someone who is an expert at gathering customer requirements to feed to the LLM when the customer can just solve their own problem with an LLM? They'll build the wrong thing the first time... and the fifth time... and maybe the tenth time... but it'll still be way faster for them to iterate their own solution than to interface with another human).

It'll take longer for senior engineers to suffer than juniors, and longer for system architects to suffer than senior engineers, and so on up the chain but I'm just not seeing this magical demarcation line that a lot of people seem to think is going to materialize prior to the rising tide of AI starting to go above their own neck.

And this is all likely to happen very quickly; it was less than a year ago that LLMs were awful at producing code.

reply
trimethylpurine
2 hours ago
[-]
Good read.
reply
julianlam
5 hours ago
[-]
> Just a few years ago, AI essentially could not program at all. In the future, a given AI instance may “program better” than any single human in history. But for now, real programmers will always win.

For how long? Do I get to feel smug about this for 10 days, 10 weeks, or 10 years? That radically changes the planned trajectory of my life.

reply
operatingthetan
4 hours ago
[-]
These posts are just programmers trying to understand their new place in the hierarchy. I'm in the same place and I get it, but truisms like 'will always win' are basically just throwing a wild guess at what the future will look like. A better attitude is to attempt to catch the wave.
reply
TacticalCoder
4 hours ago
[-]
TFA's author is literally saying it may happen. He's using AI, so he already caught the wave. He's augmenting himself with AI tools. He's not saying "AI will never surpass humans at writing programs". He writes:

" At this particular moment, human developers are especially valuable, because of the transitional period we’re living through."

You and GP are both attacking a strawman; it's not clear why.

We're seeing countless examples of AI slop, enshittification, and lower uptime for services, day after day.

To anyone using these tools seriously on a daily basis, it's totally obvious there are, TODAY, shortcomings.

TFA doesn't talk about tomorrow. It talks about today.

reply
mikkupikku
4 hours ago
[-]
To be fair, the author phrased his point poorly in a way that invites confusion:

> "But for now, real programmers will always win."

"for now ... always", not a good phrasing.

reply