A16Z partner says that the theory that we'll vibe code everything is 'wrong'
20 points
1 hour ago
| 7 comments
| aol.com
| HN
7777777phil
29 minutes ago
[-]
Even a16z is walking this back now. I wrote about why the “vibe code everything” thesis doesn’t hold up in two recent pieces:

(1) https://philippdubach.com/posts/the-saaspocalypse-paradox/

(2) https://philippdubach.com/posts/the-impossible-backhand/

Acharya’s framing is different from mine (he’s talking his book on software stocks), but the conclusion is the same: the “innovation bazooka” pointed at rebuilding payroll is a bad allocation of resources. Benedict Evans called me out on LinkedIn for this take, which I take as a sign the argument is landing.

reply
atomic128
29 minutes ago
[-]
Sounds like a16z has some rapidly depreciating software equity they want to sell you.

Or maybe they own the debt.

Listen to some of the Marc Andreessen interviews promoting cryptocurrency in 2021.

Do that and you will never listen to him or his associates again.

reply
tombert
21 minutes ago
[-]
I dunno.

I really hate the expression "the new normal", because it sort of smuggles in the assumption that there exists such a thing as "normal". It always felt like one of those truisms people say to exploit emotions, like "in these trying times" or "no one wants to work anymore".

But I really do think that vibe coding is the "new normal". These tools are already extremely useful, to the point where I don't really think we'll be able to go back. They're getting good enough that you more or less have to use them. This might sound like I'm supportive of this, and I guess I am to some extent, but I find it exceedingly disappointing because writing software isn't fun anymore.

One of my most upvoted comments on HN talks about how I don't enjoy programming, but instead enjoy problem solving. I wrote that before I was aware of vibe coding, and I think I was wrong. I guess I actually did enjoy the process of writing the code, rather than delegating my work to a virtual intern and watching the AI do the fun stuff.

A very small part of me is kind of hoping that once AI has to be priced at "not losing money on every call" levels, I'll be forced to actually think about this stuff again.

reply
syndacks
16 minutes ago
[-]
I largely agree with you. And, given your points about “not going back” — how do you propose interviewing SWEs?
reply
tombert
6 minutes ago
[-]
I have thought about this a lot, and I have no idea. I work for an "AI-first" company, and we're kind of required to use AI stuff as often as we can, so I make very liberal use of Codex, but I've been shielded from the interview process thus far.

I think I would still kind of ask the same questions, though maybe a bit more conceptual. For example, I might see if I could get someone to explain how they would build something, and then ask them about data structures that might be useful (e.g. removing a lock by switching to an append-only structure). I find that Codex will generally generate something that "works", but without an understanding of data structures and algorithms its implementation will still be somewhat sub-optimal, meaning that understanding the fundamentals has value, at least for now.
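
To make that parenthetical concrete, here's a minimal sketch (illustrative Java; the class and names are made up for this comment, not from any actual interview question or Codex output): instead of guarding a mutable list with a lock, writers publish immutable nodes with a compare-and-swap loop, and readers traverse a snapshot without blocking.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.atomic.AtomicReference;

    // Illustrative sketch only: a Treiber-style lock-free, append-only log.
    // Writers publish immutable nodes via compare-and-swap instead of
    // taking a lock; readers walk a consistent snapshot without blocking.
    public final class AppendOnlyLog<T> {
        private static final class Node<T> {
            final T value;
            final Node<T> next;
            Node(T value, Node<T> next) { this.value = value; this.next = next; }
        }

        private final AtomicReference<Node<T>> head = new AtomicReference<>();

        // Retry the CAS until we win the race; no lock is ever held.
        public void append(T value) {
            Node<T> oldHead;
            Node<T> newHead;
            do {
                oldHead = head.get();
                newHead = new Node<>(value, oldHead);
            } while (!head.compareAndSet(oldHead, newHead));
        }

        // Read-only traversal over immutable nodes (newest entry first).
        public List<T> snapshot() {
            List<T> out = new ArrayList<>();
            for (Node<T> n = head.get(); n != null; n = n.next) {
                out.add(n.value);
            }
            return out;
        }

        public static void main(String[] args) {
            AppendOnlyLog<String> log = new AppendOnlyLog<>();
            log.append("first");
            log.append("second");
            System.out.println(log.snapshot()); // prints [second, first]
        }
    }

The point isn't this exact structure; it's whether a candidate can explain why the CAS retry loop is able to replace the lock. That's the kind of understanding the generated code doesn't test for.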

reply
j45
15 minutes ago
[-]
Just because we can code something faster or cheaper doesn't increase the odds it will be right.
reply
duzer65657
37 minutes ago
[-]
>> Anish Acharya says it is not worth it to use AI-assisted coding for all business functions. AI should focus on core business development, not rebuilding enterprise software.

I don't even know what this means, but my take: we should stop listening to VCs (especially those like A16Z) who have an obvious vested interest that doesn't match the rest of society's. Granting these people an audience is totally unwarranted; nobody but other tech bros said "we will vibe code everything" in the first place. Best case scenario: they all go to the same exclusive conference, get the branded conference technical vest, and that's where the asteroid hits.

reply
sanction8
38 minutes ago
[-]
a16z talking again?

This is your regular reminder that

1) a16z is one of the largest backers of LLMs

2) They named one of the two authors of the Fascist Manifesto their patron saint

3) AI systems are built to function in ways that degrade and are likely to destroy our crucial civic institutions (quoted from Professor Woodrow Hartzog, "How AI Destroys Institutions"). Or, to put it another way: being plausible but slightly wrong and un-auditable, at scale, is the killer feature of LLMs, and this combination of properties makes it an essentially fascist technology, meaning it is well suited to centralizing authority, eliminating checks on that authority, and advancing an anti-science agenda (quoted from the post "A plausible, scalable and slightly wrong black box: why large language models are a fascist technology that cannot be redeemed").

reply
arjie
1 minute ago
[-]
I will not claim to be an expert historian, but one general belief I have is that nomenclature undergoes semantic migration over a century. So, for the sake of conciseness, I will quote the first demand of each section of the Fascist Manifesto. This isn't to obscure anything, since it is on Wikipedia[0] and translated into English on EN Wikipedia[1], but to share a sample we can relate to our present-day political orientation. Hopefully it will inform what you believe "author of the Fascist Manifesto" to imply:

> ...
>
> For this WE WANT:
>
> On the political problem:
>
> Universal suffrage by regional list voting, with proportional representation, voting and eligibility for women.
>
> ...
>
> On the social problem:
>
> WE WANT:
>
> The prompt enactment of a state law enshrining the legal eight-hour workday for all jobs.
>
> ...
>
> On the military issue:
>
> WE WANT:
>
> The establishment of a national militia with brief educational services and exclusively defensive duty.
>
> ...
>
> On the financial problem:
>
> WE WANT:
>
> A strong extraordinary tax on capital of a progressive nature, having the form of true PARTIAL EXPROPRIATION of all wealth.
>
> ...

0: https://it.wikipedia.org/wiki/Programma_di_San_Sepolcro#Test...

1: https://en.wikipedia.org/wiki/Fascist_Manifesto#Text

reply
alephnerd
29 minutes ago
[-]
Both AI Fanatics and AI Luddites need to touch grass.

We work in Software ENGINEERING. Engineering is all about which tools make sense for a specific problem. In some cases AI tools show immediate business value (e.g. TTS for SDR), and in other cases the value is less obvious.

This is all the more reason why learning AI/ML fundamentals is critical, in the same way that understanding computer architecture, systems programming, algorithms, and design principles is critical to being a SWE.

Given the number of throwaway accounts that commented, it clearly struck a nerve.

reply