Chips for the Rest of Us (engineering.nyu.edu)
55 points | 15 hours ago | 7 comments
dmbche
13 hours ago
[-]
" chip design cycle is also probably one of the most complicated engineering processes that exists in the world,” said Institute Professor Siddharth Garg (ECE). “There’s a saying that rocket science is hard, but chip design is harder.” "

Why not Rockets for the rest of us first, if that's easier?

reply
gsf_emergency_6
9 hours ago
[-]
Rocket science is hard because you have to burn your own cash if you are not connected. Chip design is less risky for the individual, but it's been harder (so far) to signal your mastery to the funders.

The difficulty is not (entirely) technical

reply
wlesieutre
12 hours ago
[-]
There are already lots of rockets for the rest of us; they're just not as big
reply
__rito__
8 hours ago
[-]
Link to the BASICS course mentioned: https://engineering.nyu.edu/academics/programs/digital-learn...

Link to the Zero to ASIC course that they are collaborating with: https://www.zerotoasiccourse.com/digital/

I wish for free alternatives to these.

reply
gsf_emergency_6
10 hours ago
[-]
The uni PR wasn't bad faith, just bad placement. The source is here:

https://github.com/shailja-thakur/VGen

Earlier work from NYU (2023):

https://zenodo.org/records/7953725

Related (?) blog post (2023)

https://01001000.xyz/2023-12-21-ChatGPT-AI-Silicon/

reply
stogot
10 hours ago
[-]
> To address this challenge, Garg and colleagues scoured Verilog code on GitHub and excerpted content from 70 Verilog textbooks to amass the largest AI training dataset of Verilog ever assembled. The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

I expect this will become the norm in a number of fields. Perhaps COBOL is next?
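A toy sketch of what that corpus-assembly step might look like (the file names, contents, and filtering rules here are invented for illustration, not the actual VeriGen pipeline):

```python
# Hypothetical sketch: gather candidate sources, keep only Verilog files,
# and deduplicate by content hash (GitHub forks produce many exact copies).
import hashlib

def build_corpus(files):
    """files: iterable of (path, text) pairs; returns deduplicated Verilog texts."""
    seen, corpus = set(), []
    for path, text in files:
        if not path.endswith((".v", ".sv")):
            continue  # skip non-Verilog sources
        digest = hashlib.sha256(text.encode()).hexdigest()
        if digest in seen:
            continue  # drop exact duplicates
        seen.add(digest)
        corpus.append(text)
    return corpus

files = [
    ("adder.v", "module adder(input a, b, output s); assign s = a ^ b; endmodule"),
    ("fork/adder.v", "module adder(input a, b, output s); assign s = a ^ b; endmodule"),
    ("readme.md", "# not verilog"),
]
print(len(build_corpus(files)))  # prints 1
```

The real dataset work (licensing, near-duplicate detection, textbook excerpting) is of course far messier than this.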

reply
fancy_pantser
5 hours ago
[-]
> The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

Perhaps it's the first open one. I was an eng manager at a hyperscaler helping one of our clients, a large semiconductor design company, build models to use internally. It was trained on their extensive Verilog repos, tooling, and strict style guides. I see this being repeated across industries; since at least 2023, quite a few deep-pocketed S&P 500 orgs have been creating models from scratch or extensively finetuning them for the unique advantages they require. They're rarely announced specifically, but you can often infer from the initial investment or partnership announcements that they're working on it.

reply
alecco
15 hours ago
[-]
> Consequently, the NYU researchers’ goal is to make chip design more accessible, so nonengineers, whatever their background, can create their own custom-made chips.

What?

reply
Aloisius
11 hours ago
[-]
"ChatGPT: Please design a chip for me."

Basically.

reply
kingstnap
2 hours ago
[-]
Unironically what industry is trying to do. I saw a slide with basically exactly that written on it some time ago at a conference (MLCAD).
reply
bigyabai
14 hours ago
[-]
I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

A lot of my questions went away when I got to this line though:

> He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.

This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs aren't iterating or testing on it because it sucks to do. It's a lot of work to bring that skill ceiling down, but as a programmer I could see how an LLVM-style intermediate representation layer could help designers get up-and-running faster.
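To make the "LLVM-style IR" idea concrete, here's a toy sketch (all names invented, no relation to any real toolchain): lowering a boolean expression tree into a flat netlist of named gates, which is roughly the kind of step a higher-level front end would hide from the designer.

```python
# Hypothetical illustration: lower a nested boolean expression like
# ("or", ("and", "a", "b"), ("not", "c")) into a flat list of gates,
# each with an op, input wires, and a freshly named output wire.
import itertools

def lower(expr, netlist, fresh):
    """expr: a wire name (str) or ("op", sub, sub...); returns the output wire."""
    if isinstance(expr, str):
        return expr  # primary input, nothing to emit
    op, *args = expr
    wires = [lower(a, netlist, fresh) for a in args]
    out = f"w{next(fresh)}"
    netlist.append((op, wires, out))
    return out

netlist = []
out = lower(("or", ("and", "a", "b"), ("not", "c")), netlist, itertools.count())
print(netlist)
# [('and', ['a', 'b'], 'w0'), ('not', ['c'], 'w1'), ('or', ['w0', 'w1'], 'w2')]
```

Real synthesis tools do vastly more (optimization, technology mapping, timing), but the shape of the abstraction is the same: high-level intent in, flat netlist out.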

reply
charlie-83
13 hours ago
[-]
Isn't HDL basically the intermediate representation you want? Plus, you can learn it with a simulator or an FPGA dev board, which makes it reasonably accessible
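And simulation really is cheap to get started with; a minimal sketch of the idea (a toy combinational simulator, invented for this comment, nothing like a real event-driven HDL simulator):

```python
# Toy combinational simulator: evaluate a gate netlist given input values.
# The netlist format (op, input_wires, output_wire) is assumed for the example.

GATES = {
    "and": lambda a, b: a & b,
    "or": lambda a, b: a | b,
    "xor": lambda a, b: a ^ b,
    "not": lambda a: 1 - a,
}

def simulate(netlist, inputs):
    """netlist: list of (op, in_wires, out_wire) in topological order."""
    wires = dict(inputs)
    for op, ins, out in netlist:
        wires[out] = GATES[op](*(wires[w] for w in ins))
    return wires

# Half adder: sum = a ^ b, carry = a & b
half_adder = [("xor", ["a", "b"], "s"), ("and", ["a", "b"], "c")]
print(simulate(half_adder, {"a": 1, "b": 1}))  # s=0, c=1 (1+1 = binary 10)
```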
reply
tormeh
12 hours ago
[-]
All I remember from my experience with VHDL/Verilog is that they really truly suck.
reply
bsder
13 hours ago
[-]
> I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).

The axes that we need more of involve analog and RF. Less power consumption, better RF speed/range, higher-speed PCI, etc. all require messy analog and RF design. And those are the expensive tools. Those are also the complex tools that require genuine knowledge.

Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and still hasn't pulled it off should tell you something.

reply
tormeh
12 hours ago
[-]
Would you really earn more money doing this than monopolizing online search advertising? Because I find that hard to believe. Hardware seems like a miserable business.
reply
pesfandiar
11 hours ago
[-]
That might change if geopolitical tensions fragment the global supply chains.
reply
bsder
7 hours ago
[-]
Being a fab is a garbage business.

Being a software supplier to fabless semiconductor companies is a very profitable business.

In the Gold Rush, the people who came out rich were selling the shovels and denim.

reply
fleshmonad
14 hours ago
[-]
We have textual slop, visual slop, audio slop, so we asked: "What else do we want to sloppify?" And then it dawned on me. ICs. ICs haven't been slopped yet. Sure, we could ask the machine to generate some VHDL, but that isn't the same. So we present: Silicon Slop.

I am actually astonished. Is this what happens when the NYU board of directors tells every department they have to use and create AI, or they will stop funding? What is going on?

reply
carlCarlCarlCar
13 hours ago
[-]
Ah, thanks; we definitely needed more artisanal, real human social media slop like this.

Improving the lived experience, keeping it real! Feels so much more authentic.

More people would love AI if it communicated like an emo *Nix elitist. Train it on Daria, Eeyore, and grunge lyrics! People will love it!

reply
bgwalter
13 hours ago
[-]
Bootstrap framework for chips, Verilog stolen from books and from GitHub.
reply