Can a Computer Science Student Be Taught to Design Hardware?
25 points | 1 hour ago | 8 comments | semiengineering.com
joezydeco
4 minutes ago
UIUC CS grad from the late '80s. CS students had to take a track of electrical engineering courses: physics E&M, intro EE, digital circuits, microprocessor/ALU design, microprocessor interfacing... It paid off immensely in my embedded development career.

I'm guessing this isn't part of most curricula anymore?

wmichelin
2 minutes ago
I had to take computer architecture. We made a 4-bit CPU... or maybe it was 8-bit, I can't remember. But it was all in a software breadboard simulator, LogicWorks.
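
For anyone who skipped that course, here is a rough idea of what such an exercise boils down to, sketched in Python rather than a schematic-capture tool. The operations and width are illustrative guesses, not the commenter's actual coursework:

    # Hypothetical 4-bit ALU, the kind of thing built in LogicWorks-style
    # coursework, modeled in plain Python instead of gates.
    MASK = 0xF  # results truncated to 4 bits

    def alu(op: str, a: int, b: int) -> tuple[int, bool]:
        """Return (result, carry/borrow flag) for two 4-bit operands."""
        if op == "ADD":
            total = a + b
            return total & MASK, total > MASK   # carry out on overflow
        if op == "SUB":
            diff = a - b
            return diff & MASK, diff < 0        # borrow on underflow
        if op == "AND":
            return a & b, False
        if op == "OR":
            return a | b, False
        raise ValueError(f"unknown op {op!r}")

    assert alu("ADD", 0b1111, 0b0001) == (0b0000, True)  # 15 + 1 wraps to 0
    assert alu("SUB", 0b0010, 0b0011) == (0b1111, True)  # 2 - 3 borrows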
Glyptodon
12 minutes ago
I have trouble believing there's a talent shortage in the chip industry. Lots of ECE grads I know never really found jobs and moved on to other things (including SWE). Others took major detours to eventually get jobs at places like Intel.
KRAKRISMOTT
4 minutes ago
No shortage of talent. It's just that the big players are used to cheap, near-minimum-wage Taiwanese salaries and refuse to pay the full price of an EE.
bee_rider
18 minutes ago
Is the idea here that the code-generation apocalypse will leave us with a huge surplus of software folks? Otherwise, enabling software people to move over to hardware seems to be putting the cart before the horse.

Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).

NoiseBert69
45 minutes ago
As a computer engineer I usually copy reference schematics and board layouts from the datasheets vendors offer. 95% of my hardware problems can be solved that way.

Learning KiCad took me a few evenings with YT videos (greetings to Phil!).

Soldering takes much more practice. Soldering QFN with a stencil, paste, and an oven (or just a pre-heater) can only be learned by failing many times.

Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.

But as always: the better your gear gets, the more fun it becomes.

throwup238
26 minutes ago
Even as a professional EE working on high-speed digital and mixed-signal designs (smartphones and motherboards), I used reference designs all the time, for almost every major part in a design. We had to rip up the schematics to fit our needs and follow manufacturer routing guidelines rather than copying the layout wholesale, but unless simulations told us otherwise we followed them religiously. When I started, I was surprised how much of the industry is just the tedious work of footprint verification and PCB routing after copying existing designs and using calculators like the Saturn toolkit.

The exception was cutting-edge motherboards that had to be released alongside a new Intel chipset, but those projects had at least a dozen engineers working in shifts.
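
The calculators mentioned above are mostly closed-form approximations. As a minimal illustration, here is the classic IPC-2141 estimate for surface microstrip impedance in Python; the geometry numbers are made up for the example, not from the commenter's designs:

    import math

    def microstrip_z0(h: float, w: float, t: float, er: float) -> float:
        """IPC-2141 surface microstrip impedance (ohms).
        h: dielectric height, w: trace width, t: copper thickness
        (all in the same units), er: relative permittivity.
        The approximation only holds for roughly 0.1 < w/h < 2."""
        return 87.0 / math.sqrt(er + 1.41) * math.log(5.98 * h / (0.8 * w + t))

    # Example: 0.2 mm trace over 0.15 mm FR-4 (er ~ 4.3), 1 oz copper (~0.035 mm)
    print(f"{microstrip_z0(0.15, 0.2, 0.035, 4.3):.1f} ohms")  # ~55.6, near a 50-ohm target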

stn8188
1 hour ago
The subheading to this article seems a little extreme: "To fill the talent gap, CS majors could be taught to design hardware, and the EE curriculum could be adapted or even shortened."

The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.

em3rgent0rdr
9 minutes ago
> "CS majors could be taught to design hardware, and the EE curriculum"

"Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".

mixmastamyk
32 minutes ago
Weird article; I came to it hoping to see if I could train into a new job, but instead it went on and on about AI for almost the entire piece. I never learned what classes I might need to take or what the job prospects are.
contubernio
50 minutes ago
Is this not what electrical engineers are for?
dilawar
52 minutes ago
EE folks should design languages because they understand hardware better?!

And CS folks should design hardware because they understand concurrency better?!

lawstkawz
4 minutes ago
Having worked in EE from '99 to '06 after a BSc in EE: it's pretty much CS + knowing how to breadboard and solder if absolutely necessary.

A whole lot of my coursework could be described as UML diagramming, but with glyphs for resistors and ground.

Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).

Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.
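
Stripped of the sarcasm, the pattern described above ("store this value and transform it when these signals/events occur") is, in hardware terms, a clocked register with an enable. A toy Python sketch of that idea (illustrative only):

    class Register:
        """Toy model of a D flip-flop with an enable (the 'cached state')."""
        def __init__(self) -> None:
            self.q = 0  # stored state; the capacitor in the comment's analogy

        def on_clock_edge(self, enable: bool, d: int) -> int:
            if enable:       # only latch when the enable signal fires
                self.q = d
            return self.q    # otherwise hold the old value

    r = Register()
    assert r.on_clock_edge(enable=True, d=7) == 7   # event fires: state updates
    assert r.on_clock_edge(enable=False, d=3) == 7  # event ignored: state held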

rbanffy
17 minutes ago
I know you said it in jest, but there is a strong justification for cross-feeding the two disciplines: on one side, we might get hardware that's easier to program, and on the other, software that's better tuned to the hardware it runs on.
IshKebab
43 minutes ago
Obviously. Hardware designers absolutely love to think that hardware design is totally different from software design and that only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about genuinely hardware-specific things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
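
Metastability is a fair example of those hardware-specific concerns; the standard mitigation is a two-flop synchronizer. A behavioral Python toy (not real RTL) of what that structure does:

    class TwoFlopSynchronizer:
        """Toy model of the standard clock-domain-crossing guard: two
        back-to-back flip-flops give a possibly-metastable first stage a
        full cycle to settle before downstream logic sees the value."""
        def __init__(self) -> None:
            self.ff1 = 0  # may sample the async input at a bad moment
            self.ff2 = 0  # (very probably) stable output

        def clock(self, async_in: int) -> int:
            # One destination-clock edge: shift the input through both flops.
            self.ff2, self.ff1 = self.ff1, async_in
            return self.ff2

    sync = TwoFlopSynchronizer()
    outs = [sync.clock(x) for x in [1, 1, 1, 0]]
    assert outs == [0, 1, 1, 1]  # a change takes two edges to reach the output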

The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.

If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.

elektronika
32 minutes ago
> The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.

Mostly B. Even if you work at a company that does both, as a software developer you'll rarely get a chance to touch the hardware, because all the EDA tools are seat-licensed, which makes it an expensive gamble to let someone without domain experience take a crack at it. If you work at a Verilog shop you can sneak in Verilator, but the digital designers tend to push back in favor of vendor tools.

acuozzo
37 minutes ago
> is really just because hardware design is a niche field

Which doesn't pay as well as jobs in software do, unfortunately.

general1465
1 minute ago
Exactly, money is the problem. I am a hardware designer by trade. I have no problem sitting down, creating a PCB in KiCad, and having it come out perfect on the first try. But I only do this as a hobby because it does not pay much. SWE just pays better, even with the AI scarecrow looming behind it.
IshKebab
35 minutes ago
Really? In my experience in the UK it pays ~20% better, and we're talking about silicon hardware design here, not PCBs.
IshKebab
36 minutes ago
In fact I'll go further: in my experience, people with a software background make much better hardware designers than people with an EE background, because they are aware of modern software best practices. Many hardware designers are happy to hack things together with duct tape and glue. As a result, most of the hardware industry is decades behind the software industry in many ways, e.g. still relying on hacky Perl and Tcl scripts to cobble things together.

The notable exceptions are:

* Formal verification, which is very widely used in hardware and barely used in software (not really software's fault; there are good reasons for that).

* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
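
As a rough illustration of that last point, here is the hardware-style pattern in Python: seeded random stimulus checked cycle by cycle against an independent reference model, so any failure replays exactly from the seed. The adder is a made-up stand-in for a real design under test:

    import random

    def dut_adder(a: int, b: int) -> int:       # stand-in "design under test"
        return (a + b) & 0xFF                   # 8-bit adder with wraparound

    def golden_adder(a: int, b: int) -> int:    # independent reference model
        return (a + b) % 256

    def run_test(seed: int, cycles: int = 1000) -> None:
        rng = random.Random(seed)               # fixed seed => identical replay
        for cycle in range(cycles):
            a, b = rng.randrange(256), rng.randrange(256)
            got, want = dut_adder(a, b), golden_adder(a, b)
            assert got == want, f"seed={seed} cycle={cycle}: {got} != {want}"

    run_test(seed=42)  # any failure reproduces from the seed alone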
