I've never been more fearful of components breaking than I am right now. With GPU and now memory prices being crazy, I hope I never have to upgrade.
I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4k monitor and a 2nd 1440p monitor hooked up. Minor gaming is ok too; for example I picked up Silksong last week, and it runs very well at 2560x1440.
For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
Personally, at work I use the latest hardware; at home I use ewaste.
The RAM for that is basically ewaste at this point, yet it runs the workloads it needs to do just fine.
On the other hand, seeing >1TiB RAM in htop always makes my day happier.
---
So the Dell Precision T7920 runs dual Intel Scalable (Skylake) and has oodles of DIMM slots (24!), but you'll need to use a PCIe adapter to run an NVMe drive. FlexBays give you hot-swappable SATA, SAS too but only if you're lucky enough to find a system with an HBA (or add one yourself). But if you manage to salvage 24x 64GB DDR4 DIMMs, you'll have a system with a terabyte-and-a-half of ECC RAM - just expect to deal with a very long initial POST and a lot of blink codes when you encounter bad sticks. The power supply is proprietary, but can be swapped from the outside.
The T7820 is the single-CPU version, and has only 6 DIMM slots. But it is more amenable to gaming (one NUMA domain), and I have gifted a couple to friends.
If you're feeling cheap and are okay with the previous generation, the Haswell/Broadwell-based T7910 is also serviceable - but expect to rename the UEFI image to boot Linux from NVMe, and it's much less power efficient if you don't pick an E5 v4 revision CPU. I used a fully-loaded T7910 as a BYOD workstation at a previous job; it worked great as a test environment.
Lenovo ThinkStation P920 Tower has fewer DIMM slots (16) than the T7920, but has on-motherboard M.2 NVMe connectors and three full 5.25" bays. I loaded one with Linux Mint for my mother's business; she runs the last non-cloud version of QuickBooks in a beefy network-isolated Windows VM and it works great for that. Another friend runs one of these with Proxmox as a homelab-in-a-box.
The HP Z6 G4 is also a thing, though I personally haven't played with one yet. I do use a salvaged HP Z440 workstation with a modest 256GB RAM (don't forget the memory cooler!) and a 3090 as my ersatz kitchen table AI server.
The mid 90s was pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.
Also, it is frightening how close that is to current day minimum wage.
Livelihood
The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.
Holy moly! 11 whole dollars an hour!?
Okay, so we went from $4.25 to $11.00. That's a 159% change. Awesome!
Now, let's look at... School, perhaps? So I can maybe skill-up out of Little Caesars and start building a slightly more comfortable life.
Median in-state tuition in 1995: $2,681. Median in-state tuition in 2025: $11,610. Wait a second! That's a 333% change. Uh oh.
Should we do the same calculation with housing...? Sure, I love making myself more depressed. 1995: $114,600. 2025: $522,200. 356% change. Fuck.
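To make the arithmetic explicit, all three figures are percent changes relative to the 1995 value:

\[
\frac{\text{new} - \text{old}}{\text{old}} \times 100\%:\qquad
\frac{11.00 - 4.25}{4.25} \approx 159\%,\qquad
\frac{11{,}610 - 2{,}681}{2{,}681} \approx 333\%,\qquad
\frac{522{,}200 - 114{,}600}{114{,}600} \approx 356\%
\]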
However, we are living through a housing supply crisis, and while overall cost of living hasn't gone up relative to wages, housing's share of it has grown massively. We would all be living much richer lives if we could bring down the cost of housing, or at least have it flatline and let inflation take care of the rest.
Education is interesting, since most people don't actually pay the list price. The list price has gone up a lot, but the percentage of people paying list price has similarly gone down a lot: from over 50% in the 90s for state schools to 26% today, thanks to a large increase in subsidy programs (student aid). While real education costs have still gone up somewhat, they've gone up much less than the prices you're quoting would lead you to believe: those are essentially a tax on the rich who don't qualify for student aid. [2]
But I agree that tackling housing alone would be significant.
There are a zillion examples like this. Housing has gone way up adjusted for inflation, but many other things have gone way, way down adjusted for inflation. I think it's hard to make a case that overall cost of living has gone up faster than median wages, and the federal reports indicate the opposite: median real income has been going up steadily for decades.
Housing cost is visible and (of course, since it's gone up so much) painful. But real median income is not underwater relative to the 90s. And there's always outrage when something costs more than it used to, even if that's actually cheaper adjusted for inflation: for example, the constant outrage about videogame prices, which have in fact massively declined despite requiring massively more labor to make and sell.
The easy way to realize this is to notice that the median wage has increased by proportionally less than the federal minimum wage has. The people in the middle can't afford school or housing either. And what happens if you increase the minimum wage faster than overall wages? Costs go up even more, and so does unemployment when small businesses who are also paying those high real estate costs now also have to pay a higher minimum wage. You're basically requesting the annihilation of the middle class.
Whereas you make housing cost less and that helps the people at the bottom and the people in the middle.
I'm not really proposing any solution.
My comment is pointing out that when you only do one side of the equation (income) without considering the other side (expenses), it's worthless. Especially when you are trying to make a comparison across years.
How we go about fixing the problem, if we ever do, is another conversation. But my original comment doesn't attempt to suggest any solution, especially not one that "requests the annihilation of the middle class". It's solely to point out that adventured's comment is a bunch of meaningless numbers.
The point of that comment was to point out that minimum wage is irrelevant because basically nobody makes that anyway; even the entry-level jobs pay more than the federal minimum wage.
In that context, arguing that the higher-than-minimum wages people are actually getting still aren't sufficient implies an argument that the minimum wage should be higher than that. And people could read it that way even if it's not what you intended.
So what I'm pointing out is that that's the wrong solution and doing that rather than addressing the real issue (high costs) is the thing that destroys the middle class.
It can also imply that expenses should come down; you just picked the implication you wanted to argue against.
On the housing side, the root problem is obvious:
Real estate cannot be both affordable and considered an investment. If it's affordable, that means the price is staying flat relative to inflation, which makes it a poor investment. If it's a good investment, that means the value is rising faster than inflation, which means unaffordability is inevitable.
The solution to the housing crisis is simple: Build more. But NIMBYs and complex owners who see their house/complex as an investment will fight tooth-and-nail against any additional supply since it could reduce their value.
In the mid 90s you could open a CD (certificate of deposit at a bank or credit union) and get 9% or more APY. Savings accounts had ~4% interest.
In the mid 90s a gallon of gasoline in Los Angeles county was $0.899 in the summer and less than that any other time. It's closer to $4.50 now.
As long as you don't run into anything unforeseen like medical expenses, car breakdowns, etc., you can almost afford a bare-bones, mediocre life with no retirement savings.
That being said, there's been an enormous push by various business groups to do everything they can to keep wages low.
It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...
I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.
Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.
And, in any case, the solution could also be (totally, or in part) a reduction in expenses instead of an increase in income.
>It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...
That's for sure! I know it's not getting solved on the hacker news comment section, at least.
If you're focusing on minimum wage, they tend to be highly coupled, though some jurisdictions have lower minimum wages for minors to deal with this.
> Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.
Few minimum wage jobs even offer the option to work full time. Many retail environments have notoriously unpredictable shifts that are almost impossible for workers to plan around. I've heard varying reasons for this (from companies liking to have more employees working fewer hours for flexibility, down to the fact that keeping people off the full-time payroll means they legally don't have to offer benefits). The result is that minimum wage earners often have to juggle multiple jobs, childcare, and the negative effects of commuting to all of them.
This also ignores many other factors around poverty, such as housing costs and other inflation.
> That's for sure! I know it's not getting solved on the hacker news comment section, at least.
For sure! 99% of people on HN haven't had to experience living long term off of it. I did for a while in college, where outside of tuition I had to pay my own way in a large city (I fully acknowledge that this is anecdotal and NOT the same as living in poverty). I only had to feed myself, not think about saving for the future, and I was sharing a house with other geeky roommates where we had some of the best times of our lives. I don't think we could have pulled that off in today's economic environment...
I don't like my uncle at all and find him and people like him to be terrible human beings.
Let's consider the implications of this. We take an existing successful business, change absolutely nothing about it, but separately and for unrelated reasons the local population increases and the government prohibits the construction of new housing.
Now real estate is more scarce and the business has to pay higher rent, so they're making even less than before and there is nothing there for them to increase wages with. Meanwhile the wages they were paying before are now "not a living wage" because housing costs went way up.
Is it this business who is morally culpable for this result, or the zoning board?
All that being said, though, Robert Heinlein said once:
> There has grown up in the minds of certain groups in this country the notion that because a man or corporation has made a profit out of the public for a number of years, the government and the courts are charged with the duty of guaranteeing such profit in the future, even in the face of changing circumstances and contrary to the public interest. This strange doctrine is not supported by statute or common law. Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.
I find it rich how many tech people are working for money losing companies, using technology from money losing companies and/or trying to start a money losing company and get funding from a VC.
Every job is not meant to support a single person living on their own raising a family.
Oh and the average franchise owner is not getting rich. They are making $100K a year to $150K a year depending on how many franchises they own.
Also tech companies can afford to pay a tech worker more money because you don’t have to increase the number of workers when you get more customers.
YC is not going to give the aspiring fast food owner $250K to start their business like they are going to give “pets.ai - AI for dog walkers”
I find it slightly hypocritical that people can clutch their pearls at small businesses who risk their own money while yet another BS “AI” company’s founders can play founder using other people’s money.
A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances... and if it does, the owner has a strong incentive to automate it away.
The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.
Anyone working for someone else, doing literally anything for 40 hours a week, should be entitled to enough compensation to support themselves at a minimum. Any employer offering less than that is either a failed business that should die off and make room for one that's better managed or a corporation that is just using public taxpayer money to subsidize their private labor expenses.
Turns out our supply of underage workers is neither infinite, nor even sufficient to staff all fast food jobs in the nation
If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.
Wow, a completely bad-faith argument.
Can you try again, but this time, try "steelman" instead of "strawman"?
Pentium 60/66s were in the same price tier as expensive Alpha or SPARC workstations.
If you fast forward just a few years though, it wasn't too bad.
You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.
In the mid-90s my brother and I were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC that we decided to save up and put together a machine from second-hand parts we could get our hands on. We built him a 386 DX 40 or 486SX2 50 or something like that and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX 25, with 5MB RAM and 20MB disk space).
A PowerBook 5300 was $6,500 in 1995, which is $13,853 today.
The TCO was much higher, considering how terrible and flimsy this laptop was. The power plug would break if you looked at it funny and the hinge was stiff and brittle. I know that’s not the point you are making but I am still bitter about that computer.
Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.
After the M1, my casual home laptop started outperforming my top-spec work laptops.
But not if you cared about battery life, because that was the tradeoff Apple was making. Which worked great until about 2015-2016. The parts they were using were not Intel’s priority and it went south basically after Broadwell, IIRC. I also suppose that Apple stopped investing heavily into a dead-end platform while they were working on the M1 generation some time before it was announced.
It's like time travelling back to 2004. Slow, loud fans, random brief freezes of the whole system, a shell that still feels like a toy, a proprietary 170W power supply and mediocre battery life, subpar display. The keyboard is okay, at least. What a joke.
Meanwhile, my personal M3 Max system can render DaVinci Resolve timelines with complex Fusion compositions in real time and handle whole stacks of VSTs in a DAW, compared to the Lenovo choking on an IDE.
Expensive PCs are also crap. My work offers Macbooks or Windows laptops (currently, Dell, but formerly Lenovo and/or HP), and these machines are all decidedly not 'cheap' PCs. Often retailing in excess of $2k.
All my coworkers who own Windows laptops do is bellyache about random issues, poor battery life, and sluggish performance.
I used to have a Windows PC for work about 3 years ago as well, and it was also a piece of crap. Battery would decide to 'die' at 50% capacity. After replacement, 90 minute battery life off charger. Fan would decide to run constantly if you did anything even moderately intensive such as a Zoom meeting.
Example:
> time du -sh .
737G .
________________________
Executed in 24.63 secs
And on my laptop that has a gen3, lower spec NVMe: > time du -sh .
304G .
________________________
Executed in 80.86 secs
It's almost 10 times faster. The CPU must have something to do with it too, but they're both Ryzen 9.
$ time du -sh .
935G .
real 0m1.154s
Simply because there's less than 20 directories and the files are large.
My new setup: gen5 ssd in desktop:
> time find . -type f | wc -l
5645741
________________________
Executed in 4.77 secs
My old setup, gen3 ssd in laptop: > time find . -type f | wc -l
2944648
________________________
Executed in 27.53 secs
Both are running pretty much non-stop, very slowly.
If you don't need a GPU for gaming you can get a decent computer with an i5, 16GB of RAM and an NVMe drive for USD 50. I bought one a few weeks ago.
I swapped out old ASUS MBs for an i3-540 and an Athlon II X4 with brand new motherboards.
They are quite cheaper than getting a new kit, so I guess that's the market they cater to: people who don't need an upgrade but their MBs gave in.
You can get these for US$20-US$30.
So I went crazy and bought a 9800X3D, and purchased a ridiculous amount of DDR5 RAM (96GB, which matches my old machine's DDR4 RAM quantity). At the time, it was about $400 USD or so.
I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!
I only ever noticed it on my Windows partition. IIRC on my Linux partition it was hardly noticeable, because Linux is far better at caching disk contents than Windows, and Linux in general can boot surprisingly fast even on HDDs if you only install the modules you actually need, so the autoconfiguration doesn't waste time probing dozens of modules in search of the best one.
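If anyone wants to see that caching effect directly, here's a rough sketch for a Linux box (needs sudo; the path is just a placeholder): run the same metadata walk warm, flush the page cache, dentries and inodes, then run it cold.

time du -sh ~/projects                      # warm run: mostly served from the page cache
sync                                        # flush dirty pages to disk first
echo 3 | sudo tee /proc/sys/vm/drop_caches  # drop page cache plus dentries/inodes
time du -sh ~/projects                      # cold run: forced to hit the disk again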
I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-m2 expansion card with its own splitter (no motherboard bifurcation required) and that let me get a pretty modern-feeling setup with nice fast cache disks.
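For anyone trying the same kind of card, a quick sanity check that both drives actually enumerate (a rough sketch; assumes nvme-cli is installed, and your device names will differ):

lspci | grep -i 'non-volatile'   # each NVMe controller shows up as its own PCIe device
sudo nvme list                   # lists /dev/nvme0n1, /dev/nvme1n1, ... with model and capacity
lsblk -d -o NAME,SIZE,MODEL      # confirms the kernel created block devices for both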
If you want to save even more money, get the older Arc Battlemage GPUs. I used one; it was comparable with an RTX 3060. I returned it because the machine I was running it in had a bug that was fixed two days before I returned it, but I didn't know that at the time.
I was seriously considering getting a b580 or waiting until the b*70 came out with more memory, although at this point i doubt it will be very affordable considering VRAM prices going up as well. A friend is supposedly going to ship me a few GTX 1080ti cards so i can delay buying newer cards for a bit.
One of my brothers has a PC I built for him, specced out with an Intel Core i5 13400f CPU and an Intel Arc A770 GPU, and it still works great for his needs in 2025.
Sure, Battlemage is more efficient and more compatible in some ways than Alchemist. But if you keep your expectations in check, it will do just fine in many scenarios. Just avoid any games using Unreal Engine 5.
- A UEFI DXE driver to enable Resizable BAR on systems which don't support it officially. This provides performance benefits and is even required for Intel Arc GPUs to function optimally.
List of working motherboards
Instant buy $700 or under. Probably buy up to $850. At, like, $1,100, though… solid no. And I’m counting on that thing to take the power-hog giant older Windows PC tower so bulky it’s unplugged and in a closet half the time, out of my house.
Ofc there's also the alternate strategy of going for a mid/high end rig and hoping it lasts a decade, but the current DDR5 prices make me depressed so yeah maybe not.
I genuinely hope that at some point the market will again get flooded with good, long-lived, reasonably priced components: things like AM4 CPUs, that RX 580, or the GTX 1080 Ti. But I fear Nvidia has learnt its lesson and now releases stuff that pushes you toward incremental upgrades rather than making something really good for its time. Same with Intel's LGA1851 being basically dead on arrival once the reviews started rolling in (who knows, maybe at least the mobos and Core Ultra chips will eventually be cheap as old stock). On the other hand, at least the Arc B580 GPUs were a step in the right direction: competent and not horribly overpriced, at least at MSRP; unfortunately the merchants were scumbags and often ignored it.
That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?
G.Skill, lifetime warranty: https://www.gskill.com/warranty
Corsair, lifetime warranty: https://help.corsair.com/hc/en-us/articles/360033067832-Warr...
Kingston, lifetime warranty: https://www.kingston.com/en/company/warranty
Teamgroup, lifetime warranty: https://support.teamgroupinc.com/en/support/warranty.php
Molecular dynamics simulations, and related structural bio tasks.
I use Vulkan for graphics, but Vulkan compute is a mess.
I'm not in a mindshare, and this isn't a political thing. I am just trying to get the job done, and have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.
> have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.
I'm saying this is a mindshare thing if you haven't evaluated ROCm / HIP. HIPify can convert CUDA source to HIP automatically, and HIP syntax is very similar to CUDA.
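As a rough sketch of what that porting flow looks like (assumes a ROCm install; the file name is just a placeholder):

hipify-perl kernel.cu > kernel.hip.cpp   # mechanical CUDA-to-HIP source translation, ships with ROCm
hipcc kernel.hip.cpp -o kernel           # hipcc builds HIP code; on an NVIDIA box it can compile the same source via nvcc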
$700 in 2014 is now $971 inflation adjusted (BLS calculator).
RTX 3060 12gb $180 (eBay). Sub $200 CPU (~5-7 times faster than yours). 16gb DDR4 $100-$120. $90 PSU. $100 motherboard. WD Black 1tb SSD $120. Roughly $800 (which inflation adjusted beats your $700).
Right now is a rather amazing time for CPUs, even though RAM prices have gone crazy.
Assuming you find some deals somewhere in there, you could do slightly better with either pricing or components.
The last time I really remember seeing a huge speed bump was going from a regular SSD to an NVMe M.2 PCIe SSD... Around 2015 I bought one of the very first consumer motherboards with an NVMe M.2 slot and put a Samsung 950 Pro in it: that was quite something (though I was upgrading the entire machine, not just the SSD, so there's that too). Before that, I don't remember when I switched from SATA HDD to SATA SSD.
I'm now running one of those WD SN850X Black NVMe SSD but my good old trusty, now ten years old, Samsung 950 Pro is still kicking (in the wife's PC). There's likely even better out there and they're easy to find: they're still reasonably priced.
As for my 2015 Core i7-6700K: it's happily running Proxmox and Docker (but not always on).
Even consumer parts are exceptionally reliable: the last few failures I remember, in 15 years (and I've got lots of machines running), are a desktop PSU (replaced by a Be Quiet! one), a no-name NVMe SSD and a laptop's battery.
Oh, and my MacBook Air M1's screen died overnight for no reason after precisely 13 months, when I had a warranty of 12 months (some refer to it as the "bendgate"), but that's because first gen MacBook Air M1s were indescribable pieces of fragile shit. I think Apple got their act together and came up with better screens in later models.
Don't worry too much: PCs are quite reliable things. And used parts for your PC from 2014 wouldn't be expensive on eBay anyway. You're not forced to upgrade to a last gen PC with DDR5 (atm 3x overpriced) and a 5090 GPU.
By far the fastest computer I’ve ever used. It felt like the SSD leap of years earlier.
I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.
Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.
It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?
Nobody is paying for that time.
And whilst it is 'training', my training time is better spent elsewhere than battling with why cuda won't work on my GPU upgrade.
Therefore, I avoid hardware and software changes merely because a tiny bit more speed isn't worth the hours I'll put in.
Depending on salary, 2 magnitudes at $5k is $500k.
That amount of money for the vast majority of humans across the planet is unfathomable.
No one is worried about if the top 5% can afford DRAM. Literally zero people.
>I buy the best phones and desktops money can buy
Sick man! Awesome, you spend 1/3 of the median US salary on a laptop and desktop every year. That's super fucking cool! Love that for you.
Anyways, please go brag somewhere else. You're rich, you shouldn't need extra validation from an online forum.
Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.
Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.
Prices are high but they're not that high, unless you're buying the really big GPUs.
The AMD model P14s, with 96 GB and upgraded CPU and the nice screen and linux, still goes for under $1600 at checkout, which becomes $1815 when you add the aftermarket SSD upgrade.
It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.
Thats in Germany, from a corporate supplier.
25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are affected by rises in PC part prices.
I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
And many in the first world haha
The soon-to-be unemployed SV engineers, when LLMs mean anyone can design an app and backend with no coding knowledge.
That’s exactly my point. Underspending on your tools is a misallocation of resources.
The goal is the right tool for the job, not the best tool you can afford.
If, like every developer I have ever met, the constraint is your own time, motivation, and skills, then spending $22k per year is a pretty interesting waste of resources.
Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version? Rarely.
A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.
you're just building a gaming rig with a flimsy work-related justification.
Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.
Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc) derived from that expenditure exceeds $5k/yr you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays without substantial improvements year-to-year. Contrasting buying one of those every 2yrs vs your proposal, you'd need a $4k/yr net expenditure to pay off somehow, somehow making use of the incremental CPU/RAM/etc to achieve that value. If it doesn't pay off then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.
So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.
Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.
From the perspective of an individual, ROI has to be large to justify a $5k/yr investment. HOWEVER, the general principle of "if something is your livelihood, then you should be willing to invest in it as appropriate" is an excellent thing to keep in mind. Moreover, at the scale of a company and typical company decisions the advice makes a ton of sense -- if a $1k monitor and $2k laptop allow your employees to context-switch better or something then you should almost certainly invest in that hardware (contrasted with the employee's view of ROI, the investments are tax-deductible and just have to pay off in absolute value, plus they don't have the delay/interaction with wages/promotions/etc introducing uncertainty and loss into the calculation) (the difference between a few hundred dollars and a few thousand dollars in total capital investment probably does have a huge difference in outcomes for a lot of computer-based employee roles).
It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.
With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.
That and your employer should pay for your hardware in many cases.
======== COMPUTER ========
I feel no pain yet.
Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
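For reference, "a few seconds" is just the cold-start time of the Compose stack; a minimal way to measure it yourself, assuming a docker-compose.yml in the project root:

time docker compose up -d --build   # rebuild anything stale and start all services detached
docker compose down                 # tear it back down (add -v to also drop volumes)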
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
======== PHONE ========
I had a Pixel 4a until Google busted the battery. It runs all of the apps (no games) I care about and Google Maps is fast. The camera was great.
I recently upgraded to a Pixel 9a because the repair center who broke my 4a in a number of ways gave me $350 and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day to day it makes no difference from the 4a, literally none. It even has the same storage space of which I have around 50% space left with around 4,500 photos saved locally.
======== ASIDE ========
I have a pretty decked out M4 MBP laptop issued by my employer for work. I use it every day and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster, that's the impact of a $2,500+ upgrade for general web usage.
I'm really sensitive to skips, hitches and performance related things. For real, as long as you have a decent machine with an SSD using a computer feels really good, even for development workloads where you're not constantly compiling something.
The proper calculation is the cost/performance ratio. Then buy the second one on the list :)
For starters, hardware doesn't innovate quickly enough to buy a new generation every year. There was a 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and a 3-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so you're at best dropping in a new CPU and some new RAM sticks.
Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.
You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.
You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
Big Tech has made it unaffordable for everyone.
No modern IDE either, nor a modern Linux desktop environment (they are not that much more memory efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?
In the 80s I ran an early SMTP / POP email client that functioned as a DOS TSR with code and data under 64k. Granted, it was pretty crappy and was text-only (non-MIME). But there's got to be a middle ground between 64k for a craptastic text-only email client and a 1GB OS / browser / webmail combo that could probably run that DOS TSR in an emulator as an attachment to a short email.
You know what you choose for a frontend? It's Electron + React and React Native.
And none of your customers will complain because the people with money to spend are rocking 12GB of RAM on their phone.
I mean, for goodness' sake, an empty YouTube page with no videos eats up a shocking amount of memory - 90 MB just for the JS heap. I used to run Windows 3.1 on a machine with 8 MB of RAM.
Admittedly, a good amount of memory used with browsers is because of larger graphics buffers that go along with higher resolution monitors, but much is just... waste.
AI doesn't deserve it more than we do, but also we shouldn't be required to have $300 in RAM for basic functionality. We shouldn't have to deal with RAM scalpers because businesses don't want to develop good software.
Instead, we the users are forced to pay for more and more memory and CPU and disk because some rich asshole doesn't want to spend the money on developing good software. The costs are pushed to us. And since resources are now unimaginably expensive, it's still our problem and we still have to foot the bill a million times over.
And on top of that, bug fixes and efficiency gains are never a top priority; only new features and redesigns are pushed forward.
Bizarre how some commenters have rose-colored glasses, as if AI isn't exponential bloat.
Wat?
Seriously though... I think most uses of LLMs are pretty stupid, but it seems like we're in the bubble and the only way people can continue to make money is by doubling down on AI spending. Or at least that's the only way they think they can make money.
So... sorry for leaving you with that impression. Maybe the only way to get to the post AI hype world is to give AI companies everything they want so they fail faster.
"Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders!" ( see https://economicsociology.org/2014/10/07/yes-the-planet-got-... )
Does anyone deserve RAM though?
It makes absolutely no sense to apply the lessons from one into the other.
Each of these units was then given access to an internal "market" and directed to compete with the others for funding.
The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.
It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.
basically "coffee is for closers... and if you don't sell you're fired" as a large scale corporate policy.
The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of TV gameshow competition where candidates do worthless tasks, judged by some arbitrary rules.
That has absolutely no similarity to how Samsung is organized.
Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.
A bit like Toyota putting a GM engine in their car because the Toyota engine division is too self-centered, focusing too much on efficiency.
Or Toyota using a Subaru engine (Scion FRS, Toyota GT86)
Better yet, get a C8 corvette and gap all of the above for a far better value. You can get 20% off msrp on factory orders with C8 corvettes if you know where to look.
They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.
Their point was that service levels are often not tracked as stringently, SLAs become internal money shuffling, and the company as a whole pays the price in lower output/profit. The internal partner being the default allows a degree of complacency, and if you shopped around for a comparable level of service to what's being provided, you could often find it for a better price.
That's hilarious, which phone is this?
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
[1]: https://www.androidauthority.com/samsung-exynos-versus-snapd...
International models tended to use Samsung's Exynos processors, while the ones for the North American market used Snapdragons or whatever.
For the last 10+ years, Apple's iPhones have shipped with about half the RAM of a flagship Android, for example.
RAM pricing segmentation makes Apple a lot of money, but I think they scared themselves when AI took off and they had millions of 4GB and 8GB products out in the world. The Mac minimum RAM specs have gone up too, they're trying to get out of the hole they dug.
Code vs. data: by and large, I bet the content held in RAM takes up the majority of the space. And people have complained about the lack of RAM in iPhones for ages now, particularly with how it affects browsers.
In the past this has resulted in stuff like Samsung Display sending their best displays to Apple instead of Samsung Mobile.
https://www.washingtonpost.com/business/2019/02/28/why-ameri...
(This is not a comment making any judgements about cost or the state of the economy, I was just surprised to find it that high)
If you want, you can add in a 16GB laptop every 36 months, tripling the total to 0.75GB and ~$10 a month. Still, that's multiple times less than the increase in egg price compared to the average consumption.
A typical pattern might be to have two eggs for breakfast (a whopping 120 calories), boiled eggs for lunch/snack (another 60-120 calories), and of course baking, but I will pretend that people don’t bake.
A more typical serving for an adult breakfast might be 3 eggs if not supplemented.
For mom and dad and the little one, you’re now at 35 (2+2+1+2)x5 eggs per week. When your cost goes from $6 (2x18 @3) to $16 (2x18@8) per week, you notice.
Obviously the political discourse around this was not healthy. But eggs suddenly becoming a cost you have to notice is a big deal, and a symbol for all of the other grocery prices that went up simultaneously.
If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.
The peak was actually closer to $8/dozen; my math has been conservative at every step, and the situation is worse than I describe.
"If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.".
Skill issue. Oatmeal is very cheap and filling. The aforementioned yogurt. Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast. A piece of fruit like the perennial classic banana for breakfast. Complaining about egg prices comes from the camp of "I tried nothing and nothing worked".
Paying more for staples that you've eaten your whole life (especially in a boiled-frog way) is much cheaper in time, energy, and mental load than experimenting with how you and your kids might like a bowl of oatmeal prepared.
That said, if you're having trouble making ends meet and you have kids, you don't have much of a choice.
I would have hoped that better access to nutrition information would have led to parents making better choices. Absolutely insane that they're still choosing desserts for breakfast every day instead of high quality whole foods like eggs.
I think I see why you think eggs went up to $8 a dozen now
Eggs are one of the highest protein-per-calorie, nutrient dense foods you can purchase. Up until recently it was cheaper than almost any other staple. When I was growing up (admittedly during a time everything was relatively cheap) my family ate a lot of eggs. We had spreads, we had eggs for breakfast, and eggs were incorporated into dinners in one way or another. I'm not the only one. I don't know anyone born in my cohort that didn't eat eggs regularly.
> Oatmeal is very cheap and filling
It's also nowhere near the nutrition of eggs, and it requires supplementation.
> it's majority cereal or breakfast bars.
While true this is an education issue not a cost issue. We still have at least 3 generations of people having children that were raised in the "eggs are horrible for you" times, including myself.
> Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast.
The average person is obese because of the relative ease of cheap, high-calorie fillers and because good options are more expensive. The price of eggs increasing compounds this. However, I would wager most adults are obese because of high-calorie Starbucks, fast food, and snacks, not because of cereal for breakfast.
> A piece of fruit like the perennial classic banana for breakfast.
Demonstrably worse for you than both cereal and eggs. Once again, this defeats your point and STILL demonstrates that more expensive eggs make nutritionally worse options the only option.
Obviously the net effect was to silo all the different departments so that nobody really knew how the entire product worked, except the few smokers who'd regularly go outside and smoke for 15+ minutes and chat to whoever else was around.
96GB (2x48) DDR5-5x00: was £260, today £1050
128GB (4x32) DDR5-5x00: was £350, today £1500
Wut?
Edit: formatting
https://store.minisforum.com/products/minisforum-motherboard
Can't find it for sale, though. There's also a barebones mini-PC:
https://www.asrock.com/nettop/AMD/DeskMini%20X600%20Series/i...
So much for open markets, somebody must check their books and manufacturing schedules.
It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)
Let’s check their books and manufacturing schedule to see if they’re artificially constraining the supply to jack up the prices on purpose.
For example: https://chipsandwafers.substack.com/p/mainstream-recovery
"Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"
(from june of this year).
The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.
They've been acting like a cartel for a long time now, and somehow they never match the demand even after 18 months of straight price increases. They already have the fabs, the procedures, and everything, so stop acting like they're setting up a brand new fab just to increase throughput.
Demand right now is so high that they'd make more net profit if they could make more dram. They could still be charging insane prices. They're literally shutting down consumer sales - that's completely lost profit.
I have a family member who works in a field related to memory and storage fabrication. At the moment Micron, etc, are running these money printers full time and forgoing routine maintenance to keep the money flowing.
The fact that they’re busy doesn’t hide the fact that they’re known to collude before, and they might even ship parts to phony resellers to keep the price high.
What’s next? A commodity memory chip is going to cost more than a cpu or gpu die?
You want to tame their cartel like behaviors? Just get into their books and it would be clear as day if they’re artificially constraining the supply, and I’m not even talking about spending extra billions.
You cannot manufacture something that modern life depends on and not get government scrutiny.
You can ramp up production in limited capacity, make long term contracts, or pass the manufacturing risk to the buyer. When we needed a vertical stabilizer for a legacy aircraft we paid for an entire production line to be built just to manufacture two tails, so there are tons of ways to do this if you want to be competitive. But instead this is a cartel-like market where manufacturers have colluded before, so they're more likely to collude than spend billions doing anything.
Just open their books and schedules with a competent auditors and see if they’re artificially manipulating things or not.
It's a ridiculous situation and these companies, whoever they are, should be somewhat ashamed of themselves for the situation they're putting us in.
That goes specially for those MF at OpenAI who apparently grabbed 40% of the worldwide DRAM production, as well as those sold in stores.
I wonder how this will impact phone prices.
I'll be honest, I have 0 confidence that this is a transient event. Once the AI hype cools off, Nvidia will just come up with something else that suddenly needs all their highest end products. Tech companies will all hype it up, and suddenly hardware will be expensive again.
The hardware manufacturers and chip designers have gotten a taste of inflated prices and they are NOT going to let it go. Do not expect a 'return to normal'
Even if demand goes back to exactly what it was, expect prices to be >30% higher than before for no reason, or as they would call it, 'market conditions'.
Pricing.
Next up: Nvidia exits the consumer hardware space and shifts fully to datacenter chips.
That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.
And regardless, you could flip it around and ask, what will we do in x years when the next shortage comes along and we have no fabs? (And that shortage of course could well be an imposed one from an unfriendly nation.)
RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier that had a very direct and easy-to-verify cause.
AI is not going away, but there will be a correction and things will plateau to a new higher level of demand for chips and go back to normal as always. There's too much money involved for this not to scale up.
Markets can't adapt overnight to tons of data centers being built all of a sudden but it will adapt.
What will they do when people continue to not pay for this crap and investors demand their pound of flesh? Because uh, nobody's paying for this, and when people are gambling with trillions of dollars...
>Markets can't adapt overnight to tons of data centers being built all of a sudden but it will adapt.
Which data centers?
I pay for 3 different AI products and every person on my team is paying for at least one. Just because some enterprise sales teams rushed to oversell some half-baked AI products they duct taped together doesn't mean there isn't a huge market.
> Which data centers?
Microsoft https://blogs.microsoft.com/blog/2025/09/18/inside-the-world... Anthropic https://www.datacenterknowledge.com/data-center-construction... Twitter https://www.datacenterdynamics.com/en/news/elon-musks-twitte... CoreSite https://www.coresite.com/news/coresite-launches-ny3-data-cen...
Meta, Google, and Oracle are scaling theirs up too
A few hours ago I looked at the RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not - the local price here is now 2.5 times what it was back in 2023 or so, give or take.
I want my money back, OpenAI!
I think we're going to regret this.
The 96GB kit I bought (which was more than I needed) was $165. I ended up buying another 96GB kit in June when I saw the price went up to $180 to max out my machine, even though I didn't really need it, but I was concerned where prices were going.
That same kit was $600 a month ago, and is $930 today. The entire rest of the computer didn't cost that much
And even more outrageous is the power grid upgrades they are demanding.
If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.
Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).
Sometimes that materializes.
Here the narrative is almost the opposite: pay for our expensive infrastructure and we’ll take all your jobs.
It’s a bit mind boggling. One wonders how many friends our SV AI barons will have at the end of the day.
I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.
My state keeps trying to add Data Centers in residential areas, but the public seems to be very against it. It will succeed somewhere and I'm sure that there will be a fee on my electric bill for "modernization" or some other bullshit.
"The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933
And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.