Japanese electronics store pleads for old PCs amid ongoing hardware shortage
151 points
23 hours ago
| 9 comments
| tomshardware.com
| HN
left-struck
23 hours ago
[-]
I was in a Hard Off (Japanese used electronics store) just a week ago and found tens of 8GB DDR4 RAM sticks for around 1600 yen each (something like $10 USD). Some were ECC, but others looked like RAM modules from office PCs or something. It was in a rural area, but I still would have thought they knew about the price hike. I guess not. Anyway, I didn't buy any, so I don't know if they were working.
reply
walthamstow
23 hours ago
[-]
Interesting name, assume it's related to the bookstore Book Off?
reply
doall
21 hours ago
[-]
Some people get confused, but they are owned by different companies. The founders know each other and have some business relationship, and the company behind Hard Off owns shares in the company behind Book Off.
reply
tecleandor
22 hours ago
[-]
Yep, it's a chain with a bunch of different brands.

Biggest ones are Book Off (books, comics...), Hard Off (electronics, computers, musical instruments...) and Hobby Off (toys, collectibles, video games...).

They even have a Liquor Off! (not second-hand, just discount/overstock)

reply
nereye
19 hours ago
[-]
A couple more: Off House (household goods) and Garage Off (car stuff).

Also: Mode Off (fashion).

See https://www.hardoff.co.jp/shop/brand/offhouse/

reply
fortran77
18 hours ago
[-]
They should open one for spicy Jamaican chicken
reply
jnaina
13 hours ago
[-]
Glad that they have not diversified into selling Hoisting Jacks.
reply
markoman
13 hours ago
[-]
well played!
reply
testplzignore
22 hours ago
[-]
Yeah, same company. "Off" likely meaning discounted.
reply
Brian_K_White
22 hours ago
[-]
Or like a bake-off?
reply
shen
22 hours ago
[-]
They are part of the same brand.
reply
m3kw9
19 hours ago
[-]
Hard On would have been better, I don't think they are aware
reply
wrxd
23 hours ago
[-]
I'm all for re-using old but still perfectly usable hardware. Hopefully this will also lead to some optimisations on the software side.
reply
kasane_teto
23 hours ago
[-]
it probably won't
reply
nehal3m
23 hours ago
[-]
OpenAI buys all the RAM driving up the price so their models can shit out poorly optimized software so I have to buy more RAM. Awesome.
reply
embedding-shape
22 hours ago
[-]
Sure, except this time also add the US pressuring Korean companies not to sell their old equipment to Chinese manufacturers, so that supply could actually keep up. But no no, it's all OpenAI's fault for behaving like the capitalistic swine they are. Both suck, but one has a long-lasting impact; the other is just what capitalists have always done and will continue to do.
reply
georgefrowny
18 hours ago
[-]
And then when China eventually produces not only RAM but the equipment to make it, it'll be shocked Pikachu all round.
reply
lysace
22 hours ago
[-]
Agreed - it's more likely to lead to poor user experience than anything else. Most end-user facing software/service companies will probably bet on the DRAM price peak being temporary.

Software takes a lot of time to build. Codebases live for decades. There's often an impossibly large cost in starting over with a less wasteful architecture/language/etc. Think going from an Electron/Chromium app to something built using a compiled language and native OS GUI constructs that uses 10x fewer resources.

reply
Workaccount2
22 hours ago
[-]
The impossibly large cost is the difference between hardware and software returns.

Hardware by nature forces redesigns, whereas with software it's always possible to just keep building on top of old bad designs, and so much cheaper and faster to do so. That's why hardware is 10,000x faster than 30 years ago, while even simple word processors are debatably faster than 30 years ago. Maybe even slower.

reply
immibis
22 hours ago
[-]
Hardware isn't much better actually. There isn't a good way I can show you this, but every x64 CPU contains an entire ARM CPU whose job is to initialize the x64 CPU. And of course it runs two operating systems - TrustZone and Minix.
reply
Krutonium
21 hours ago
[-]
It's even worse than that.

The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32- and then 64-bit mode.

So in the first couple of seconds after power-on, your CPU is at various points ARM, i386, x86, and x86_64.

reply
palmotea
21 hours ago
[-]
> The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32- and then 64-bit mode.

Well, what if I want to run a 16-bit OS?

Also, I wonder if the transistor count of a literal entire 8086 processor is so small relative to the total that they just do that.

According to https://en.wikipedia.org/wiki/Transistor_count#Microprocesso...:

    1978 Intel 8086:         29,000

    2021 Rocket Lake: 6,000,000,000
So you could fit 200,000+ 8086s on that not-so-cutting-edge silicon.
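The ratio checks out; a quick sketch using only the two figures quoted from the Wikipedia table above (a rough budget comparison, ignoring that modern transistors aren't interchangeable with 1978 ones):

```javascript
// Rough ratio of a 2021 Rocket Lake transistor budget to a 1978 Intel 8086,
// using the counts quoted from Wikipedia's transistor-count table.
const i8086 = 29_000;
const rocketLake = 6_000_000_000;

const copies = Math.floor(rocketLake / i8086);
console.log(copies); // 206896 -- i.e. 200,000+ 8086s, as claimed
```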
reply
immibis
19 hours ago
[-]
You should use a 16-bit CPU then, or an emulator.

Compatibility mode doesn't work by having a separate 16-bit core. It's random bits of spaghetti logic to make the core work like a 16-bit core when the 32-bit flag isn't set.

reply
latentsea
21 hours ago
[-]
> So in the first couple of seconds after power-on, your CPU is at various points ARM, i386, x86, and x86_64

This is the first I'm learning about this, and I'm curious why it needs to be the case. It seems so wild that it works this way, but I'm sure there's a logic to it.

reply
pwg
19 hours ago
[-]
Read up on the Intel Management Engine: https://en.wikipedia.org/wiki/Intel_Management_Engine

It began life as an "out of band" way to administer servers so that an ops. team could do everything (other than actual hardware changes) remotely that would otherwise need a person to be standing in front of the server in the datacenter poking commands into a keyboard.

It then grew in responsibilities to also support the "secure boot" aspect of system startup, and beyond some Intel CPU version point (I do not remember which point), it exists in every Intel CPU produced.

reply
immibis
21 hours ago
[-]
Because

> it's always possible to just keep building on top of old bad designs, and so much cheaper and faster to do so

reply
ChoGGi
20 hours ago
[-]
Cheaper to keep it than to fix all the bugs that may come from removing it?
reply
lysace
21 hours ago
[-]
How much overhead (in terms of e.g. transistor count or silicon space) does this add? Surely at most it's a single digit percentage?
reply
immibis
19 hours ago
[-]
There isn't a separate 8086 core in every x64 core. The whole core has "if/then" spaghetti logic scattered throughout to alter its behaviour based on being in 16-bit mode.

At best, they might have been able to confine the needed logic patches to the instruction-decoding front end.

reply
UltraSane
19 hours ago
[-]
This is why Intel wants to remove the 16- and 32-bit modes and make 64-bit-only CPUs.
reply
jerrysievert
13 hours ago
[-]
I remember being flabbergasted when I worked at the Open Source Development Lab and we got our first Itanium system in: a multi-core, multi-rack NEC system with its own Windows PC that you booted up in order to get to Linux.
reply
graemep
19 hours ago
[-]
I know IME runs on Minix but that is on a separate 32 bit x86 processor, AFAIK.

What does the ARM CPU do?

reply
serpent
20 hours ago
[-]
Really? That’s like a pony motor! :)
reply
Hamuko
22 hours ago
[-]
Has anyone noticed their employer actually cutting back on employee hardware purchases? Because if new laptops are still being handed over to developers on a 2–4 year cycle, then probably not.

Game developers might have to do something though if high-end GPUs are going to end up being $5000.

reply
wincy
22 hours ago
[-]
At least so far the RTX 5090 seems to be available and at the same price it’s been at for the past six months (around $3000). I’m not sure when you’d see GPUs affected by the RAM price increases.
reply
isk517
20 hours ago
[-]
I'm pretty sure GPUs have their own RAM installed that is usually higher spec than the motherboard RAM.
reply
mananaysiempre
20 hours ago
[-]
Higher in some respects (bandwidth), lower in others (latency, even though ordinary DDR5 is already no speed demon there and LPDDR5 is worse). At least from the spec sheet, these kinds of RAM are so different that I don’t really understand how demand for one can cause a shortage of the other, unless they are competing for the same manufacturing lines.
reply
tverbeure
19 hours ago
[-]
FWIW: GDDR is not higher latency than DDR. It just seems that way because the GDDR interface clock is much higher, so the number of clock cycles is higher too. But in terms of wall-clock time, the latency is very similar.

Which makes sense: the latency is determined by the underlying storage technology and the way that storage is accessed, which is the same for both.
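To make the cycles-vs-wall-clock point concrete: absolute latency is cycles divided by clock frequency, so more cycles at a higher clock can come out to the same time. A sketch with illustrative round numbers (the CL/clock pairs are made up for the example, not taken from any datasheet):

```javascript
// CAS latency in nanoseconds = cycles / clock frequency.
function casLatencyNs(clCycles, clockMHz) {
  return (clCycles / (clockMHz * 1e6)) * 1e9;
}

// Fewer cycles at a lower clock vs. more cycles at a higher clock:
const ddrStyleNs = casLatencyNs(16, 1600);  // ~10 ns
const gddrStyleNs = casLatencyNs(20, 2000); // ~10 ns

// Similar wall-clock latency despite the different cycle counts.
console.log(ddrStyleNs.toFixed(1), gddrStyleNs.toFixed(1));
```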

reply
kube-system
18 hours ago
[-]
They are competing for the same manufacturing lines.

RAM manufacturers are switching lines over from DDR to make HBM.

reply
wolvoleo
15 hours ago
[-]
HBM works really great in GPUs too. In fact if I have to spend 5 grand on one it had better come with HBM lol.
reply
duffyjp
18 hours ago
[-]
Absolutely.

I've declined the refresh I'm overdue for. My 2021 model MBP has 32GB and a 1TB SSD. They're currently giving out the base model Air: 16GB and 256GB. No thanks.

We used to get whatever was most appropriate for our role, now we get whatever is cheapest.

reply
UltraSane
19 hours ago
[-]
Sure, every single app will be a separate instance of Chrome.
reply
2OEH8eoCRo0
22 hours ago
[-]
So much software optimization, like rolling out new codecs that aren't accelerated on old but usable hardware.
reply
xnx
23 hours ago
[-]
The upside of high prices (GPU, RAM, disk, etc.) is that existing resources get better utilized.
reply
2OEH8eoCRo0
23 hours ago
[-]
Hot take: it would be better for humanity if computers didn't get faster every year.
reply
seiferteric
22 hours ago
[-]
But energy efficiency is improving also.
reply
entropicdrifter
21 hours ago
[-]
Hardware energy efficiency is, but software has been getting less efficient for decades, at least on average.
reply
Aurornis
20 hours ago
[-]
Consumer PCs and laptops spend most of their time idle.

Idle power usage is what matters.

The 15 seconds it takes to launch Discord and install updates isn't going to be driving the overall efficiency of your PC.

reply
zootboy
19 hours ago
[-]
> Consumer PCs and laptops spend most of their time idle

Not when Windows gets its grubby mitts on them. I will frequently hear the fans spin up on my Win10 laptop when it should be doing nothing, only to find the Windows Telemetry process or the Search Indexer using an entire fucking CPU core.

reply
2OEH8eoCRo0
19 hours ago
[-]
In which case the screen likely uses the most power right?
reply
retSava
19 hours ago
[-]
It's like with cars - better-performing drivetrains (et al.) are used to increase the power envelope instead of lowering fuel consumption, since that allegedly leads to more sales.
reply
wat10000
18 hours ago
[-]
It really isn't. I have a pocket-sized device that would utterly thrash a supercomputer from a couple of decades ago, and it goes a day or two on a 20Wh battery. Going full blast it'll consume maybe 25-30W, which is less than the idle power consumption of far less powerful devices from not all that long ago.

Incidentally, cars are also a lot more fuel efficient these days than they used to be.

reply
garbawarb
21 hours ago
[-]
Which makes it easier to consume more.
reply
rilindo
23 hours ago
[-]
Another hot take: maybe we will see a spike in compiled languages like Go, Rust, and WASM over Python, Ruby, and Node.
reply
999900000999
22 hours ago
[-]
To be fair, we could also just optimize the runtime engines for interpreted languages.

I do enjoy golang, but Rust gives me nightmares. I make my living in higher level languages.

When I started learning to program, JavaScript was just starting to gain popularity outside of the browser. It was the first language I could actually grasp, and I largely thank it for giving me a career.

No more evictions for me!

The only real downside to JavaScript being used as a tool for native apps with stuff like Electron is that it eats RAM. Everything needs to ship a full Chrome binary.

But if we go back to native applications, we don't get things like quality Linux ports. If you had told me 15 years ago that Microsoft would create the most popular IDE on Linux, I'd have assumed you had misspoken.

reply
memoriuaysj
23 hours ago
[-]
we can use AI to rewrite everything in Rust

this way all the RAM that AI data centers scoop up will be used to lessen demand for RAM that those same datacenters created

net-zero RAM!

reply
supertrope
22 hours ago
[-]
Are you selling renewable memory offset credits? My company is seeking to burnish our ESG reputation.
reply
shellwizard
22 hours ago
[-]
You've got to [download more ram](https://downloadmoreram.com/)
reply
baxtr
23 hours ago
[-]
Yet another hot take: we won’t see any of that. Instead users will simply get used to waiting.
reply
kitsune1
22 hours ago
[-]
This.

Node.js and Python were used in 2012; why is now any different?

reply
Dwedit
22 hours ago
[-]
I actually compared WASM to JavaScript for a particular integer-math-heavy task. For a single run, JavaScript beat out WASM because WASM had a lot more setup time. After running both 1000 times, they were almost equal in runtime.

Yes, even though the JavaScript was written using doubles and the WASM was written using 64-bit ints. It just means that it's possible to write optimized JavaScript (mainly by reducing object allocations - reuse objects instead).
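The allocation-reduction trick being described might look something like this (a hypothetical hot loop, not the parent's actual benchmark):

```javascript
// Allocation-heavy: a fresh object every iteration keeps the GC busy.
function sumSlow(n) {
  let acc = { value: 0 };
  for (let i = 0; i < n; i++) {
    acc = { value: acc.value + i }; // new object per iteration
  }
  return acc.value;
}

// Allocation-free: one object mutated in place. The object's shape never
// changes, so the JIT keeps the property access fast, and nothing becomes
// garbage inside the loop.
function sumFast(n) {
  const acc = { value: 0 };
  for (let i = 0; i < n; i++) {
    acc.value += i; // reuse instead of reallocating
  }
  return acc.value;
}
```

Both return the same result; the difference only shows up as GC pressure and throughput at large `n`.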

reply
eska
12 hours ago
[-]
Your mental model of integer vs. double performance sounds decades out of date. I'd suggest reading up on instruction performance on Agner Fog's site; it should be eye-opening.
reply
testdelacc1
22 hours ago
[-]
A benchmark of adding numbers doesn’t tell you how it performs on real world websites and codebases. I wouldn’t be surprised if JavaScript was still very competitive, simply because of how good V8 is, but I don’t think we can conclude anything from your benchmark.

Of course it is always possible to write highly optimised code. But that’s not what people actually do, because of time, skill and maintenance constraints. Here’s a case study: in 2018 Mozilla ported some code from JS to Rust + WASM and got a 6x speed up [1]. An expert in V8 responded to this with highly optimised JavaScript, saying Maybe you don't need Rust and WASM to speed up your JS [2]. Both articles are worth reading! But it is worth remembering that it’s a lot quicker and easier to write the code in #1 than #2 and it is easier to maintain as well.

[1] - https://hacks.mozilla.org/2018/01/oxidizing-source-maps-with...

[2] - https://mrale.ph/blog/2018/02/03/maybe-you-dont-need-rust-to...

reply
Dwedit
9 hours ago
[-]
It wasn't some dummy "add numbers" loop; this was doing math (multiply-add) on large 336-bit integers.

Performance sucked when I used native JavaScript BigInts. When I made my own BigInt using an array of doubles and pretended that the doubles were 48-bit integers, performance was much better. Using the arrays meant that all allocation of temporary values completely stopped. I had to write my own multiply-and-add function that would do bigint = bigint * 48-bit number + other bigint + other 48-bit number.
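A sketch of that limb-array idea, scaled down to 24-bit limbs so every intermediate product provably fits in a double's 53 bits of exact integer precision (the parent's 48-bit limbs need extra product-splitting tricks not shown here; `mulAddSmall` is a hypothetical name):

```javascript
// A big integer as an array of doubles holding 24-bit limbs, least
// significant first. limb * mul < 2^48, so every product stays exact.
const BASE = 2 ** 24;

// In place: limbs = limbs * mul + add, with mul, add < 2^24.
// No temporary objects are allocated inside the loop -- the array is reused.
function mulAddSmall(limbs, mul, add) {
  let carry = add;
  for (let i = 0; i < limbs.length; i++) {
    const t = limbs[i] * mul + carry; // exact: stays below 2^53
    limbs[i] = t % BASE;              // low 24 bits stay in this limb
    carry = Math.floor(t / BASE);     // high bits ripple upward
  }
  if (carry > 0) limbs.push(carry);   // grow only when the value overflows
  return limbs;
}

// Building 123456789 a few digits at a time:
const n = [0];
mulAddSmall(n, 1000, 123);
mulAddSmall(n, 1000, 456);
mulAddSmall(n, 1000, 789); // n now encodes 123456789 across two limbs
```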

reply
bluGill
22 hours ago
[-]
V8 means JavaScript can be fast. However, no amount of optimization can get around inefficient code. There is only so much optimizers can do about too many layers of abstraction, calculations that are done but not needed, and nested loops. Someone needs to step back once in a while and fix bottlenecks to make things fast.
reply
CamperBob2
20 hours ago
[-]
Really? At what point should we have drawn the line?
reply
2OEH8eoCRo0
20 hours ago
[-]
I'm typing this on my primary PC with a twelve year old CPU.

Make the cutoff 2026. If you really need more cycles than we have today to solve a problem you're doing something wrong! Stop creating waste and forcing us to buy new trash all of the time.

reply
CamperBob2
20 hours ago
[-]
Sadly, I wouldn't be surprised if you get your wish. The consensus among those with power and influence seems to be that there is now too much computing power in the hands of the common people -- or at least too much RAM -- and it's time to bring the fire back up the mountain.
reply
numpad0
19 hours ago
[-]
That's where planned obsolescence comes into play!
reply
ToucanLoucan
20 hours ago
[-]
Gonna need a great big citation on that, superchief. I recently had to upgrade a friend's kiddo's PC because Discord simply could not function with a MERE 8GB of RAM.
reply
Aurornis
20 hours ago
[-]
> I recently had to upgrade a friend's kiddo's PC because Discord simply could not function with a MERE 8GB of RAM.

I have an old laptop with 8GB of RAM and an ancient CPU that I haul around when I want something small for basic work. I can run Discord, Visual Studio Code, and Chrome just fine.

Something else was going on with that PC.

Or the kid did an excellent job of socially engineering his parents into an upgrade.

reply
graemep
19 hours ago
[-]
8GB is enough for a lot of things in my experience too. Either something was wrong, or Discord on top of a whole lot of other things was the problem.
reply
nobodyandproud
21 hours ago
[-]
I hung on to my 15 (?) year-old Intel motherboard, CPU, and 16 GB of RAM, mostly because of e-waste guilt. I cannot believe this has value, but here we are.

I also wish I had built a new gaming rig during the summer last year.

reply
rwyinuse
18 hours ago
[-]
Almost any working PC has some value. I've sold nearly 20-year-old Intel Core 2 Duo systems, old 1TB hard drives, and lots of other old components for about 10-20 dollars each. My primary gaming PC is from 2011.
reply
yurishimo
19 hours ago
[-]
Prices will come back down. That's the beauty of PC gaming.
reply
nirav72
10 hours ago
[-]
PC gaming is a small single-digit percentage of the total volatile memory market. Neither Samsung, Micron, nor Hynix currently has any incentive to increase production to address the shortage and lower prices for that segment of the market. It's just not a money maker for them.
reply
marak830
11 hours ago
[-]
I'm in the same boat. I almost pulled the trigger 8 months ago on spending $4k on a powerful new system (with 128GB of RAM)... boy am I kicking myself now.
reply
freetime2
20 hours ago
[-]
I have an old PC that I built that I've been meaning to take to Hard Off. I wonder if it would be better to sell the parts individually or as a single unit. For reference, it has a Ryzen 9 3900X, 64 GB of DDR4, an RX 5700 XT graphics card, and a 512 GB NVMe SSD. Probably not worth much anymore, and I'll probably just sell it as-is to save myself the effort of taking it apart. But Hard Off occasionally surprises me with how much they pay for a piece of old gear.
reply
Aurornis
20 hours ago
[-]
When you said you had an old PC I didn't expect you to list parts that were built in 2019.

That's a perfectly fine modern PC. Similar to or faster than what some of my casual gaming friends use.

The RAM alone would probably sell for $350 or more.

reply
freetime2
18 hours ago
[-]
Yeah it’s still totally usable, but I just haven’t booted it up more than a handful of times in the past couple years. I built it for software development on Linux (with a little extra graphics capability for occasional gaming) in the era when Apple was making those terrible “touchbar” keyboards. But have since switched back to a MacBook (and a Switch 2 for gaming), and now it’s just taking up space.

Wow RAM prices have gotten absolutely insane.

reply
khedoros1
17 hours ago
[-]
Wow. Your old PC is more powerful than any computer I own. Faster version of my CPU, double the RAM, newer generation GPU. A fraction of the storage, though. I built it in 2020 to replace the desktop that I built in 2008.
reply
freetime2
17 hours ago
[-]
That's part of the reason I want to sell it - and hopefully get it into the hands of someone who can give it a second life before it becomes e-waste. I feel a bit guilty every time I think about it sitting and collecting dust.

The specs are still more than enough for any of the development I do. My main issue is just the form factor - a laptop is so much better for my current situation. The power consumption also kind of bugs me - things have improved a LOT in that regard since 2019 - although it's kind of nice for heating my office in winter. It also doesn't help that I built it with a full ATX motherboard and a giant case (Fractal Design Define R6), which is kind of ugly and takes up a ton of space.

reply
pamcake
17 hours ago
[-]
If you put a contact method in your HN profile, I think I have a buyer for you that would be happy.
reply
lazylizard
9 hours ago
[-]
Why do they want old RAM or hard disks or VGA cards? Let's say I've got DDR2 - what use is that unless you have a Core 2 Duo?
reply
ekianjo
21 hours ago
[-]
That they want to buy it from you for peanuts is what's missing from the article. Sofmap buys your stuff at 1/10 of its actual value and then sells it back to people at 5-6x what they paid for it.
reply
ktallett
22 hours ago
[-]
I have never found Japan to be the home of gaming PCs anyway; it isn't quite like Seoul in that respect. I have shopped in Akihabara frequently over the last decade and noticed some PC-gaming-exclusive shops pop up and also disappear, with their range of stock varying too.
reply
numpad0
20 hours ago
[-]
Akihabara was where runoffs and spillovers from cutting-edge electronics accumulated until the ~2010s. There used to be tons of failed experiments and EOL'd enterprise gear washing up. After that, supplies shifted to Taiwan and then to Shenzhen in China.

As far as gaming is concerned, the "gaming" parts of Akihabara mainly concerned locally produced pastel-toned 2D slideshow pornographic games ("visual novels"), the genre that led to gacha games like FGO. The local populace is horrible at handling 3D first-person content in general, and that never helped.

Non-console gaming in Japan is growing somewhat, but a lot is also going to phones, namely Genshin. So where the trend is headed is still pastel-toned soft-porn games without much PvP.

reply
ktallett
20 hours ago
[-]
Phone games seem to have been popular for a good decade. I find the extravagant advertisements in stations like Shinjuku are always rather impressive, but I never quite get into any of those games, like Genshin and Honkai.
reply
numpad0
19 hours ago
[-]
Those are social games. They don't make sense for someone not in a player community for a specific title on Twitter/Discord or in physical classrooms/breakrooms.
reply
ktallett
18 hours ago
[-]
That is a good point. I guess they are WoW-esque but don't have in-app chat in the same way, and I am not in that community. Most of my Japanese friends are arcade types, and I would go and play driving games with them, like Initial D.
reply
crims0n
21 hours ago
[-]
I am not sure what is driving it, but PC gaming has had some explosive growth in Japan recently. I think in 2024 it was up to 15% of market share.
reply
numpad0
20 hours ago
[-]
VTubers. Console games are more complicated in terms of permission to stream them.
reply
jama211
21 hours ago
[-]
In my experience Akihabara was never great for PC hardware, but other places in Tokyo are. Osaka also had some good spots for PC hardware, even in the stereotypical areas like Den Den Town.
reply
ktallett
20 hours ago
[-]
Oh, I always preferred Den Den on the few occasions I have visited Osaka. I find it better value and less glossy. That said, I did love a used ThinkPad shop around the back of Sofmap in Akiba. Very good value. Captain PC or something, if I remember rightly. It shut, I think in 2021, which was sad.
reply
m3kw9
19 hours ago
[-]
A Mac/MacBook would be good in this environment.
reply