> Z-Day + 15Yrs
> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.
If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?
> Z-Day + 30Yrs
> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.
You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.
> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.
Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first transistor was created in 1947; that's a 32-year span to the 68k. If people have the capacity and need to jump through hoops to keep old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).
Storage. You only need a few hundred working systems to keep a backbone alive. Electromigration doesn't kill transistors if they are off and in a closet.
> You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.
You don't need to make new drives; there are already millions of DVD/Blu-ray devices available. The small microcontrollers on optical drives are made on large node sizes, which also makes them more resilient to degradation.
> they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).
If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.
This kind of just breaks the thought experiment, because without the "why?" of this being vaguely answered, it makes no sense. How do you game out a thought experiment that starts with an assumption that humanity just randomly stops being humanity in this one particular way? What other weird assumptions are we meant to make?
Let's assume we go back to the pre-transistor era, 1946 and earlier. The world then was a very different place, but it was still very modern.
It's too involved to list in detail, but just take a look at what was achieved during WWII. The organization and manufacturing were truly phenomenal. Aircraft production alone during the war was over 800,000 aircraft; manufacturing aircraft at that rate has never been equalled since, and the same goes for ships.
We developed a huge amount of new tech during the war, including the remarkably complex atomic bomb and much, much more.
And we did all this without the transistor, the integrated circuit, CPUs, the internet or even smartphones!
Now consider the planning and organizational difficulties of D-Day, probably the most complex and logistically difficult undertaking ever, without the aid of modern communications, the internet, smartphones and so on, all of which depend on CPUs. Right, that happened too, and it was a total success.
I wonder how a generation brought up in the silicon era would cope if all that were no longer available. It could happen if we had another Carrington Event or one that's even bigger (which has occurred), or say with nuclear EMP events.
WWII Aircraft production: https://en.m.wikipedia.org/wiki/World_War_II_aircraft_produc...
WWII Military production: https://en.m.wikipedia.org/wiki/Military_production_during_W...
It ain't ever going to happen, because people can write these things called books. And computer organization and architecture books already exist; there are many tens of thousands of copies of them. What should be captured in modern computer organization books is the applied-science side of the history up to now and the tricks that made Apple's ARM series so excellent. The other thing is that TSMC needs to document its fab process engineering. Without capturing that niche, essential knowledge, they become strategic single points of failure. Leadership and logic dictate not allowing this kind of vulnerability to fester too deeply or too long.
If you turn off any manufacturing line, your company forgets really quickly how to make what that line made. GE discovered this when they tried to restart a water heater line in Appliance Park.
Remington apparently has no idea what the bluing formula was that they used on their original 1911s.
Colt lost the ability to handfit revolvers.
There could be alternatives like Canon's nanoimprint lithography: https://global.canon/en/technology/nil-2023.html &
https://spectrum.ieee.org/nanoimprint-lithography
There is also https://www.searchforthenext.com, claiming not to need that high-end ASML stuff at all to be competitive.
As reported around 2019 amongst many others like here:
https://eepower.com/new-industry-products/search-for-the-nex...
Maybe it's vaporware, because I'm unaware of anything 'big' produced there. Maybe it's only lack of funding, lack of trust because it's non-standard and 'unproven', or inertia? Who knows?
And finally the forgotten minimal.fab by Yokogawa https://www.yokogawa.com/industries/semiconductor/minimal-fa...
https://www.minimalfab.com/en/ with no outrageous claims about structure size equivalence, but way faster turn-around times for prototyping, and none of the usually necessary investment in all that clean-room tech.
And not to forget the push and incentive China got as 'development help' to be independent :-)
I'm sure they're up to many interesting things in the near future.
Remember that the first 32-bit CPUs were manufactured on >1 µm processes. Never mind 8-bitters or KB-sized memory chips.
Also note that IC design, assembly programming and more can be (and have been) done by hand. Having any kind of compute, no matter how slow by today's standards (a couple of MHz), helps a lot. The same goes for basic applications like text processing, spreadsheets, small databases, software development, etc.
Perhaps there should be more research into how to make small runs of chips cheaply and with simple inputs. That'd also be useful if we manage to colonize other planets.
Or do you mean the circumstances that would lead to this (nuclear war, perhaps) would make us toast?
Civilization is a continuity of discrete points of time.
We were able to enter (so-called) Dark Ages where things were forgotten (e.g., concrete) and still continue, because things were often not very 'advanced': with the decline of Rome there were other stores of knowledge, and after the Black Death society wasn't much beyond blacksmithing and so was able to keep those basic skills.
But we're beyond that.
First off, modern society is highly dependent on low-cost energy, and this was kicked off by the Industrial Revolution and easily accessible coal. Coal is much depleted (often needing deeper mines). The next phase was oil, and many of the easy deposits have been used up (it used to bubble up out of the ground in the US).
So depending on how bad any collapse is, getting things running again without easily accessible fossil fuels may be more of a challenge.
That's an Anglo-Saxon black legend. How do you think boats and trebuchets were made? Navigation in the ocean without trig? Astrolabes? Yeah, sure. Year 600 wasn't the same as 1200.
Read about Alfonso X: https://en.m.wikipedia.org/wiki/Alfonso_X_of_Castile
As long as the algorithmic complexity of food logistics is O(n) or better with respect to population size, I guess.
We did that during a period of peculiar circumstances that won't ever be replicated. Relatively large, distributed population with many different ecological environments that we were already pre-adapted to. A far smaller single-point-failure population that can't just go out and hunt for its food among the vast wildlife might have it pretty rough if industrial civilization were to falter.
Scary how high up this tightrope is.
But tbh I don't see it as at all likely, short of something like nuclear war, which would be the much bigger problem.
I think you also should realize much of the world continues on without bleeding-edge technology - homes are still built, crops are still harvested, and the world goes on.
I don't think it'd be the end of life as we know it.
It’s the ones in factories, power systems, and transportation equipment, among other things.
We've had this happen before of course. There's a ton of things ancient civilizations were doing that we are clueless about. So clueless, that one of the leading theories is that they must have been aided by aliens.
I'm talking about people all over the globe, separated by time, I don't know what your deal is with acting like I'm a white person poo'ing on POC - or how any of the racial/nationality/etc stuff you wrote factors in at all. You're obviously easily triggered and/or need to work on reading comprehension
It always was a matter of commerce and knowledge sharing between distinct races and tribes.
Would be a pretty solid intermediate step to bootstrap automation and expansion in cases where the supply of the "best" fabs is removed (like in a disaster, or where the framework to support that level of manufacturing isn't available, such as your colony example).
The problem wouldn't be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing, where a lot of chips go. Then there is all of healthcare.
Lots of important chips everywhere that aren’t CPUs.
Generators are just big coils of copper. Substations too. Solar won't work without silicon, but anything with a spinning coil of copper would. Voltmeters would need replacing with the old analog versions and humans would need to manually push switches to keep power levels constant, just like in the '50s.
Also, the 10k-year lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000-year figure is a general statement about the modelled failure of ICs from the '60s and '70s, not about the MC68000 in particular (which is at the tail end of that period). There are also plenty of ICs with known-poor lifespans (some MOS chips come to mind; the company, not the transistor structure), though that doesn't reflect on the MC68000.
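For reference, electromigration lifetime is usually modelled with Black's equation, MTTF = A * J^(-n) * exp(Ea / kT). Here's a minimal sketch of the kind of comparison behind "large node sizes resist electromigration"; every number in it (current densities, die temperatures, activation energy, exponent) is an illustrative assumption, not measured MC68000 data:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j_a_per_cm2, temp_k, a=1.0, n=2.0, ea_ev=0.7):
    """Black's equation for electromigration MTTF.
    The prefactor a, exponent n and activation energy ea_ev are
    technology-dependent; these defaults are illustrative assumptions."""
    return a * j_a_per_cm2 ** (-n) * math.exp(ea_ev / (K_B_EV * temp_k))

# Hypothetical large-node part: lower interconnect current density, cooler die.
old = black_mttf(j_a_per_cm2=1e5, temp_k=330)
# Hypothetical modern part: ~10x the current density, hotter die.
new = black_mttf(j_a_per_cm2=1e6, temp_k=360)

# Absolute values are meaningless here (the prefactor is arbitrary);
# only the ratio between the two operating points matters.
print(f"old part lasts ~{old / new:.0f}x longer under these assumptions")
```

With these made-up numbers the ratio works out to a few hundred times, which is the general shape of the argument; it says nothing about whether 10k years is the right figure for any particular chip.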
There's enough information on machine tools and the working of iron to make all the tooling and machinery required to start an assembly line somewhere.
After all, there was an assembly workshop turning out the Antikythera mechanism, and there was a user guide on it. Obviously it wasn't the only one produced at the time.
It is not obvious at all to me. Where are the others like it found?
You can replace known likely culprits preemptively, assuming you can get parts. But dendritic growths aren’t yet a problem for most old stuff because the feature sizes are still large enough. No one really knows what the lifetime of modern 5/4/3nm chips is going to be.
Really depends on brand and purpose but consumer hardware switches do die pretty frequently.
But if you bought something like a C2960 fanless switch I would expect it to outlive me.
So, no.
I work in a medical lab. The company bought a new automated coagulation analyzer. The old machine was shut down (proper shut-down procedure) and kept in storage in case it was needed. They should have replaced the wash fluid with water. This procedure isn't documented because nobody expects that kind of machine to just sit unused for a long time. After a few months we needed to start it again (I can't remember why; I think there was a delivery problem with the reagents for the new analyzer). We couldn't. The washing fluid had dried, and the detergents and other chemicals it contained solidified inside the valves, just like what happens with an inkjet printer left unused. They were all stuck. Repair would have been too expensive and it was scrapped.
I saw this happen with a haematology analyzer too. It was kept as a backup but wasn't needed for a few months. I was able to resurrect that one after several hours of repeated washing.
An electrolyte analyzer is even worse. Keep it turned off for only a few hours and the electrodes will need to be replaced.
I don't think any other advanced industrial machine is any different. You can't just leave them unused for a while and then expect them to work. It's even more problematic if the shut-down procedure isn't done right (which may mean exceeding the documented requirements) or isn't done at all.
My daily driver laptop is a 2012 ThinkPad I literally pulled out of a scrap heap at my local university, but it refuses to die. Moore's law has slowed enough that old hardware is slow but still perfectly capable of running 99% of existing software.
Already existing machines would give us at least one or two decades to restart manufacturing from zero and that is more than enough time to avoid existential problems.
And most computers are under-utilized. The average gaming PC is powerful enough to run the infrastructure for a bunch of small companies if put to work as a server.
It's so easy to think of them as lasting forever.
Something would need to happen to stop / prevent production for about 30 - 60 years.
That's roughly equivalent to the Saturn V engine and Codename FOGBANK, the two examples of technologies that had to be reverse engineered after the fact.
Hypothetically we might choose to stop making new ones if demand dried up significantly.
It could be the case that we finally hit a solid wall in CPU progress, cloud providers demand something they don't have to replace every few years, and the result is some kind of everlasting-gobstopper CPU.
Then as failures fall off, so does demand, and then follows production.
A pretty large drop in global population might see the same result. Labor needs to be apportioned to basic needs before manufacturing.
And because they go into things like dishwashers and cars (and missiles) and stuff that dies for other reasons than chip failure, you always need some supply of them.
Though I guess if we end all wars and make stuff so good that you literally never need a new widget ever and all industry just stops, then I suppose there is such a thing as a perfect design.
We already had a sci-fi story where humanity forgot all the Beatles' songs: https://www.youtube.com/watch?v=-JlxuQ7tPgQ
Could they eventually replicate a CMOS technology? No one doubts this, but the latest litho process took how many years to develop, and only one company anywhere in the world makes those machines? Nearly microscopic molten tin droplets being flattened mid-air so that they radiate a particular wavelength of UV?
That's not something they'll have up and running again in 6 months, and if it were lost, regression to other technologies would be difficult or impossible too. We might have to start from scratch, so to speak.
Vacuum tubes are still made. They’re used extensively in instrument amplification.
But I think this bolsters your point!
If there were some kind of interdiction on silicon (an evil genie or some kind of Butlerian Jihad perhaps?), the market would remember and/or rediscover thermionic emission and throw money/postapocalyptic bartered goods at glassblowers pretty sharpish.
If that status continued, I'm sure we'd see developments in that space in terms of miniaturisation, robustness, efficiency, performance, etc., that would seem as improbable to us as a modern CPU would seem to someone in the no-silicon timeline. You may never get to "most of a teraflops in your pocket, runs all day on 4000mAh and costs three figures" but you could still do a meaningfully large amount of computation with valves.
Savant-tier, obsessive, dedicates his life to it "guy" does it in his garage over a period of how many years, and has succeeded to what point yet? Has he managed even a single little 8-bit or even 4-bit cpu? I'm cheering that guy on, you know, but he's hardly cranking out the next-gen GPUs.
>the market would remember
Markets don't remember squat. The market might try to re-discover, but this shit's path dependent. Re-discovery isn't guaranteed, and it's even less likely when a civilization that is desperate to have previously-manufacturable technology can't afford to dump trillions of dollars of research into it because it's also a poor civilization due to its inability to manufacture these things.
As I said, you probably won't ever get to where we are now with the technology, but then again probably 99.999% of computing power is wasted on gimmicks and inefficiency. Probably more these days. You could certainly run a vaguely modern society on only electromechanical and thermionic gear: you have power switching with things like thyratrons, obviously radios, and there were computers made that way, such as the Harwell WITCH in 1952.
Maybe you don't get 4K AI video generation or petabyte-scale advertising analytics but you could have quite a lot.
For reference, the original 4004 Intel CPU from 1971 ran at 740 kHz, so 52 kHz isn't even enough computing to do a secure TLS web connection without an excessively long wait. The 4004 did not do floating point, however, and it wouldn't be until between the 486 (1989) and the Pentium (1993) that we see 5-10 MFLOPS of performance.
Hmm... I think 9800X should be able to do at least 32 FLOPS per cycle per core. So 1.3 TFLOPS is the ceiling for the CPU. 1/100000 leaves you... 12 MFLOPS.
Then there's the iGPU for even more FLOPS.
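A quick back-of-the-envelope check of those numbers (a sketch only; the per-core throughput and clock are assumptions, and the 1/100000 factor is the "99.999% wasted" figure from upthread):

```python
# Assumed: a Zen 5-class 8-core part with full-width AVX-512,
# i.e. 2 x 512-bit FMA units ~= 32 double-precision FLOPs per cycle per core.
flops_per_cycle_per_core = 32
cores = 8
clock_hz = 5.2e9  # assumed boost clock

peak = flops_per_cycle_per_core * cores * clock_hz
print(f"peak: {peak / 1e12:.2f} TFLOPS")                   # ~1.33 TFLOPS
print(f"1/100000 of peak: {peak / 1e5 / 1e6:.1f} MFLOPS")  # ~13 MFLOPS
```

So roughly 1.3 TFLOPS at the top end, and on the order of 12-13 MFLOPS once 99.999% of it is discarded, which matches the ballpark above.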
Many things we now take for granted would indeed be impossible. I suppose the good news is that in some electropunk timeline where everyone had to use tubes, your TLS connection might not be practical, but the NSA datacentre would be even less practical. On the other hand, there'd be huge pressure on efficiency in code and hardware use. Just before transistorisation, amazing things were done with tubes or electromechanically, and if that had been at the forefront of research for the last 70 years, who knows what the state of the art would look like. Strowger switches would look like Duplo.
Probably there would still be a lot of physical paperwork, though!
Comparisons to old technology is just something I do for fun, don't read too much into it. :)
Fun fact: a USB-C to HDMI dongle has more computing power than the computer that took us to the moon.
As far as the NSA being even less practical, they're among the few who have the staff that could eke every last cycle of performance out of what remained. Maybe the Utah datacenter wouldn't work, but Room 641A long predates that.
Another view on this topic is https://gwern.net/slowing-moores-law
The few industries that push computing out of need would suffer. Certain kinds of research, 3D modeling.
But most of what we use computers for in offices and our day-to-day should work about as well on slightly beefed up (say, dual or quad CPU) typical early ‘90s gear.
We're using 30 years of hardware advancements to run JavaScript instead of doing new, helpful stuff. 30 years of hardware development to let businesses save a little on software development while pushing a cost onto users that's a significant multiple of what was saved.
Early '90s Intel was the 486 at 33 MHz. It barely had enough performance to run the TCP/IP stack at a few hundred KB/sec, using all of the CPU just for that task. I think you forgot how slow it was. The Pentium II is where it starts to get reasonably modern, in the late '90s. The Pentium Pro (1995) was their first with multiprocessor support. Things were moving so fast back then that early/mid/late '90s was like comparing decades apart at today's pace of improvement.
Not so far removed from a multi-CPU Pentium at 90 or 100MHz, from the very early Pentium days.
I guess what I had in mind was first-gen Pentiums. They’re solidly in the first half of the ‘90s but “early 90s” does cover a broader period, and yeah, 486 wouldn’t quite cut it. They’re the oldest machines I can recall multitasking very comfortably on… given the right software.
Pentium 66 - 1993
Pentium 90 - 1994
Pentium 166 - 1996
More than doubled the performance in 3 years. Two orders of magnitude from 1990 - 2000.
There was no multi-CPU Pentium. Not until the Pentium Pro in 1996.
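A rough sanity check on those growth figures (a sketch; the 18-month doubling cadence is an assumption in the Moore's-law spirit, not a benchmark):

```python
# Clock scaling alone, from the list above: Pentium 66 (1993) -> Pentium 166 (1996).
print(f"clock ratio: {166 / 66:.1f}x over 3 years")   # ~2.5x

# "Two orders of magnitude from 1990 - 2000", assuming performance
# doubles roughly every 18 months across the decade:
years = 10
doubling_period = 1.5  # years
print(f"compounded: ~{2 ** (years / doubling_period):.0f}x over {years} years")  # ~102x
```

Clock alone more than doubles over those three Pentium generations, and an 18-month doubling cadence compounds to about 100x over the decade, which is where the "two orders of magnitude" figure comes from.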
The NVIDIA DGX B200 is already selling for half a million. The nearest non-TSMC-produced competitor doesn't come close. Imagine no more supply!
If Taiwan ceased to exist, that would put us a decade back.
The gap isn't a decade, more like 12-18 months.
Also, TSMC has 5nm production in the US. There are actual people with know how of this process in the US.
Other companies are using the same equipment (Samsung and Intel) but TSMC has deeper expertise and got more out of the same equipment so far.
TSMC is running a successful business but they're not the only customers of ASML.
It would be a sad thing but not as sad as everything else that would happen in a war.
https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat...
CPUs exist at the center of such a deeply connected mesh of so many other technologies that the knowledge could be recreated (if needed) by looking at the surrounding tech. All the compiled code out there as sequences of instructions; all the documentation of what instructions do, of pipelining; all the lithography guides and die shots on rando blogs; info in books still sitting on shelves in public libraries... I mean, come on.
Each to their own!
Generally, the true problems in life aren't forgetting how to manufacture products that are the key to human life.