Computers Reduce Efficiency: Case Studies of the Solow Paradox (2023)
97 points
by gtt
16 days ago
| 22 comments
| scottlocklin.wordpress.com
virtue3
13 days ago
[-]
I think this article drastically downplays how much more complicated designs are now than they were before.

I don't believe it's computers that are to blame; I believe it's a complexity-nightmare problem.

We have much tighter tolerances for everything now; everything "does more" and relies on more components.

Back when we used pen and paper to create military vehicles, it was mostly JUST about performance and completing the objective. There weren't thousands upon thousands of other requirements and features (whether or not this is a good thing is debatable).

reply
mcphage
13 days ago
[-]
> Back when we used pen and paper to create military vehicles, it was mostly JUST about performance and completing the objective. There weren't thousands upon thousands of other requirements and features (whether or not this is a good thing is debatable).

It makes me wonder if this piling-on of requirements is also enabled or encouraged by computers. I agree that things are becoming more complex, but I’m thinking computers might be partially to blame for that as well.

reply
giantrobot
13 days ago
[-]
> It makes me wonder if this piling-on of requirements is also enabled or encouraged by computers.

You can build a radar with pen and paper but you can't build an effective stealth bomber with the same. You can't build guided weapons without computers. You can't effectively build hyper-accurate guidance and navigation without computers.

Without computers you're stuck at late-'40s, early-'50s weapon systems. If everyone is stuck at that level, you've got some amount of balance in capabilities. The first side/power to apply computers to increase capabilities has a significant advantage over all competitors.

Even if the availability of computers makes things more complex the increased capability is often worth the complexity in aggregate.

The first Gulf War is a good object lesson. Iraq had a large and capable military that was several technological generations behind its opponents. The Coalition forces could operate at night with impunity, had unbroken communications behind enemy lines, and could operate in enemy airspace largely uncontested. If the two sides had been at the same level of technology, with the same dispersion of high technology through the ranks, the Gulf War could easily have been fought to a stalemate or worse.

reply
PopAlongKid
13 days ago
[-]
The U.S. tax code is certainly far more complex than it would otherwise be if not for computers.
reply
chiggsy
13 days ago
[-]
No, the complexity comes from people: the taxed, those who impose the taxes, and those who manipulate the legal process for relief. Computers actually give the sums a chance to be somewhat correct.
reply
hulitu
13 days ago
[-]
> everything "does more"

The Windows 2000 GUI was much more complex than the Windows 10 or 11 GUI. Yet Windows 10 and 11 have difficulty painting the screen (glitches, black screens for a second, and so on).

reply
paulryanrogers
13 days ago
[-]
W2K didn't support 3D GPUs. IIRC its hardware support was quite limited, at least at launch. I also remember having to reboot after every driver install or update.

Which isn't to say modern Windows shouldn't do better. Its quality seems to have dropped dramatically since they started outsourcing QA to the fans.

reply
p_l
13 days ago
[-]
The number of pixels that had to be rendered and pushed, as well as the weaker security model in Windows 2000, meant that just pushing pixels was simpler than it is today. The Windows 11 graphics path, even for the same application code as on Windows 2000, is far more complex, and arguably most of that complexity is warranted and needed.

Otherwise you get unfortunate side effects, like a simple text editor using a surprising amount of CPU and lagging...

reply
mrob
13 days ago
[-]
The main benefit of 4K displays is sharper vector fonts. It's very important to imitate the exact shapes of print fonts, so despite sharp fonts already being available for decades in bitmap form, we've got to push those pixels.

Likewise, it's very important to know the exact Z-order of the window stack at all times, despite only interacting with one at a time. This means it's absolutely necessary to render translucent drop shadows for each window, which obviously need some Gaussian blur. A simple blitter won't cut it. And better add some subtle translucency to the window decorations too, just in case the shadows are obscured.

Don't forget the mandatory rounded corners, gradient fills, and animations. How can the user know the computer's actually working if things aren't sliding about the screen?

Of course, it's important to remove useless features, like theming. If users are permitted self-expression they might start getting the wrong idea about who really controls the machine.

reply
p_l
12 days ago
[-]
While I'm similarly bitter about how things have effectively gotten worse, the actual driver path, assuming the same user-space code, involved a significant increase in complexity for good reasons.

Back in the Windows NT 4.0-5.2 days, GDI would draw directly to VRAM. Fast and simple, but prone to rendering glitches that left a corrupted screen until something redrew the area.

The number of pixels was far lower, and we were already using a lot of vector fonts at the time anyway. With higher resolutions, even when you scale by an integer factor, you need a much faster blitter and new caching and rendering methods. While GDI had reasonably good hooks for caching, they don't necessarily map well onto GPU architectures, and on-GPU blitting is very different from the old Windows 2D acceleration architecture, which worked fine with GDI and lower resolutions.

Both for security reasons and to prevent glitches, and honestly also to handle caching and rendering better on modern GPUs, you need a layer of indirection between GDI and the GPU. Once you have shaders rendering each window's contents as a texture on a triangle strip of two triangles, adding blur or transparency is close to zero cost, unless you have a really resource-constrained system (and then you have other issues, really).

And Windows has had to track the exact Z-order ever since Windows 2.0 introduced overlapping windows, otherwise the painter's algorithm and GDI caching get confused (the Z-order is used to know exactly when to send WM_PAINT to which window and with what parameters).
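
(To make the Z-order point concrete, here is a minimal painter's-algorithm repaint sketch in Python. Rect, Window, and invalidate_rect are illustrative stand-ins of my own, not the actual Win32 API, which uses WM_PAINT and per-window update regions.)

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: int
        y: int
        w: int
        h: int

        def intersect(self, other):
            # Overlapping rectangle of self and other, or None if disjoint.
            x1 = max(self.x, other.x)
            y1 = max(self.y, other.y)
            x2 = min(self.x + self.w, other.x + other.w)
            y2 = min(self.y + self.h, other.y + other.h)
            if x2 <= x1 or y2 <= y1:
                return None
            return Rect(x1, y1, x2 - x1, y2 - y1)

    @dataclass
    class Window:
        title: str
        rect: Rect

        def on_paint(self, clip):
            # Stand-in for an app handling its paint message for the dirty region.
            print(f"{self.title}: repaint {clip}")

    def invalidate_rect(z_ordered_windows, dirty):
        """Painter's algorithm over the Z-order: walk windows bottom-to-top and
        have every window that overlaps the dirty region repaint the overlap;
        higher windows paint over lower ones."""
        for win in z_ordered_windows:  # index 0 = bottom of the Z-order
            clip = win.rect.intersect(dirty)
            if clip:
                win.on_paint(clip)

    # A dialog overlapping an editor; invalidating the overlap region makes
    # both windows repaint their share of it, bottom-up.
    windows = [Window("editor", Rect(0, 0, 800, 600)),
               Window("dialog", Rect(300, 200, 200, 150))]
    invalidate_rect(windows, Rect(300, 200, 200, 150))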

Animations are an iffy thing, but usability research suggests that some animation, especially in a world where computers are often silent and have no other indicators (neither HDD sounds nor an activity LED, for example), is indeed necessary to help the majority of users know whether the computer is "doing something" or has just hung.

As for the last point... I agree 200%.

reply
7thaccount
13 days ago
[-]
Agreed. Complexity could ultimately be our downfall. Everything is drastically more complicated than before, and the margins of safety keep shrinking.

Take my own industry of electricity markets, for example. It used to be that you had large vertically integrated utilities that handled the generation of power as well as its transmission and distribution to customers. They would run the grid and factor all costs (fixed and variable) into residential and industrial rates. This is easy to explain to someone in a minute or so.

In the 1970s and 80s, though, deregulation took off and you could finally build fairly efficient, smaller gas plants, so there was a push to have much larger grid operators optimize over a much larger region and introduce competition among market participants, so the public wouldn't suffer from unwise investments by the utilities. This system is more efficient, but it is supposed to operate as a "free market", and the only problem is that it has never worked very well overall. It does schedule power more efficiently, but you have all these power plants, needed for just a handful of events, that are no longer solvent because they can't earn enough money in the markets. So the grid operators are dealing with mass-scale retirements (some of these would've occurred anyway due to EPA rulings) and spending tons of time and money trying to fix a problem that didn't use to exist.

These organizations have thousands of pages of legal documents, run enormously complex power auctions, and have to employ hundreds of people to administer all of it. Very few people understand how the cake is made anymore, so to speak. Does it save more money? Yes, but the cost is a massive increase in complexity that grows each year as new rules are made. So we took something conceptually simple and made it 10x more complex in order to eke out more savings. I'm not saying it was the wrong path, but doing this sort of thing all over society/the economy has its own costs.
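
(To give a feel for the "schedule power more efficiently" core, here is a toy merit-order dispatch sketch in Python. The plant names, costs, and demand are made up by me; real market-clearing software layers transmission constraints, reserves, and those thousands of pages of rules on top of this.)

    plants = [  # (name, capacity in MW, marginal cost in $/MWh) -- made-up numbers
        ("nuclear",    1000,  10),
        ("coal",        800,  35),
        ("gas_ccgt",    600,  50),
        ("gas_peaker",  300, 120),
    ]

    def dispatch(demand_mw):
        """Stack generators cheapest-first until demand is met; the last
        (marginal) unit dispatched sets the clearing price."""
        schedule, clearing_price, remaining = [], 0.0, demand_mw
        for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
            if remaining <= 0:
                break
            take = min(capacity, remaining)
            schedule.append((name, take))
            clearing_price = cost
            remaining -= take
        if remaining > 0:
            raise ValueError("not enough capacity to meet demand")
        return schedule, clearing_price

    print(dispatch(2100))
    # ([('nuclear', 1000), ('coal', 800), ('gas_ccgt', 300)], 50)

Note how the peaker never runs in this example, which is exactly the "only needed for a handful of events, can't earn enough in the market" problem described above.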

reply
Gibbon1
13 days ago
[-]
Reminds me how, before the 2008 financial crisis, you used to hear finance types and economists crowing about financial innovation. After that they've been quieter. But they never exactly came out and said what it was. I can tell you: it's about efficiently collecting rents. Being able to pick people's pockets faster and cheaper so you can pick more pockets. That isn't what most people think of when they hear the word innovation.

(Edgy comment over).

(Less edgy comment begins)

The finance share of the US economy went from 4% fifty years ago to 17% today. It feels likely that there has been zero or less-than-zero benefit overall from that.

Everyone has an internet-connected doorbell now. 50 years ago my parents never locked their front door.

Lots of complexity that seems to provide no overall value.

reply
rini17
13 days ago
[-]
And what about the financial services mentioned in the article? I doubt that "everything does more" there.
reply
RandomLensman
13 days ago
[-]
The referenced article is mainly about the capital efficiency of introducing computers, which isn't so surprising, as it takes time to (figure out how to) use them effectively.

Nowadays, handling payments, compliance, risk management, trading, etc. without computers would be impossible. What is true is that a fair amount of productivity is absorbed by compliance and control issues, though.

reply
pas
13 days ago
[-]
Financial services went from a solid base of institutionalized racism (redlining) and casual, prejudice-based risk profiling when you went to the branch office ... to credit scores and financial engineering (2008, hello!), a thousand useless new intermediaries, global 24/7 trading and investing, online Regulation A+ offerings (pre-IPO investing for non-accredited investors), and online banking.

It's a lot more complex, of course, and there are even a few advantages.

reply
__MatrixMan__
13 days ago
[-]
Oh yeah, also some parts have cancer. Tumors can take a lot to maintain.
reply
constantcrying
13 days ago
[-]
It definitely is true that a design which could easily be created by hand is harder to create on a computer. Where computers shine is actually in managing the complexity of large and complicated systems.

What I think the article leaves unspoken (but implied) is the "curse of tools": if you give a person tools, he is likely to use them, even if they might not be applicable. Meaning that someone might decide to create a complex solution to a problem simply because the tools he has been given allow him to do so. I think it is always very important to keep in mind what was achieved with the very limited tools of the past and the immense ingenuity of the people who worked within those limits.

reply
dimask
13 days ago
[-]
> Where computers shine is actually in managing the complexity of large and complicated systems.

I would argue that where computers shine, first and foremost, is in automating repeated tasks. Even if a task is fairly simple and doing it by hand takes less time, if you have to repeat the same task over and over it may be beneficial to use computer tools that allow some automation, even if the first couple of runs take more time. In this sense, something being easier to do by hand (once) does not necessarily imply that it is better to do it by hand.
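
(A rough back-of-the-envelope sketch of that amortization argument, with numbers I made up for illustration:)

    # Break-even for automating a repeated task (all numbers made up).
    setup_minutes = 120      # one-time cost of building the automation
    manual_minutes = 10      # time per run by hand
    automated_minutes = 1    # time per run once automated

    # Automation wins once setup + N*automated < N*manual,
    # i.e. N > setup / (manual - automated).
    break_even_runs = setup_minutes / (manual_minutes - automated_minutes)
    print(f"automation pays off after about {break_even_runs:.0f} runs")  # ~13 runs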

But I do agree that an increase in complexity comes as a curse of tools. People with less tech understanding may find it easier to see some of the benefits of such tools, but the problems that the increased complexity brings take longer for them to catch.

reply
docfort
13 days ago
[-]
Complexity is the outcome of misunderstanding. The misunderstanding can come from lots of areas.

It could be from a requirements perspective: “I understand what I can build easily, but not what you want.”

It could be from an engineering perspective: “I understand what you want, but I don’t understand how to build that cohesively.”

It could be from a scientific perspective: “No one knows what tools we need to investigate this.”

I saw mentioned in other comments that CAD software doesn’t allow for sketching. As someone who was originally trained in drafting the old way, and who has used modern CAD systems to produce models for fantastically large physical simulations, I largely agree that sketching is lost. But the sketching that I can do on paper is just not at the same level of complexity as I can kinda do on my computer.

But the complexity of using the new tool obscures the fact that my model is much more complicated than I could otherwise manage using old tools. And that’s because I’m still learning. In fact, I have to “deploy” while I’m still in learning mode about a problem, unlike before, where I had to understand the problem in order to use the tools to draft the thing.

Being able to do something with a half-formed idea sounds like sketching, but when non-experts rely upon it, it’s pretty fragile. Because it wasn’t done.

Building a memex (something the author disparages multiple times) is super hard because we still don’t understand how to represent ideas separately from language, our original mental sketching tool. But people built Altavista and Google and LLMs anyway. And yeah, they’re super complex.

How does TCP/IP work over wireless connections? Poorly and with a lot of complexity. Why? Because the concept of a connection is ill-defined when communication relies on ephemeral signaling.

But despite the complexity, it is useful and fun to use only half-baked ideas. Just like it’s fun to use language to describe stuff I don’t understand, but merely experience. Graduation. Being a parent to a sick child. Losing a loved one.

reply
pas
13 days ago
[-]
The problem is that we are flooded with low-quality tools.

In general it is almost universally true of software nowadays. (Because change/progress leapfrogged any kind of teleological end-to-end design, OR it's simply unmaintained; for example, see any US government IT system. Or the ubiquitous, extremely fragile corporate synthetic snowflake software that only runs on Windows XP SP1 plus those 3 particular patches, with that exact vc6.dll copied over from an old CD.)

A good quality information processing tool is designed for the process that it's meant to augment, ideally considering the original problem, and ultimately improving the process itself.

(Just digitizing a form likely leads to worse outcomes, because those forms were designed for pen and paper. Screens and keyboard navigation require different flows.

And the usual process is even worse: it consists of reinventing the wheel without context, as in speedrunning through all the problems of corporate politics and procurement, delivering some half-assed milestone-driven monstrosity, and forcing it on employees.

Of course, due to the aforementioned universal internal incompetence-fest and quarter-driven development, budgets are bad, the required time and effort are underestimated, learning curves are ignored, and software gets set in stone too soon, and thus efficiency remains well below what was expected, planned, possible, and hoped for.)

reply
littlestymaar
13 days ago
[-]
I think the main reason we're flooded with low-quality tools is that with software you just copy-paste your prototype to production: unlike real-world systems, where you build a prototype to show stuff and then have to industrialize it to get to production, for software this expensive (and crucial for quality) step is skipped entirely, and we are all stuck using prototypes…

This and seeing “code written” as the asset, instead of an artifact of “engineers increasing their mastery of the topic”, which is the true asset. But it's very closely related to the first point.

reply
pas
12 days ago
[-]
Yep, the elimination of in-house expertise was ridiculously short-sighted (which is an extremely charitable and euphemistic version of breach of fiduciary duties).
reply
datadrivenangel
13 days ago
[-]
Ideally you computerize away the process as much as possible.

A common failure mode of digital transformation is taking the old, people-driven process and computerizing it without revisiting the underlying process.

reply
pas
12 days ago
[-]
Another important thing is that paper was able to handle "all the exceptions": notes in the margins, corrections, other paper clipped to it, whatever. Sure, it was ugly, but it was in-band.

If there's no way to do this in a new system, then it means there will be a parallel semi-official informal shadow of it scattered around in out-of-band channels (emails, phone, paper!), and that leads to more complexity, which kills efficiency.

reply
Nevermark
13 days ago
[-]
Never underestimate the personal satisfaction of spending a day, or a week, installing, playing or configuring one's tools!

In the past, we customized our workflow while in the flow. Now, to approximate that freedom, we have to futz around up front with the limited control levers they give us, in a slow feedback loop with any actual work, to get our tools to help us in the way we want.

Which, for complex tools and work, can rapidly become its own end.

reply
api
13 days ago
[-]
The curse of tools applies in computing too. A virtue of languages like Go is that they reduce incidental and gratuitous complexity through the simplicity of the language. A complex language with a powerful type system, like Rust or C++, will tempt programmers to use every facet of the language and create baroque, overly complex code.
reply
chiggsy
13 days ago
[-]
Sounds like an argument for C, honestly. After all, that complexity is either in the language or solved on a case-by-case basis. Every patch adds complexity, because it reflects changes in the business environment that presents the problem the code is solving.
reply
api
12 days ago
[-]
C is okay except for safety. Look at the CVE lists: they are still full of memory errors, and it's 2024.

The problem isn’t that good programmers can’t write good C code, though anyone can make a mistake. The problem is what happens as code ages, is worked on by other people, gets PRs merged, and so on. Bugs creep in and in an unsafe language nothing catches them.

C also makes string processing hellish, but that could be fixed with libraries and isn’t necessarily the fault of the language.

reply
paulsutter
13 days ago
[-]
The SpaceX Raptor engine was designed using a full-engine combustion simulator, without which the engine would have been impossible [1]. Not to mention the rapid evolution from Raptor 1 to Raptor 3 [2].

Jet aircraft are 70% more fuel efficient than in 1967, largely thanks to simulation [3]; the story is similar in automotive.

It's unclear how the NVIDIA H100 would have been designed by hand-drawing 80 billion transistors.

Net-net: computers are necessary, but we need much better UIs and systems. Maybe AI will help us improve this.

[1] https://youtu.be/vYA0f6R5KAI?si=SG1vLMMl8l3DuCYN

[2] https://www.nextbigfuture.com/2024/04/spacex-raptor-3-engine...

[3] https://en.m.wikipedia.org/wiki/Fuel_economy_in_aircraft

> Jet airliners became 70% more fuel efficient between 1967 and 2007, 40% due to improvements in engine efficiency and 30% from airframes.

reply
matheweis
13 days ago
[-]
There are at least two dimensions to this that I believe that the author has overlooked:

1. Economies of scale. It may be that drafting something up in CAD takes more cycles to get right up front, but once you have an established design it is orders of magnitude easier to reproduce.

2. Changes in software. Software companies are forever changing their interfaces, decreasing productivity every time their users encounter a new learning curve.

reply
pas
13 days ago
[-]
And CAD models benefit from better technology "for free": better visualization, better heuristics (for rule/code/safety/conformance checking), and so on.

Did it make sense for military nuclear submarines back then? Well, maybe not, who knows. (Submarines are definitely not mass produced.)

But what this 'insightful essay' ignores is that productivity decreased overall in the 'West' (as the post-WWII boom ended) but then picked up right around the dot-fucking-com boom. Oh wait, computers. But maybe this glorified shitpost should have used recent datasets instead of spending its eruditeness budget on extra-spicy and fancy words. (Propitiate!)

https://www.mckinsey.com/~/media/mckinsey/mckinsey%20global%...

https://cepr.org/sites/default/files/styles/flexible_wysiwyg...

https://www.caixabankresearch.com/sites/default/files/styles...

reply
joeatwork
13 days ago
[-]
Something this article leaves out is that mostly, when people are given better tools, they don't just produce more widgets per unit time: often they instead build different (more complex, better) widgets. When I was in school I read a study about this: a design shop had N draftsmen and introduced CAD tools anticipating staff reductions, but when researchers went back to the shop it had the same staff, now designing things that wouldn't have been practical or possible before.
reply
Nevermark
13 days ago
[-]
Under appreciated: Automation creates, or dramatically enhances, the need to fully understand problems and solutions at the most detailed and practical levels. Because automation removes the valuable manual ad hoc flexibility to adapt to most wonkiness.

1. When a job requires a mix of human and computer work, productivity changes are very dependent on interface details. Even one slightly confusing GUI, slowness of feedback, a tool that isn't quite as flexible as a job needs, or an inability to see/edit/use related information at the same time, can greatly harm productivity.

2. When a job is completely automated, productivity can go way up. But this productivity doesn't get attributed to human workers, it is corporate productivity. And then only if this highly optimized task really provides value. There is a lot of performative information processing, with conjectured long term payoffs, serving the needs of management and tech workers to look busy, and believe they are valuable.

For both human and corporate productivity, automation makes it extremely easy to decrease productivity due to the most subtle mismatches between problems and solutions.

When work is done by hand, these mismatches tend to be glaringly obvious, less tolerated, and more easily mitigated or eliminated.

reply
flavaz
13 days ago
[-]
A classic example of this would be how some roles require endless spreadsheets, or individual updates to a CRM tool like Pipedrive.

CRM tools add a lot of overhead to what should be a simple process: letting your manager know what you're up to.

reply
onthecanposting
13 days ago
[-]
If that bookkeeping overhead is fed into analysis and process mining to drive improvement, it might be a net gain. More often, though, I see yet another spreadsheet applied as a panacea, then forgotten in a few months, and the process repeats over many years.
reply
AnimalMuppet
13 days ago
[-]
That is being fed into analysis and process mining. But nobody's doing the meta-analysis to look at the overhead of all the analysis tools, and see whether they are a net win.
reply
datadrivenangel
13 days ago
[-]
Big Analysis doesn't want you to think about that.

That is one of the points of the article though: if you double the productivity of a worker but require a second worker to maintain the computer and do the analysis, have you really increased productivity? The answer is yes, but it's important to think about the shape of your work contact patch and how increased productivity increases value.
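
(A quick back-of-the-envelope sketch with made-up numbers: one doubled worker plus one maintainer is no better per head than before; the gain only shows up when one maintainer/analyst is shared across several doubled workers.)

    # Toy arithmetic for "doubled worker plus a maintainer" (numbers made up).
    X = 100  # widgets per day for one unaided worker

    def output_per_head(n_workers):
        total = 2 * X * n_workers   # each production worker is doubled by the computer
        headcount = n_workers + 1   # plus one person maintaining the computer / doing analysis
        return total / headcount

    for n in (1, 2, 5, 20):
        print(n, round(output_per_head(n), 1))
    # 1 -> 100.0 (no better than before), 2 -> 133.3, 5 -> 166.7, 20 -> 190.5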

reply
pnut
13 days ago
[-]
I'm not a CRM end user, but I'd be very grateful for such a tool if I suddenly had to cover for a coworker or inherit an existing business relationship. What is the alternative? Each person individually cobbling together some godawful workflow management system, with no centralised repository of information?

Totally unsustainable, and not at all related to keeping your manager informed.

reply
flavaz
13 days ago
[-]
I think there is a tendency to overcomplicate things, and human nature is such that most of the time colleagues don’t bother to update records properly. That’s the real-world experience CRM salespeople won’t tell you.

What also happens is that we have all these CRM tools in parallel with these “godawful workflow management” systems.

Theoretically there is a productivity gain, sure, but senior executives don't make use of these tools; they hire a PA. The implementation of these CRM systems is usually done really, really badly.

A good Superset dashboard, on the other hand: then we're cooking with gas.

reply
ElevenLathe
13 days ago
[-]
Good to know salespeople have their own version of JIRA hell.
reply
geysersam
13 days ago
[-]
Nobody can convince me computerization has not improved efficiency in industrial manufacturing. But computerization has probably led to fewer people working in manufacturing. Did overall efficiency decrease or increase?
reply
pydry
13 days ago
[-]
Offshoring did 90% of that, not computers.

The effect of computerization was just to keep some industrial production at home - mostly the unique kind of manufacturing where labor costs weren't dominant.

I think the US is in for an epic shock in a few years' time when they realize just how much getting cut off from Chinese factories will hurt.

Either demand for labor to substitute for those Chinese factories will spike, or the US will spiral economically as inflation takes off like a rocket and US elites try yet again to shift the burden onto the politically impotent working classes.

reply
CuriouslyC
13 days ago
[-]
I don't think so. We have major domestic manufacturing and on-shoring efforts taking place, a massive pool of migrants to draw from for cheap labor, and low-cost collaborations with Mexico. The cost of trashy consumer goods might spike, but the market would quickly supply alternatives for the manufacture of important goods.
reply
pydry
13 days ago
[-]
Everything you said would be absolutely true if we had reversed course sometime between 2008 and 2017.
reply
gieksosz
13 days ago
[-]
At first I thought it was written in the '80s, then I saw the author mention 1995 and it began to feel very strange that someone from the mid '90s would rant against computers. Then I reached a section about LLMs…
reply
AndrewKemendo
13 days ago
[-]
I build architectures for major systems, and in every case I start with a blank page in a Strathmore 400 Series sketchbook.

About 4 years ago I made a wall of my office into a chalkboard and that’s been where I work out massively complex interdependencies and data workflows

Nothing on a computer remotely compares to the speed and specificity of pen or chalk in hand

reply
Almondsetat
13 days ago
[-]
Except a tablet with pen support
reply
AndrewKemendo
13 days ago
[-]
Not even close.

That can break, isn't perfectly 100% available like my analog method, and is much less transportable.

reply
nitwit005
13 days ago
[-]
Case studies from the 80s, as productivity started improving again in the 90s.

I can actually remember Alan Greenspan discussing this, despite how young I was.

reply
Manfred
13 days ago
[-]
To be precise, the first study mentioned is from 2011, then 1989, 1987, then studies done in the 1990s.
reply
dahart
13 days ago
[-]
The financial services industry has been revolutionized by computers; I have no idea why the author thought it would make a good example. Today's stock markets, high-frequency trading, online banks, and international finance didn't (and couldn't) exist without computers. The explosion of personal investing and day trading that has reshaped the markets doesn't exist without computers.

The entire computer industry itself has accelerated and grown because of computers; nowhere has he accounted for the "productivity" attributable to sales of computers. Fields I've worked in, video games and CG films, have absolutely increased efficiency with computers: for equal-sized productions, the quality has gone up and the workforce needed has gone down, consistently, for decades.

The article has only a single, completely vague data point that includes anything from the last 30 years; that's a major red flag. The invective portmanteaus and insult words are also a red flag and very weak argumentation. Are they supposed to make up for the complete lack of any relevant data? Not to mention some of the insults are worse than iffy by today's standards and don't reflect well on the author.

Call me rather unconvinced, I guess.

reply
superfunny
13 days ago
[-]
Agreed - I was very surprised to hear this. From Dividend.com:

"In the late 1960s, the volume of trading activity increased dramatically. With the drastic increase in volume, the NYSE had $4 billion in unprocessed transactions by 1968. To catch up, the exchange closed every Wednesday from June 12, 1968 to December 31, 1968. During this crisis, over 100 brokers failed due to the high volume of transaction that could not be processed."

Today, the NYSE processes trading volumes of 3-4 billion shares per day.

reply
rdlecler1
13 days ago
[-]
This analysis ignores the impact of competition. A car produced today is better (and more complex to make) than a car produced in 1980.

Technological productivity isn't just about improving the number of units or the dollar value produced per hour of input. Technology can make products more competitive without any increase in productivity, by making them better, and therefore more attractive, to customers even if unit cost or volume stays fixed.

reply
beretguy
13 days ago
[-]
I don't know. We need to define "better" first. My friend has a 1970s F-150 that still drives. "How long will the car run so that I don't have to buy a new one?" - that's my definition of "better". Will modern cars still run 50 years from now?
reply
dahart
13 days ago
[-]
Car longevity has been increasing over time [1], despite anecdotes of a few long-lived vehicles here and there. So, cars are getting better by your metric. They’re also getting better by all other metrics: safety, fuel economy, speed, reliability, conveniences, comfort, style, etc., etc.

I've driven a still-running '70s F-150 recently and owned a 2020s F-150 for a year. There is practically no comparison; I would never buy the '70s vehicle given the choice. They are worlds apart. The '70s truck is uncomfortable, slow, and unsafe in today's traffic. It is also temperamental, requires more maintenance, is cold in the cabin in the winter, and is hard to see out of when it's raining or snowing. Probably the only thing I could honestly say is better about the '70s truck is that the purchase price was lower.

[1] https://en.wikipedia.org/wiki/Car_longevity#Statistics

reply
hvs
13 days ago
[-]
The vehicles still running from the 1970s suffer from survivorship bias.
reply
zeroonetwothree
13 days ago
[-]
Modern cars seem to be vastly more reliable than even cars from the '90s. They also generally have nicer features.
reply
spit2wind
13 days ago
[-]
Nice meta-analysis examining the situation from the perspective of management versus measurement: https://www.researchgate.net/publication/4801131_Measurement...
reply
smeej
13 days ago
[-]
I'm noticing a version of this as I pilot switching to a handwritten bullet journal for task management. I'll still sit down at the computer to brainstorm and organize the tasks of my projects, because being able to move list items around is a huge advantage, but when it actually comes to doing the darn things? It's been so much more effective to track them in the notebook. Planning out my daily schedule, figuring out what I can do in which timeframe, and making sure things don't fall through the cracks has worked so much better on paper.
reply
cushychicken
13 days ago
[-]
Something tells me this guy is one of those people who thinks version control is a newfangled process step and not a useful piece of a development cycle.
reply
courseofaction
13 days ago
[-]
There are better explanations for the general downturn of productivity despite better tools: Increased focus on extraction since the expansion of neoliberal policies in the 70s.

https://wtfhappenedin1971.com/

reply
Samtidsfobiker
13 days ago
[-]
I too have wondered why it takes so long to make stuff in CAD, and why even suggesting that everything should fit together the first time is laughable at best.

My theory is that computers can't do rough sketching. No CAD software suite (I think) can iterate on and evaluate rough ideas as fast and flexibly as a whiteboard pen in a meeting room can.

reply
JKCalhoun
13 days ago
[-]
That's the way I see it.

Just as an example I am familiar with: so many people appear to begin a project like a MAME cabinet with SketchUp.

I like "Cardboard Assisted Design" and have literally built several MAME cabinet prototypes in cardboard where iteration is easy with merely a box-cutter knife.

When the ergonomics and part-fitting are "go", I take measurements from the cardboard proto and move to wood.

Designing acrylic parts for later laser-cutting I have also used "CAD" for prototyping — sometimes even flat-bed scanning the chipboard prototype and then moving to a vector drawing app to overlay the laser-friendly beziers.

Even for PCB layout I often will laser-print the PCB as a final sanity check: punching holes in the paper and shoving the actual electronic components in to see that everything will fit before I send the file off to Taiwan.

reply
nurple
13 days ago
[-]
Your theory seems sound. I see this theme of free-flowing expressivity vs. stricture and formalism in the choice of programming languages. CAD is like Rust, where it never fits together the first time and parametric stricture lines the road of progress with infinite caltrops in abdication to correctness; JavaScript is a sketchbook where ideas and mutations happen easily and often, and correctness is a beautiful illusion brought about through the evolutionary nature of experimentation.
reply
marcosdumay
13 days ago
[-]
Mechanical CADs aren't designed for maintainability.

They are designed for expressivity first, and ease of learning (for people with industry knowledge) second. And those are the only two goals.

Just try to adapt a mechanical design for a slightly different task. It's usually better to start from scratch.

(Anyway, that's an example of computers not being fully used. Going from the Solow paradox into "computers are bad" - like lots of people like to do, even here - is just stupid.)

reply
sobellian
13 days ago
[-]
Am I nuts, or does the first graph show exactly the opposite of his claim? He says it shows declining productivity, but labor productivity rises. Costs also rise but this is exactly what one should expect from labor-saving devices, no?
reply
hcks
13 days ago
[-]
“Computa*” this is insane
reply
TazeTSchnitzel
13 days ago
[-]
What's with a lot of the mentions of “computer” using what looks like a portmanteau with “retard”?! I know some famous people like Stallman do this, but I don't think it's perceived positively.
reply
sourcepluck
13 days ago
[-]
Provide a link to Stallman using that term then please? Otherwise, I'm calling nonsense. I'd be 95+% sure you're making that up, or confused in this case.
reply
cedilla
13 days ago
[-]
He's not using "computard" but he loves to undermine his credibility by talking about the "Amazon Swindle" and such.
reply
sourcepluck
12 days ago
[-]
Ok, so, the thing I was replying to was just someone claiming false things as I imagined. Glad we cleared it up.
reply
cedilla
12 days ago
[-]
No, you just misunderstood what OP meant by "this". They meant using portmanteau neologisms, and you thought they meant that specific portmanteau.

No need to be nasty about it.

reply
sourcepluck
10 days ago
[-]
I didn't mean to be nasty - I was, apparently, misunderstanding OP's original message. In my defense, the causal link between these three things:

1. Person A in their blog appears to be using a weird portmanteau, "computard"

2. "Some famous people like Stallman do this" (referring to portmanteaus and making up words in general, not the specific portmanteau person A used)

3. "People" maybe don't perceive that positively (making up portmanteaus..?)

is more than a bit hard to follow.

Anyway, I thought an obviously false claim was being made, but if it's just a few loose amalgamations and people brandishing their opinions about, well that's fine, it is the internet, after all.

reply
TazeTSchnitzel
11 days ago
[-]
Yes, I meant his practice of coming up with silly nicknames for things he dislikes, not that he used this one specifically.
reply
doubloon
13 days ago
[-]
A lack of consideration for others' feelings.
reply
eru
13 days ago
[-]
Well, Stallman ain't perceived positively, either.

The author clearly has an axe to grind. I haven't read enough yet to decide whether they have a valid point.

reply
rbanffy
13 days ago
[-]
I couldn’t decide whether they had a valid point or whether they were engaging in some form of parody. Seems like an extreme case of Poe’s Law.
reply
GTP
13 days ago
[-]
I think that if you filter out the exaggerations in the article, the general message is still worth reflecting on. A process doesn't get more efficient just by virtue of using a computer, and many tools we have now are more about good-looking interfaces than actual productivity.
reply
SiempreViernes
13 days ago
[-]
Sure, every fable has a grain of truth, but constantcrying put it more usefully here: https://news.ycombinator.com/item?id=40264453

The OP is not worth reading unless you desperately miss the "ironic", invective-laden writing of the early '00s.

reply
rbanffy
13 days ago
[-]
Not only that, but a lot of the efficiency gains are directed towards tasks that otherwise would not be done because they weren't worth the cost. Now that the cost is low enough, their marginal gains absorb the resources that were freed by automation.

That's the origin of so-called "bullshit jobs".

reply
datavirtue
13 days ago
[-]
This.
reply
ChrisMarshallNY
13 days ago
[-]
He states, in his about[0]: "...exposing this sort of nonsense to as much popular contempt as I can muster."

It's deliberate.

I sometimes enjoy folks that write like this, and this chap genuinely seems to be an interesting guy, after my own heart, but I don't really enjoy his writing.

[0] https://scottlocklin.wordpress.com/about/

reply
dopylitty
13 days ago
[-]
For similar reasons to those mentioned in the article, it's possible the past century will be seen as a dark age by future humans. Computers are incredibly fragile and depend on complex systems (e.g. the electricity grid) to even operate. They also can't reliably persist data across even a few decades. Yet we've created a society where nothing can be done without a computer and all our data is stored in computers instead of physically.

When those complex systems fail and the computers stop working we'll be left without any traces of the knowledge generated in the past century or the people who generated it. We'll also have lost all the previous knowledge that was moved from physical to digital storage.

All future humans will see from the century is a whole lot of microplastics.

reply
ratsmack
13 days ago
[-]
The problem is that complex systems have not made us more knowledgeable and capable, but instead they have become a crutch.

https://www.palladiummag.com/2023/06/01/complex-systems-wont...

reply
chiggsy
13 days ago
[-]
>By the 1960s, the systematic selection for competence came into direct conflict with the political imperatives of the civil rights movement. During the period from 1961 to 1972, a series of Supreme Court rulings, executive orders, and laws—most critically, the Civil Rights Act of 1964—put meritocracy and the new political imperative of protected-group diversity on a collision course.

A lot of this kind of stuff. The thesis, summarized, is that America is going to hell in a handbasket because the Feds are demanding that black people get hired once in a blue moon, a practice which, according to the author, dilutes meritocracy.

Heh.

>When this was not enough, MIT increased its gender diversity by simply offering jobs to previously rejected female candidates. While no university will admit to letting standards slip for the sake of diversity, no one has offered a serious argument why the new processes produce higher or even equivalent quality faculty as opposed to simply more diverse faculty.

Ah, MIT is now pumping out dumb blondes, are they? Having encountered such people as the author of this piece several times before, I have to wonder about the "meritocracy" process in place prior. So "merit" correlates positively with sunscreen purchases, and inversely with tampon expenditure, does it?

>This effect was likely seen in a recent paper by McDonald, Keeves, and Westphal. The paper points out that white male senior leaders reduce their engagement following the appointment of a minority CEO. While it is possible that author Ijeoma Oluo is correct, and that white men have so much unconscious bias raging inside of them that the appointment of a diverse CEO sends them into a tailspin of resentment, there is another more plausible explanation. When boards choose diverse CEOs to make a political statement, high performers who see an organization shifting away from valuing honest performance respond by disengaging.

I mean... is there an actual difference here? First, I'm not convinced it's unconscious bias. Second, this "disengagement" certainly seems like the kind of "meritocracy" we have sadly grown quite familiar with.

>The problem is that complex systems have not made us more knowledgeable and capable, but instead they have become a crutch.

This article kind of sounds like some people want the old crutch back. There is a non-trivial question of your fitness for this, however. In the '50s, the rise of psychotherapy made it quite clear that people were cracking under the strain of delivering "merit."

I'm quite delighted that this is such a concern to you people. Excellent.

reply
Qwertious
11 days ago
[-]
>> no one has offered a serious argument why the new processes produce higher or even equivalent quality faculty as opposed to simply more diverse faculty.

This is worth debunking more directly: hiring is not and has never been meritocratic; it's heavily affected by networking and cliques (something like 70% of job hires aren't publicly listed; people just ask around for recommendations). The entire point of diversity hiring is to hire from outside the existing cliques.

reply