Data manipulations alleged in study that paved way for Microsoft's quantum chip
218 points | 10 months ago | 14 comments | science.org | HN
krastanov
10 months ago
[-]
This is such a beautiful theoretical idea (a type of "natural" error correction which protects the qubits without having to deal with the exorbitant overhead of error correcting codes). It is very disheartening and discouraging and just plain exhausting that there has been so much "data manipulation" in this subfield (see all the other retracted papers from the last 5 years mentioned in the article). I can only imagine how hard this must be on the junior scientists on the team who have been swept into it without much control.
reply
pc86
10 months ago
[-]
Hopefully people are keeping lists of the PIs on these retracted papers and keeping that in mind for future grants, hiring, etc. I know almost nobody is, but one can hope.

Academic fraud ranging from plagiarism to outright faking data should, more often than not, make it basically impossible for you to get any academic job whatsoever, in your field or others.

This chip is an extreme example, but it potentially represents millions of dollars of lost productivity: hundreds or even thousands of people spending months or years on something based on a fabrication.

The person or people directly responsible for this should never work again.

reply
jakobgm
10 months ago
[-]
Totally agree! As with any behavior which is difficult to detect and often goes unnoticed, the punishment should be large enough that the expected value of fraud is clearly net negative for those who might feel tempted to "tweak some numbers".

In case anybody else also isn't familiar with "PI" as an abbreviation in this context:

> In many countries, the term principal investigator (PI) refers to the holder of an independent grant and the lead researcher for the grant project, usually in the sciences, such as a laboratory study or a clinical trial.

Source: https://en.wikipedia.org/wiki/Principal_investigator

reply
NoMoreNicksLeft
10 months ago
[-]
>Academic fraud ranging from plagiarism to outright faking data should, more often than not, make it basically impossible for you to get any academic job whatsoever, in your field or others.

That might actually be a perverse incentive. If you've already nuked your career with some fraud, you can't make it worse with extra fraud... so why ever stop? People inclined to do this sort of thing, when faced with that deterrent, just double down and commit even more fraud; they figure the best that can be hoped for is to do it so much and so perfectly that they're never discovered.

The trouble is that the system for science worked well when there was only a tiny number of scientists, but now we're a planet of 8 billion where people tell their children they have to go to college and get a STEM degree. Hell, you can only become a scientist by producing new research, even if there's not much left to research in your field. And the only way to maintain that position as a scientist is to "publish or perish". With finite avenues of research and an ever-growing population of scientists, bullshit is inevitable.

reply
dullcrisp
10 months ago
[-]
You stop because you can’t get a job?
reply
nathan_compton
10 months ago
[-]
My hunch is that these double down types won't be dissuaded by much of anything. I think fundamentally this kind of person has a risk taking personality and often feels they will get away with it.
reply
pc86
10 months ago
[-]
Even if "bullshit is inevitable" is true -- I don't think it is -- that doesn't mean we shouldn't punish people who make up data, who steal others' work, who steal grant money by using their fake data to justify future grants.

"Well there's lots of people now" is not really a great justification. You become a low trust society by allowing trust to deteriorate. That happens in part because you choose not to punish people who violate that trust in the first place.

reply
NoMoreNicksLeft
10 months ago
[-]
>that doesn't mean we shouldn't punish people who make up data,

I am not wishy-washy on punishment. A part of me that I do not deny nor suppress wants punishment for those who do wrong.

But sometimes punishments are counter-productive. The easiest example is the death penalty for heinous, non-murder crimes. This incentivizes the rapist or child molester (or whatever) to kill the victim. You can't execute them twice, after all, so if they're already on the hook for a death penalty crime, murdering their victim also gets rid of a prime witness who could get them the death penalty by testifying, but without increasing the odds of the death penalty.

"Career death penalty" here is like that.

>"Well there's lots of people now" is not really a great justification.

It wasn't meant to be a justification. It was an explanation of the problem, and (in part, at least) an attempt to show that things need to change if we want the fraud to go away.

>You become a low trust society by allowing trust to deteriorate

We've been a low trust society for a long time now. People need to start thinking about how to accomplish the long, slow process of changing a low trust society to a high trust one.

reply
mschuster91
10 months ago
[-]
> We've been a low trust society for a long time now. People need to start thinking about how to accomplish the long, slow process of changing a low trust society to a high trust one.

The core problem is that most people define their self-worth by their employment, and no matter what, this is all going to crash hard due to automation. The generation currently in power is doing everything they can to deny and downplay what is about to happen, instead of helping our societies prepare.

We're all being thrown into the rat race. We are told, verbally and through personal experience, that there is no alternative but to become the top dog at all costs, because that will be the only chance to survive once automation truly hits home. The result is that those who feel they have failed the rat race and have no hope of catching up withdraw from the "societal contract" and just do whatever they want, at the expense of others if need be.

reply
kevinventullo
10 months ago
[-]
In the case of academia, I’m fine with harsh punishment for people who fabricate data, even if it does incentivize them to be more brazen with their fabrications in the short term. Makes it easier to catch them!

The fact is, we don’t want these people in academia at all. You want researchers who are naturally inclined not to fabricate data, not people who only play by the rules because they think they’re otherwise going to get caught.

reply
hollerith
10 months ago
[-]
>We've been a low trust society for a long time now.

Although trust has been decreasing, the US remains a high-trust society compared to the global average.

reply
MeteorMarc
10 months ago
[-]
Or get rehabilitated, like Leo Kouwenhoven, see https://delta.tudelft.nl/article/gerehabiliteerde-kouwenhove...
reply
77pt77
10 months ago
[-]
I missed the redemption.
reply
nextos
10 months ago
[-]
> Academic fraud ranging from plagiarism to outright faking data should, more often than not, make it basically impossible for you to get any academic job whatsoever, in your field or others.

Sadly, the system often rewards fake or, especially, exaggerated/misrepresented data and conclusions. I think a significant proportion of articles exaggerate findings and deliberately cherry-pick data.

It's a market for lemons. Proving misrepresentation is really hard, and the rewards for misrepresentation are immense. Publishing an article in Nature, Science, or Cell is a career-defining moment.

reply
pc86
10 months ago
[-]
Yeah I agree it's not an easy problem to solve by any stretch. I'm not a professor or scientist so I won't pretend to understand the intricacies of journal publication and that sort of thing.

But I do wonder: when someone's PhD thesis gets published and it turns out they plagiarized large parts of it, why isn't their degree revoked? When someone is a professor at a prestigious institution and they fabricate data, why are they still teaching the following year?

reply
nextos
10 months ago
[-]
Serious universities do often revoke doctoral degrees if plagiarism is proven. I've seen Oxford University go as far as demanding that someone issue a correction to a journal article to cite prior work, because they were making claims of novelty that were not true.

> When someone is a professor at a prestigious institution and they fabricate data, why are they still teaching the following year?

Internal politics. Committees judging potential misconduct are not independent. If you are sufficiently high up the ladder, you can get away with many things. Sweden recently created the Swedish National Board for Assessment of Research Misconduct (Npof) to address this problem. I think this is a step in the right direction.

But, ultimately, I think academic fraud should be judged in court. However, Leonid Schneider (forbetterscience.com), for example, has been taken to court several times for reporting fraud, including fraud that led to patient deaths, and some judges didn't seem to care much about data fabrication / misrepresentation.

reply
mensetmanusman
10 months ago
[-]
This would be half of the NSF grants, according to the replication-crisis work.
reply
Panoramix
10 months ago
[-]
Looking at the paper, cherry picking 5 out of 21 devices is in itself not a deal breaker IMO, but it's certainly something they should have disclosed. I bet this happens all the time with these kinds of exotic devices that take almost a year to manufacture, only for a single misplaced atom to ruin the whole measurement.

Averaging the positive and negative Vbias data and many of the other manipulations are hard to justify; this reeks of "desperate PhD student who needs to publish at all costs". Yet at the same time I wouldn't fully disqualify the findings, but would make the conclusion a lot weaker: "there might be something here".

All in all, it's in Microsoft's interest that the data is not cooked. They can only ride on vaporware for so long. Sooner or later the truth will come out, and if Microsoft is burning a lot of cash to lie to everyone, the only loser will be Microsoft.

reply
darth_avocado
10 months ago
[-]
> cherry picking 5 out of 21 devices is in itself not a deal breaker IMO

Might as well draw a straight line through a cloud of data points that look like a dog

reply
crote
10 months ago
[-]
It's a physical device at the bleeding edge of capabilities. Defects are pretty much a guarantee, and getting a working sample is a numbers game. Is it really that strange to not get a 100% yield?

Having 5 working devices out of 21 is normal. The problem is that the other 16 weren't mentioned.

reply
darth_avocado
10 months ago
[-]
Well, you also need to account for what kind of deviation we are talking about between the 21. If they selected the 5 because they were the best, but the others showed results within, say, 0-5% of the 5, then sure, that is acceptable. But if we're talking about flipping a coin 21 times, seeing heads 16 times, and then choosing the 5 tails outcomes as the results, then I would say that's pretty unacceptable.
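
A toy simulation makes the danger concrete. This is a minimal sketch of my own (nothing from the paper; all names are illustrative): 21 hypothetical devices that measure pure noise, of which only the most impressive 5 are reported. The reported subset looks like a real effect even though there is none.

    import random

    random.seed(0)
    # 21 hypothetical devices whose "measurements" are pure noise around zero
    devices = [random.gauss(0, 1) for _ in range(21)]

    best5 = sorted(devices, reverse=True)[:5]   # report only the most impressive 5

    print(sum(devices) / len(devices))  # honest mean: close to 0, i.e. no effect
    print(sum(best5) / len(best5))      # cherry-picked mean: a spurious ~1+ sigma "signal"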
reply
krastanov
10 months ago
[-]
I do not think this is the right metaphor. Having 5 devices work out of 21 is actually a better yield than what TSMC would get with a brand new process. This is not just normal, it is expected. It is all the other allegations that make this a very questionable case.
reply
Panoramix
10 months ago
[-]
Like I said, a single misplaced atom is enough to wreak havoc on the behaviour of these things. That's not the problem; everyone knows there's a large gap between observing a phenomenon and making it consistently manufacturable with high yield.
reply
absolutelastone
10 months ago
[-]
It wasn't just that by itself. There was a list of several undisclosed data tweaks and manipulations. None was particularly fraudulent on its own, but once you consider them all together in the paper, as the former author was complaining, it seems more likely that they just manipulated the theory and data as needed to make them match. There's a big difference between predicting something and demonstrating it in experiment, versus showing your theory can be made to fit some data you have been given, when you can pick the right adjustments and subset of data.
reply
nathan_compton
10 months ago
[-]
> the only loser will be Microsoft.

Not really - that cash could have been allocated to more productive work and more honest people.

reply
os2warpman
10 months ago
[-]
As far as I can tell, the only thing >25 years of development of quantum computing implementations has resulted in is the prodigious consumption of helium-3.

At least with fusion we've gotten some cool lasers, magnets, and test and measurement gear.

reply
Hasz
10 months ago
[-]
This kind of fundamental research, though, is absolutely worth it. For a fairly small amount of money (on the nation-state scale) you can literally change the world order. Same deal with fusion or other long-term research programs.

Quantum computers are still in a hype bubble right now, but having a "real" functional one (nothing right now is close, IMO) would be as big a shift as nuclear energy or the transistor.

Even if we don't get a direct result, ancillary research products can still be useful, as you mentioned with fusion.

reply
m101
10 months ago
[-]
People who believe in quantum can put their own money where their mouth is. Plenty of quantum stocks to speculate on out there.
reply
krastanov
10 months ago
[-]
You are right about that (well, except for all the progress in classical complexity theory and algorithms, cosmology, condensed matter physics, materials science, and sensing, which stemmed from this domain).

But, for the little it is worth, far more time passed between Babbage conceiving of a classical computer and humanity developing the technology to make classical computers reliable. Babbage died before it was possible to build the classical computers he invented.

reply
os2warpman
10 months ago
[-]
If you are going to use Babbage as the start of the clock, we must use the mechanical and electromechanical logarithmic and analytical engines created in the late 1800s/early 1900s as the stop.

We must also use 1980 as the year in which quantum computing was "invented".

As far as progress goes, in all of those fields there are naught but papers that say "quantum computing would be totally rad in these fields" or simulations that are slower than classical computers. (by, like, a lot)

reply
krastanov
10 months ago
[-]
There was a programmable electromechanical computer built in the late 1800s? Not just a simple calculator? Please share examples, this sounds awesome.

Yes, late 1980s is when I would say quantum computing was conceived.

I gave plenty of examples of positive outcomes thanks to quantum information science in my parenthetical. It is much more than the overhyped VC-funded vapor.

reply
dreamcompiler
10 months ago
[-]
When non-tech people ask me whether they should invest in quantum computing I tell them nobody alive today will live to see a return on their investment in quantum computing. Not because it's impossible but because the engineering challenges to make it truly useful outside the laboratory are too great, despite what the press releases would have you believe.

I tell them the same thing about fusion energy.

reply
_heimdall
10 months ago
[-]
Unpopular opinion I'm sure, but I very much see quantum today as smoke and mirrors. I've tried to dive down that rabbit hole and I keep finding myself in a sea of theoretical mathematics that seems to fall into the "give me one miracle" category.

I expect this won't be the last time we hear about quantum research that has been foundational to a lot of work turning out to have been manipulated, or designed poorly and never verified by other research labs.

reply
AndrewStephens
10 months ago
[-]
In 2001, a pure quantum computer using Shor's algorithm correctly gave the prime factors of 15. In 2012 they managed to find the prime factors of 21. Since then, everyone has given up on the purely quantum approach, using lots of traditional CPU time to preprocess the input instead, somewhat defeating the purpose.

It's a shame. I was really looking forward to finding out what the prime factors of 34 are.
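
For the curious, the quantum hardware's only job in Shor's algorithm is to find the period r of f(x) = a^x mod N; the factors then fall out with classical arithmetic. A minimal sketch of that classical step (textbook math, not anyone's lab code; the function name is mine):

    from math import gcd

    def factors_from_period(N, a, r):
        # r is the period of a^x mod N, the quantity the quantum
        # part of Shor's algorithm is responsible for finding
        if r % 2:                  # need an even period
            return None
        x = pow(a, r // 2, N)
        if x == N - 1:             # trivial case: a^(r/2) = -1 (mod N)
            return None
        return gcd(x - 1, N), gcd(x + 1, N)

    # For N = 15, a = 7: powers of 7 mod 15 cycle through 7, 4, 13, 1, so r = 4
    print(factors_from_period(15, 7, 4))   # (3, 5)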

reply
thesz
10 months ago
[-]
https://eprint.iacr.org/2015/1018.pdf

"As pointed out in [57], there has never been a genuine implementation of Shor’s algorithm. The only numbers ever to have been factored by that type of algorithm are 15 and 21, and those factorizations used a simplified version of Shor’s algorithm that requires one to know the factorization in advance..."

If you have a clue what these factors are, you can build an implementation of Shor's algorithm for them, I guess.

reply
EvgeniyZh
10 months ago
[-]
There was one for this year's SIGBOVIK [1]

[1] https://fixupx.com/CraigGidney/status/1907199729362186309

reply
JohnKemeny
10 months ago
[-]
If I understand correctly, they didn't actually find the prime factors, they merely verified them, so it's unfortunately up to you to factor 34. Maybe some time in the future a quantum machine can verify whether you were right.
reply
teekert
10 months ago
[-]
It's 2 and 17, I asked Claude.
reply
AndrewStephens
10 months ago
[-]
AI can do that now? Looks like I have to upgrade all of my 5-bit SSL keys.
reply
xxs
10 months ago
[-]
>5-bit SSL keys.

34 requires 6 bits, though

reply
Bootvis
10 months ago
[-]
Hence the urgency.
reply
Lionga
10 months ago
[-]
Not sure if it was more wasteful of energy asking Claude or trying to solve it with Quantum.
reply
RHSeeger
10 months ago
[-]
We could ask Claude to generate the schematics for a quantum computer that can find the prime factors of 21. Then we get the best of both worlds.
reply
solardev
10 months ago
[-]
A twelve year old could do it for 500 kcal of cookies.
reply
tomashubelbauer
10 months ago
[-]
I volunteer as a tribute
reply
solardev
10 months ago
[-]
The twelve year old would probably be too full and tired after eating you to do any math.
reply
dontlaugh
10 months ago
[-]
Humans are typically paid for their labour after they have performed it.
reply
pfdietz
10 months ago
[-]
Reminds me of the Groucho Marx line: "A child of five could understand this. Send someone to fetch a child of five."
reply
rbanffy
10 months ago
[-]
At least quantum computers are cool.
reply
_heimdall
10 months ago
[-]
I believe it was both at the same time, until you check the monthly bill at which point you forced one to become more wasteful.
reply
dreamcompiler
10 months ago
[-]
Gemini told me it was -12 and pi.
reply
xxs
10 months ago
[-]
The infamous 21 (which is half of 42) was my 1st thought when I heard 'unpopular' which is of course a very popular opinion.
reply
bwfan123
10 months ago
[-]
What amazes me is how big tech wants to be in on this bandwagon. There is FOMO, and each company announces its own chip that does something, and nobody knows what. The risk of inaction is seen as bigger than the risk of failure.

Meanwhile, a networking company wants to "network" these chips. What does that even mean? And a GPU company produces a library for computing with quantum.

Smoke and mirrors can carry on for a long time, and fool the best of them. Isaac Newton was in on the alchemist bandwagon.

reply
shalg
10 months ago
[-]
There are exactly 2 reasons we might want quantum networks.

1. 100% secure communication channels (even better, we can detect any attempt at eavesdropping, and whatever information is captured will be useless to the eavesdropper)

2. Building larger quantum computers. A high fidelity quantum network would allow you to compute simultaneously with multiple quantum chips by interfacing them.

The thing that makes quantum networking different from regular networking is that you have to be very careful to not disturb the state of the photons you are sending down the fiber optics.

I'm currently doing my PhD building quantum networking devices, so I'm a bit biased, but I think it's pretty cool :).

Now, does it matter? I'm not sure. Reason 1 isn't really that useful, because encryption is already very secure. However, if quantum computers start to scale up and some encryption methods get obsoleted, this could be nice. Also, having encryption that is provably secure would be nice regardless.

Reason 2 at the moment seems like the only path to building large scale quantum computing. Think a datacenter with many networked quantum chips.

reply
nativeit
10 months ago
[-]
> 100% secure communication channels (even better we can detect any attempt at eavesdropping and whatever information is captured will be useless to the eavesdropper)

A few follow-up questions:

1. What is it about quantum computers that can guarantee 100% secure communication channels?

2. If the communications are 100% secure, why are we worried about eavesdropping?

3. If it can detect eavesdropping, why do we need to concern ourselves with the information they might see/hear? Just respond to the detection.

4. What is it about quantum computing that would make an eavesdropper's overheard information useless to them, without also obviating said information to the intended recipients?

This is where the language used to discuss this topic turns into word salad for me. None of the things you said necessarily follow from the things said before them; they are just presented as accepted fact.

reply
SAI_Peregrinus
10 months ago
[-]
1. Nothing. Quantum Key Distribution is what they're talking about, and it still rests on classical cryptographic hardness assumptions, because there are classical cryptographic steps involved (several, actually). It just allows you to exchange symmetric keys with a party you've used classical cryptography to authenticate; it's vulnerable to MITM attacks otherwise. So you're dependent on classical signatures and PKI to authenticate the endpoints. And you're exchanging classical symmetric keys, so still dependent on the security of classical encryption like AES-GCM.

2. Because they're not 100% secure. Only the key exchange step with an authenticated endpoint is 100% secure.

3. Eavesdropping acts like a denial of service and breaks all communications on the channel.

4. It makes the information useless to everyone, both the eavesdropper and the recipients. Attempting to eavesdrop on a QKD channel randomizes the transmitted data. It's a DOS attack. The easier DOS attack is to break the fiber-optic cable transmitting the light pulses, since every endpoint needs a dedicated fiber to connect to every other endpoint.
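
To make the detection claim concrete, here is a toy simulation of intercept-resend eavesdropping on a BB84-style exchange (my own illustrative sketch, not any product's protocol; function names are mine). Eve must guess a measurement basis, and wrong guesses disturb the photons, so the error rate on the sifted key jumps from ~0% to ~25%, which Alice and Bob can detect by comparing a sample of their key:

    import random

    def sifted_key_error_rate(n, eavesdrop=False):
        """Toy BB84 with an intercept-resend eavesdropper."""
        errors = kept = 0
        for _ in range(n):
            bit = random.randint(0, 1)          # Alice's raw key bit
            a_basis = random.randint(0, 1)      # Alice's encoding basis
            photon_bit, photon_basis = bit, a_basis
            if eavesdrop:
                e_basis = random.randint(0, 1)  # Eve guesses a basis
                if e_basis != a_basis:          # wrong basis randomizes the outcome
                    photon_bit = random.randint(0, 1)
                photon_basis = e_basis          # and she re-sends in her basis
            b_basis = random.randint(0, 1)      # Bob's measurement basis
            measured = photon_bit if b_basis == photon_basis else random.randint(0, 1)
            if b_basis == a_basis:              # sifting: keep matching-basis rounds
                kept += 1
                errors += measured != bit
        return errors / kept

    print(sifted_key_error_rate(100_000))                  # ~0.00
    print(sifted_key_error_rate(100_000, eavesdrop=True))  # ~0.25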

reply
staunton
10 months ago
[-]
> Only the key exchange step with an authenticated endpoint is 100% secure.

It's 100% secure in theory, assuming a model of the hardware (which is impossible to verify, even if you could build hardware that "perfectly" satisfies all the model assumptions, which of course you also can't).

reply
SAI_Peregrinus
10 months ago
[-]
Yeah, the key exchange portion is secure. The resulting shared secret in RAM, on the other hand, is only as secure as the computer it's on. The moment you're out of the quantum realm by measuring the exchanged quanta, you lose the 100% security guarantee of the quantum portion of the key exchange. The Q part of QKD is actually secure, it's just that it's also useless and QKD as a whole exists mostly to fleece investors. It's a nerdy party trick, not a serious security mechanism.
reply
staunton
10 months ago
[-]
There is no such thing as a magical "quantum realm". Devices performing quantum state preparation or measurements are just devices. They aren't perfect and can never be made to "100%" satisfy any assumptions.

The Q part is secure in theory, assuming your devices satisfy a specific theoretical model. That's not a 100% guarantee. In fact, it's just the same kind of guarantee as we get for any other security system: "We carefully examined the system and it seems like it satisfies the assumptions of our theoretical model, thus promising security".

Not that this is a bad thing, it's just that "quantum" doesn't make anything "magically 100% secure". There's no such thing as "100% security".

reply
SAI_Peregrinus
10 months ago
[-]
Yeah, I should have specified "the photon packet in the fiber" instead of generic "quantum", but there isn't always actually a photon packet even when light is the medium, and there isn't always a fiber, and just mashing it all up as "quantum" was faster. Any interference with the actual stuff that's doing the information exchange will cause the communication to fail, so that one part of the system can't be eavesdropped on passively.
reply
foota
10 months ago
[-]
Sorry, but I think the way you're phrasing this places a burden on them to explain well-understood and widely accepted principles of quantum physics that you seem to be implying are pseudoscience.

This seems like a decent overview if you want to learn more: https://www.chalmers.se/en/centres/wacqt/discover-quantum-te....

reply
_heimdall
10 months ago
[-]
From the source you linked

> According to the laws of quantum physics, it is impossible to measure or copy an unknown state of a quantum particle without noticeably changing it.

That alone is a very clear description of how quantum mechanics is pseudoscience. It's based entirely on an untestable principle. When the initial state can't be measured, because doing so changes the state, we are left entirely unable to run a controlled study on it. You must know the state before and after an intervention to reliably and accurately deduce what happened, or to begin to understand why it happened.

This is the one miracle that we must grant to allow the rest of quantum research to become possible.

reply
immibis
10 months ago
[-]
No, it's all well-defined science. There's known mathematics for how the operations you do affect the probability distribution of the answer. The initial state can be prepared. It can't be measured after it's been prepared, because that would ruin it. But so what - that happens all the time in science. Your comment is like saying chemistry is a pseudoscience because if we put a pH indicator strip in before doing a certain reaction to prove it's an acid, the contamination by the indicator chemical stops the reaction from working.

We can simulate a quantum computer using a normal computer (in exponential time). Simulations of tiny quantum computers agree with the experiments using tiny quantum computers. We can also simulate less-tiny (but still pretty small, because it takes exponential time) quantum computers. But we haven't built an actual one of those yet. It seems they're really hard to build. But no fundamental reason is known why it should be impossible to build one. Shouldn't it just be the same as a tiny one, but bigger? The tiny ones were hard enough to build, so maybe it's just really hard and we need better techniques.

Perhaps it will turn out to be a failed branch of science that leads to no practical applications, but it's certainly real science, studying real things and making testable predictions (which are true so far). I suppose your next objection will be that since we only have tiny quantum computers, non-tiny problems are pseudoscience, but that's like saying particle physics was pseudoscience before we built the Large Hadron Collider.

Recently 3Blue1Brown made a video attempting to explain Grover's algorithm, which is one of the main applications of quantum computing, that also covers basic ideas of quantum computing and some common misconceptions - have you seen it yet? https://www.youtube.com/watch?v=RQWpF2Gb-gU and followup: https://www.youtube.com/watch?v=Dlsa9EBKDGI
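
As a taste of what such a classical simulation looks like, here is a complete statevector simulation of 2-qubit Grover search for |11> (my own sketch, in the spirit of the videos above). The state vector has 2^n amplitudes, which is exactly why this approach needs exponential resources as n grows:

    import numpy as np

    n = 2
    s = np.full(2**n, 1 / np.sqrt(2**n))      # uniform superposition |s>
    oracle = np.diag([1, 1, 1, -1.0])         # phase-flip the marked item |11>
    diffuser = 2 * np.outer(s, s) - np.eye(2**n)

    state = diffuser @ (oracle @ s)           # one Grover iteration
    print(np.round(state**2, 3))              # [0. 0. 0. 1.]: measuring yields |11>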

reply
_heimdall
10 months ago
[-]
You're describing a field based on simulations and predictions. That is interesting, but it isn't scientific as you aren't actually testing anything when you only run simulations.

A simulation is an interesting indicator for future scientific research, but it is never scientific research in and of itself.

reply
immibis
10 months ago
[-]
When your predictions are testable and agree with reality you did science.
reply
nativeit
10 months ago
[-]
I feel like most of your answer was just re-stating the question. I'm happy to admit that's almost certainly a mix of my ignorance of the topic at hand and of having been primed to view the discussions surrounding quantum computing with suspicion, but either way, that's the way it reads to this layperson.
reply
thesz
10 months ago
[-]
What is the difference between channel error or distortion and eavesdropping?
reply
staunton
10 months ago
[-]
For eavesdropping, there is someone there who cares about what you're sending and is successfully learning things about it.
reply
thesz
10 months ago
[-]
How can one quantumly differentiate between photon lost due to eavesdropping and photon lost due to some other error?
reply
staunton
10 months ago
[-]
Other than by going around and finding the eavesdropper sitting somewhere, one cannot.
reply
chatmasta
10 months ago
[-]
If studio execs have their way, Quantum DRM will be the killer use case…
reply
autoexec
10 months ago
[-]
Joke's on them, we'll just end up creating and using quantum pirating systems, or even the dreaded Quantum Analog Hole, to evade it.
reply
cjbgkagh
10 months ago
[-]
AFAIK, in the case of Microsoft, it's less FOMO and more about execs being able to impress their peers at other companies. So not really a fear of missing out, but a desire to have exclusive access to a technology that has already been socialized and widely understood to be impressive. It's a simple message: "that impressive thing you've been reading about, we're the ones building it".
reply
jhallenworld
10 months ago
[-]
Also: the big company "thought leaders" need something new to talk about every year at conferences like "Microsoft Ignite" or whatever. These people will push funding into things like quantum research just for this. I'm sure they're getting lots of mileage out of LLMs these days...

I'm maybe a little jaded having worked on whole products that had no market success, but were in fact just so that the company had something new to talk about.

reply
no_wizard
10 months ago
[-]
>Isaac Newton was in on the alchemist bandwagon

An often overlooked or unmentioned fact too!

I think that's a shame, because knowing these things humanizes the (for lack of a better term) smartest people in history.

Yes, Newton invented calculus, but he also tried to turn lead into gold!

So you too might be able to do something novel, is the idea.

reply
BlueTemplar
10 months ago
[-]
Let's not forget that the cutting edge of modern science included alchemy at that time: one of the purposes for which the first thermostat was invented was to control alchemical fires.

Or how the "perpetual motion machine" (by the same guy, Cornelis Drebbel) led to barometers and the discovery of atmospheric pressure. (Also connected to astrology, because astrologers were the ones making predictions, including of the weather.)

EDIT: bonus soundtrack: https://www.youtube.com/watch?v=Jy99ywCp3qU

(> it is only a small subset of this song, a fully differentiable song will come in the future)

reply
rowanG077
10 months ago
[-]
I think it's kinda disingenuous to look at people of the past like that. I expect the exact same curiosity for the unknown and intelligence that allowed Isaac Newton to crack calculus was also what drove him to alchemy. The only difference, looking back, is that one was correct and the other wasn't, something that no one knew for sure at the time. It's like looking back at the stone age and laughing at the cavemen: "look how stupid they were, why don't they just do X".
reply
sharpshadow
10 months ago
[-]
It’s not only big tech. Since months I’m reading about joint venture types between companies of European countries with state sponsoring in QC. When you follow the path there are a bundle of fresh created companies in every country each claiming a branch like quantum communication, quantum encryption, quantum this.. all working together and cooperating with the same companies in other EU countries.

Still trying to figure out what is going on. Are they preposition for the upcoming breakthroughs and until then it will be like the beginning in AI where many claimed to have it but actually just pretended. Additionally they likely want to access the money flow.

reply
mepian
10 months ago
[-]
It is really desperation. The low-hanging fruit of computing paradigm shifts that could fuel the "tech" industry's growth was completely plucked more than a decade ago.
reply
hnthrow90348765
10 months ago
[-]
D-Wave venturing into blockchain stuff raised a red flag for me as a layman investor: https://ir.dwavesys.com/news/news-details/2025/D-Wave-Introd...

There are maybe other reasons to invest, but this caused me to sell my shares

reply
aurareturn
10 months ago
[-]
Looks like they want to become a meme stock.
reply
EvgeniyZh
10 months ago
[-]
> won't be the last time we hear about quantum research that has been foundational to a lot of work

This research wasn't foundational to a lot of work. Most of the important/foundational works in quantum (doesn't matter if computing or general, I'm not sure which one you meant) are verified. How can you possibly base your experimental work on someone else's work if you can't replicate it?

reply
_heimdall
10 months ago
[-]
Scientific research today is largely about publishing positive results; we rarely see negative results published, and most people focus on publishing novel work rather than attempting to validate someone else's.

I agree with you, it's a terrible idea to base your work on someone else's when it hasn't been well confirmed by independent research.

I consider the source work in the OP foundational because Microsoft built so much work and spent so many resources building on top of it. It's not foundational to the entire field but it is foundational to a lot of follow-up research.

reply
EvgeniyZh
10 months ago
[-]
> I agree with you, it's a terrible idea to base your work on someone else's when it hasn't been well confirmed by independent research.

It's not about whether it's a good or bad idea. To make follow-up experiments you need to first reproduce the original experiment. That's why faking "big" experiments, like Schön's, could never work.

> Microsoft built so much work and spent so many resources building on top of it. It's not foundational to the entire field but it is foundational to a lot of follow-up research.

With all due respect, a single group (even a large one) doing a single type of experiment (even an important and complicated one) is not a lot of research. Also, Microsoft knew about the data manipulation; that's why they moved the experiments in house. They didn't do experiments under the assumption that the early Majorana papers were correct, otherwise they wouldn't have needed to develop their own complicated (and somewhat controversial) protocol to detect Majoranas. It was quite clear to everyone that, regardless of data manipulation, people were too optimistic in interpreting the Majorana signatures in those early papers.

reply
_heimdall
10 months ago
[-]
We can debate what merits consideration as a large body of research, but it seems unhelpful, as it's entirely subjective.
reply
EvgeniyZh
10 months ago
[-]
The point still stands: Microsoft didn't build their research on top of that, but started their research because of these problems.
reply
andrepd
10 months ago
[-]
Wait, just so I'm sure I understand what you're saying: you tried to read up on it but don't understand the mathematics, therefore it's smoke and mirrors?
reply
_heimdall
10 months ago
[-]
I would be lying to say I understood every bit of theoretical mathematics in the field of quantum mechanics, or any field for that matter.

Would you rather I make that blanket claim so you can rightfully call me out for not understanding some part of the mathematical theory or nomenclature instead?

reply
andrepd
10 months ago
[-]
"I don't quite understand internal combustion engines so I'm bearish on automobiles" seems a pretty dumb take, to put it bluntly.
reply
m101
10 months ago
[-]
Perhaps one only needs to realise that reality is not mathematics?
reply
gaze
10 months ago
[-]
Pessimistically I think it's most comparable to fusion. Theoretically possible but very difficult. I'm biased because I'm in the industry, but nothing has cropped up that I've seen that requires a miracle.
reply
naasking
10 months ago
[-]
> I'm biased because I'm in the industry, but nothing has cropped up that I've seen that requires a miracle.

Scaling is itself the open question. Gravitational effects start creeping in when you scale up sensitive entangled systems and we don't have a good understanding of how gravity interacts with entanglement. Entangled systems above a certain size may just be impossible.

reply
cwillu
10 months ago
[-]
That would itself be a tremendous theoretical breakthrough for all of physics.
reply
shrubble
10 months ago
[-]
The difference is that whenever it’s daytime and there aren’t many clouds in the sky, you can see an example of fusion working at scale…
reply
krastanov
10 months ago
[-]
And every time you use a transistor, observe a green plant living, or see that your hand does not pass through the table when you tap it, you see quantum mechanical effects working at scale. Every time you use a telescope, you see quantum information processing (interference) at scale. The control over that process is the difficult part, same as with fusion.
reply
walleeee
10 months ago
[-]
Transistors, plants, hands, and tables are all macroscopic. We see quantum mechanical effects, but we do not see stable superpositions. None of these real-world examples seem to shield a quantum state from decoherence in the way a quantum computer needs to. The sun demonstrates clearly that fusion is controllable (albeit in a regime we struggle to match). I don't think your examples show that a quantum state can be controlled at the scale we need it to be, and I don't know of any extant phenomena that do. But I am no expert, yell at me if I'm wrong.
reply
krastanov
10 months ago
[-]
On the contrary, we have plenty of examples of long-lived (many hours, days, or more) superposition in solid-state materials and chemical systems. For a quantum computer you need both (1) something that can retain superposition and (2) something that is easily controllable. Point 2 is the difficult one, because if you can control it (interact with it), the "environment" can interact with it and destroy it as well.

All of the examples of macroscopic effects above are possible thanks to effects explainable only through the existence of superposition. It is just that they are not particularly controllable and thus not of interest for storing quantum information.

Another fun point: the example you are focusing on, fusion happening in the sun, is only possible due to the quantum tunneling effect, which is itself dependent on "superposition" being a real thing. Looking past the clouds at our star is already an example of quantum mechanics working, which is very much an experimental observation of an effect possible only thanks to the existence of superposition.

reply
walleeee
10 months ago
[-]
Thanks, I'll stand partially corrected. It was a badly written comment, or muddled thoughts, or both. I didn't mean to doubt the existence of superposition or our ability to achieve it for sustained periods.

But doesn't the point stand? I meant to say that none of the examples seem to demonstrate both 1) and 2). Do we know of any natural system which can maintain and precisely manipulate a quantum state like a quantum computer needs to?

reply
immibis
10 months ago
[-]
Well no, but we also don't know any natural system that does fusion using an amount of material smaller than about a billion billion billion tonnes. Getting from A (what we know works) to B (what we'd like to work) is the hard problem. That's called research.
reply
_heimdall
10 months ago
[-]
I don't think the argument is whether nuclear fusion is possible at all. The question is whether it is possible to reliably control it, and whether the captured energy output will be worth the inputs required to run the system.
reply
_heimdall
10 months ago
[-]
Quantum state is the miracle, in my opinion. By definition it can never really be confirmed.

You cannot observe the initial state, because that collapses the superposition. Said more simply, we can only see the end result and make educated guesses as to how it happened and what the state was prior to the experiment.

reply
gaze
10 months ago
[-]
There are plenty of valid criticisms of the quantum computing industry, but the idea that quantum mechanics is whole-cloth invalid is nonsense. You can absolutely verify a quantum state through tomography.
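
Concretely, for a single qubit, tomography means estimating the Pauli expectation values <X>, <Y>, <Z> from repeated measurements on many identically prepared copies, then reconstructing the density matrix as rho = (I + <X>X + <Y>Y + <Z>Z)/2. A minimal sketch of the reconstruction step (textbook formula; the code is illustrative):

    import numpy as np

    I2 = np.eye(2)                      # Pauli matrices
    X = np.array([[0, 1], [1, 0]])
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.array([[1, 0], [0, -1]])

    def reconstruct(ex, ey, ez):
        # density matrix from measured Pauli expectation values
        return (I2 + ex * X + ey * Y + ez * Z) / 2

    # an ideal |+> state gives <X>=1, <Y>=0, <Z>=0:
    print(reconstruct(1, 0, 0).real)    # [[0.5 0.5] [0.5 0.5]] == |+><+|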
reply
77pt77
10 months ago
[-]
> I very much see quantum today as smoke and mirrors

The most accurate and experimentally tested theory of reality is "smoke and mirrors".

There are so many other areas to say that about, even in physics. But this?...

reply
tokai
10 months ago
[-]
With the context of the article it's clear that GP means quantum computing.
reply
yesbut
10 months ago
[-]
It is all a scam. The research side is interesting for what it is, but the idea of having any type of useful "quantum computer" is sci-fi make believe. The grifters will keep stringing investors and these large corporations along for as long as possible. Total waste of resources.
reply
Mistletoe
10 months ago
[-]
I became disillusioned when I learned that 5x3=15 was the largest number that has been factored by a quantum computer without tricks or scams. Then I became even more disillusioned when I learned the 15 may not be legit…

https://www.reddit.com/r/QuantumComputing/comments/1535lii/w...

reply
cess11
10 months ago
[-]
IBM has given the public access to qubits for close to a decade, including a free tier, and as far as I know it has produced a stream of research articles that fizzled out several years ago, and nothing generally useful.

https://en.wikipedia.org/wiki/IBM_Quantum_Platform

reply
roflmaostc
10 months ago
[-]
Why do you think so?

Your words sound like what people said in the 40s and 50s about computers.

reply
os2warpman
10 months ago
[-]
In the 40s and 50s programmable general-purpose electronic computers were solving problems.

Ballistics tables, decryption of enemy messages, and more. Early programmable general-purpose electronic computers could, from the moment they were turned on, solve problems in minutes that would have taken human computers months or years. In the 40s, ENIAC proved the feasibility of thermonuclear weaponry.

By 1957 the promise and peril of computing entered popular culture with the Spencer Tracy and Katharine Hepburn film "Desk Set" where a computer is installed in a library and runs amok, firing everybody, all while romantic shenanigans occur. It was sponsored by IBM and is one of the first instances of product placement in films.

People knew "electronic brains" were the future the second they started spitting out printouts of practically unsolvable problems instantly-- they just didn't (during your timeframe) predict the invention and adoption of the transistor and its miniaturization, which made computers ubiquitous household objects.

Even the quote about the supposed limited market for computers, trotted out from time to time to demonstrate the hesitance of industry and academia to adopt computers, is wrong.

In 1953, when Thomas Watson supposedly said that "there's only a market for five computers", what he actually said was "When we developed the IBM 701 we created a customer list of 20 organizations who might want it and because it is so expensive we expected to only sign five deals, but we ended up signing 18" (paraphrased).

Militaries, universities, and industry all wanted every programmable general-purpose electronic computer they could afford, the second they became available, because they all knew computers could solve problems.

Included for comparison is a list of problems that quantum computing has solved:

reply
phire
10 months ago
[-]
Exactly. There were many doubts about exactly how useful computers would be. But they were already proving their usefulness in some fields.

The current state of quantum computers is so much worse than that. It's not just that they have produced zero useful results. It's that when these quantum computers do produce results on toy problems, the creators are having a very hard time even proving the results actually came from quantum effects.

reply
RhysabOweyn
10 months ago
[-]
I don't think that you can really make that comparison. "Conventional" computers had more proven practical usage (especially by nation states) in the 40s/50s than quantum computing does today.
reply
empath75
10 months ago
[-]
By the 1940s and 50s, computers were already being used for practical and useful work, and calculating machines had a _long_ history of being useful; it didn't take long to go from the _idea_ of a calculating machine to something that people paid for and used because it had practical value.

They've been plugging along at quantum computers for decades now and have not produced a single useful machine (although a lot of the math and science behind it has been useful for theoretical physics).

reply
lucianbr
10 months ago
[-]
Survivor bias. Just because a certain thing seemed like a scam and turned out useful does not mean all things that seem like a scam will turn out useful.
reply
monooso
10 months ago
[-]
GP's comment didn't suggest that every supposed scam will turn out to be useful.

Quite the opposite, in fact. It was pointing out that some supposed scams do turn out to be useful.

reply
mrguyorama
10 months ago
[-]
GP is just blatantly wrong. Electronic computation was NEVER considered a "scam".

The Navy, Air Force, government, private institutions, etc didn't dump billions of funding into computers because they thought they were overrated.

reply
belter
10 months ago
[-]
You can't put a man on the Sun just because you put one on the Moon.
reply
skywhopper
10 months ago
[-]
Do you have any citations for that?
reply
roflmaostc
10 months ago
[-]
reply
reaperducer
10 months ago
[-]
Do you have any citations for that?

I'm not the OP, but when you're of a certain age, you don't need citations for that. Memory serves. And my family was saying those sorts of things and teasing me about being into computers as late as the 1970's.

reply
autoexec
10 months ago
[-]
I can attest to the fact that people who didn't understand computers at all were questioning the value of spending time on them long after the 1970s. The issue is that there are people today who do understand quantum computing that are questioning their value and that's not a great sign.
reply
oldgradstudent
10 months ago
[-]
In the 1970s both my parents were programming computers professionally.

Computing was already a huge industry. IBM's revenues alone were in the multi-billion-dollar range in the 1970s. And a billion dollars was A LOT of money in the 1970s.

reply
Henchman21
10 months ago
[-]
When you’re of a certain age, time has likely blurred your memories. Citation becomes more important then. Source: me I’m an old SOB.
reply
reaperducer
10 months ago
[-]
Source: me I’m an old SOB.

By your own criteria, a citation better than "me" is needed.

reply
pessimizer
10 months ago
[-]
Looks like you've got it.
reply
DangitBobby
10 months ago
[-]
I would actually like to read about that, though.
reply
tjpnz
10 months ago
[-]
Quantum annealers have been working on real world problems for a while now - assuming they can be expressed as combinatorial optimization problems of course.
reply
krastanov
10 months ago
[-]
But there is no scalable computational advantage with quantum annealers. They are not what most people in the field would call a "(scalable/digital) quantum computer".
reply
teekert
10 months ago
[-]
Sabine was already skeptical in February [0]. Although to be fair, she usually is :) But in this field, I think it is warranted.

[0]: https://backreaction.blogspot.com/2025/02/microsoft-exaggera...

reply
dsabanin
10 months ago
[-]
Is there something she is not skeptical of or controversial about?
reply
andrepd
10 months ago
[-]
It's her whole schtick: being contrarian. Maybe she had a point about the excesses of string theory but she's been full-on into crackpot territory for a while now, sadly.
reply
hinkley
10 months ago
[-]
I don’t know if she has changed or I have changed but I don’t enjoy her stuff anymore. Maybe I should watch some of her old stuff and figure that out.
reply
teekert
10 months ago
[-]
I’m starting to feel the same indeed.
reply
nlitened
10 months ago
[-]
Einstein equations
reply
dsabanin
10 months ago
[-]
Hm, that may be controversial in itself, depending on where you stand in the current cosmology.
reply
phire
10 months ago
[-]
There was one video where she said she believed time travel was "probably possible"
reply
eqvinox
10 months ago
[-]
Sabine Hossenfelder has grown a little… controversial… lately. You should probably do some googling (or YouTube searching, in this case.) It's not entirely clear to me what's going on but some of her videos do raise serious question marks.
reply
antidumbass
10 months ago
[-]
I've found great success in ignoring, entirely, baseless aspersions cast by faceless anon avatars about people in the public eye.
reply
unaindz
10 months ago
[-]
Good for you. I, on the other hand, like it when people raise those concerns so I can check for myself whether I need to change my views in any way.
reply
matkoniecz
10 months ago
[-]
Can you be more specific about what you are alleging?

And a little controversy is not automatically a problem or a reason to discount/ignore someone anyway.

reply
HideousKojima
10 months ago
[-]
There was an email she claimed to have received many years ago from another academic, essentially saying "you're right that a lot of academic research is BS and just a jobs program for academics, but you shouldn't point that out because it threatens a lot of people's livelihoods." Some people are claiming she fabricated this alleged email, etc.; I haven't looked too much into it myself.
reply
hinkley
10 months ago
[-]
I end up playing father confessor often enough at work that I have had to launder things people have complained about.

When you are trying to make the right calls for a team, you need to know what the pushback is, but the bullies and masochists on the team don’t need to know who specifically brought forward a complaint as long as the leadership accept it as a valid concern.

So if everyone knows I had a private meeting with Mike yesterday it’s going to be pretty fucking obvious that I got this from Mike unless I fib a bit about the details.

Saying a conversation during a visit happened in email sounds like the sort of thing I might lie about while not making up the conversation from whole cloth.

Not that Sabine is perfect. I’ve let the YouTube algorithm know I want to see less of her stuff. But just because there is no email doesn’t mean there was no conversation.

reply
eqvinox
10 months ago
[-]
No, I'm intentionally not taking a position or alleging anything. I'm pointing out the existence of some controversy. It's up to you to decide whether you want to look into it, and if yes, what sources to prefer.
reply
immibis
10 months ago
[-]
She has a tendency to be wrong on things outside her domain of expertise. It's the classic case of being an expert in one field and thinking you're an expert in all of them.
reply
dimal
10 months ago
[-]
Please give specific examples. I keep seeing vague comments like this about her, but very little in the way of specifics. Without specifics, this is just ad hominem rumor mongering.
reply
krastanov
10 months ago
[-]
Extreme specifics: her comments on work out of MIT on color-center qubits were basically "finally an example of actual progress in quantum computing because of reason A, B, C". That statement was in the class of "not even wrong"; it was just a complete non sequitur. People actually in the fields she comments on frequently laugh at her uninformed nonsense. In this particular case, the people who did the study she praised were also among the ones laughing at her.
reply
eviks
10 months ago
[-]
This is still extremely vague without A, B, C and an explanation of why there is no connection, i.e., specifics on why she was wrong. Just more vague references to other people's reactions.
reply
krastanov
10 months ago
[-]
She said "because the color centers are small which would enable miniaturization". Miniaturization is the last thing these are good for, and while they are "small" for a human, they are gigantic for an electronic device. She had absolutely no idea why these new devices are useful but made sweeping comments about how they will change the field. She hyped up something silly while at the same time she complains about hype over actually interesting results. She betrayed complete incompetence on the topic while pretending to be an expert. And she does that constantly, over such trivialities, that it is just exhausting to argue about it.
reply
eviks
10 months ago
[-]
So it's not "because of reason A, B, C", but just "A", and "small which would enable miniaturization" is also not "non sequitur", let alone "complete".

By the way, the "B" was the ability to "target individual qubits more easily". In what way is this "complete non sequitur"

And you also "forgot" to specify "why these new devices are useful" so that we can't check whether she even mentioned it to asses whether she has no idea (is it interconnectedness of the modular systems ("C"), single-step ease of transfer on the CMOS, compatibility with modern semiconductor fab processes, remote control, something else entirely?)

> while pretending to be an expert

Where was that? Could you link to her statement pretending to be an expert on quantum computing?

> that it is just exhausting to argue about it.

Indeed, it's much easier to be very vague in your arguments, because then no one can verify any claims and you don't have to respond when such verification fails to match.

reply
chermi
10 months ago
[-]
A broken clock....
reply
chermi
10 months ago
[-]
Based on the comments in this thread... Guys, Microsoft fuckery doesn't invalidate an entire field.

I think certain VCs are a little too optimistic about quantum computing timelines, but that doesn't mean it's not steadily progressing. I saw a comment talking about prime factorization from 2001 with some claim that people haven't been working on pure quantum computing since then?

It's really hard. It's still firmly academic, with the peculiar factor that much of it is industry backed. Google quantum was a UCSB research lab turned into a Google branch, while still being powered by grad students. You can begin to see how there's going to be some culture clash and unfortunate pressure to make claims and take research paths atypical of academia (not excusing any fraud, edit: also to be very clear, not accusing Google quantum of anything). It's a hard problem in a funky environment.

1) It's a really hard problem. Anything truly quantum is hard to deal with, especially if you require long coherence times. Consider the entire field of condensed matter (+ some AMO). Many of the experiments to measure special quantum properties/confirm theories do so in a destructive manner; I'm not talking only about the quantum measurement problem, I'm talking about the probes themselves physically altering the system such that you can only get one or maybe a few good measurements before the sample is useless. In quantum computing, things need to be cold and isolated, yet still read/write accessible over many, many cycles, in order to be useful.

2) Given the difficulty, there have been many proposals for how to meet the "practical quantum computer" requirement. These range from giving up on a true general-purpose quantum computer (quantum annealers) to NV centers, neutral/ionic lattices, SQUID/Josephson-based devices, photonics, hybrid systems with mechanical resonators, and yeah, topological/anyon shit.

3) It's hard to predict what will actually work, so every approach is a gamble, and different groups take different gambles. Some take bigger gambles than others. I'd say topological quantum was a pretty damn big gamble given how new the theory was.

4) Then you need to gradually build up the actual system + infrastructure, validating each subsystem, then subsystem interactions, and finally full systems. Think system preparation, system readout, system manipulation, isolation, gate design... Each piece of this could be multiple PhDs' and postdocs' worth of work across physics, ECE/CSE, ME, and CS. This is deep expertise and specialization.

5) Then if one approach seems to work, however poorly*, you need to improve it and scale it. Scaling is not guaranteed. This will mean many more PhDs' worth of work trying to improve subsystems.

6) Again, this is really hard. Truly, purely quantum systems are very difficult to work with. Classical computing is built on transistors, which operate just fine at room temperature** (plenty of noise, no need for cold isolation) with macroscopic classical observables/manipulations like current and voltage. Yes, transistors work because of quantum effects, and more recent transistors use quantum effects (tunneling) more directly. Even so, the "atomic" units of memory are still effectively macroscopic. The systems as a whole are very well described classically, with only practical engineering concerns related to putting things too close together, impurities, and heat dissipation. Not to say that any of that is easy at all, but there's no question of principle like "will this even work?"

* With a bunch of people on HN shitting on how poorly it works + a bunch of other people saying it's a full-blown quantum computer + probably higher-ups trying to make you say it's a real quantum computer or something about quantum supremacy.

** Even in this classical regime, think how much effort went into redundancy and encoding/decoding schemes to deal with the very rare bit flips. Now think of what's needed to build a functioning quantum computer at similar scale.
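
To make that concrete, here's a minimal sketch (my illustration, not from the thread) of the kind of classical redundancy being referred to: a 3-bit repetition code with majority-vote decoding, in Python. With independent flip probability p per bit, the logical error rate drops from p to roughly 3p^2; quantum error correction has to achieve something comparable without being able to simply copy the state (no-cloning), which is a large part of the overhead.

    import random

    def encode(bit):
        # Replicate the bit three times.
        return [bit, bit, bit]

    def noisy_channel(bits, p):
        # Flip each bit independently with probability p.
        return [b ^ (random.random() < p) for b in bits]

    def decode(bits):
        # Majority vote recovers the bit if at most one flip occurred.
        return int(sum(bits) >= 2)

    p, trials = 0.01, 100_000
    errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
    print(f"raw flip rate: {p}, logical error rate: {errors / trials:.5f}")
    # Expected logical rate: 3*p**2 - 2*p**3 ~ 0.0003, about a 30x improvement.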

No, I don't work in quantum computing, don't invest in it, have no stake in it.

reply
wordpad
10 months ago
[-]
Why couldn't single-use quantum computers be a viable path?

General computing is great, but we built the Large Hadron Collider to validate a few specific physics theories; couldn't we make do with a single-use quantum computer for important problems? Prove out some physics simulation, or break some military encryption or something?

reply
chermi
10 months ago
[-]
Oh sure, I'm all for that in the meantime. But the people funding this are looking for a big payoff. I want to be clear that this is not my field and I'm probably a bit behind on the latest, especially on the algorithmic side.

IIRC some of them have done proof-of-principle solutions to the hydrogen atom ground state, for example. I haven't kept up, but I'm guessing they've solved more complicated systems by now. I don't know if they've gone beyond ground states.

Taking this particular problem as an example... The challenge, in my mind, is that we already have pretty good classical approaches to the problem. Say the limit of current approaches is characterized by something like the number of electrons (I don't know the actual scaling factors), and call that number N_C(lassical). I suspect the difficulty of building a special-purpose quantum ground-state solver that can handle N_Q >> N_C is close enough to the difficulty of scaling a more general quantum computer to a problem size of moderately smaller magnitude that it would be hard to justify funding the special-purpose machine over the generic one.
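
To put a rough number on where N_C might sit, here's a back-of-the-envelope sketch (my arithmetic, not the commenter's) of why brute-force classical ground-state solvers hit a wall: storing the full quantum state of n spin-orbitals takes 2^n complex amplitudes.

    # Memory needed just to hold the state vector, one complex128 per basis state.
    for n in (10, 20, 30, 40, 50):
        dim = 2 ** n
        gigabytes = dim * 16 / 1e9
        print(f"n = {n:2d} spin-orbitals -> 2^n = {dim:.3e} amplitudes ~ {gigabytes:,.1f} GB")
    # Around n ~ 50, the state vector alone outgrows any classical machine.

(Practical classical methods like DFT or coupled cluster push far past this with approximations, which is exactly why the comparison point N_C is fuzzy.)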

I could be way off, and it's very possible there are new algorithms for specific problems that I'm unaware of. Such algorithms, with an accompanying special-purpose quantum computer, could make its construction investible, in the sense that efficient solutions to the problem under consideration are worth enough to offset the cost. Sorry, that was very convoluted phrasing, but I'm on my phone and I gtg.

reply
russianGuy83829
10 months ago
[-]
that's going to be a banger bobbybroccoli video
reply
christkv
10 months ago
[-]
So the chip is a paperweight?
reply
nottorp
10 months ago
[-]
It's 5 years away, just like cold fusion and AI.
reply
whynotmaybe
10 months ago
[-]
We've had cold fusion for years : https://en.wikipedia.org/wiki/Adobe_ColdFusion

And while searching for this silly joke, I'm now baffled by the fact that it's still alive!

reply
nottorp
10 months ago
[-]
For every framework that ever existed, there's somewhere out there a computer running it and doing real work with it, without any updates since autumn 1988, while the Google-wannabe solo founders worry about the best crutch^H^H^H^H tooling, their CI/CD pipelines, and not scaling.
reply
pelagicAustral
10 months ago
[-]
I propose everything is a paperweight until you show an implementation.
reply
sschueller
10 months ago
[-]
Sadly, I have the feeling some people are starting to just "play" at being scientists/engineers rather than actually doing the real work anymore.
reply
77pt77
10 months ago
[-]
MBA science.

Only perception matters?

reply
nathan_compton
10 months ago
[-]
"Fake it till you make it" was practically the motto of young scientists when I was matriculating. In fairness, I don't think they really meant "fake your research" but our entire incentive/competition based society encourages positive misrepresentation - you can't do science, good or bad, if you get competed out of the system entirely.

Guy Debord wrote a book about what he called "The Society of the Spectacle", wherein he argues that capitalism, mostly by virtue of transforming objects into cash at the point of exchange (that is, a person can take the money and run), tends to cause all things to become evacuated, reduced as much as possible to their image rather than their substance.

I believe even G.K. Chesterton understood this when he said that the purpose of a shovel is to dig, not to be sold, but that capitalism tends to see everything as something to be sold primarily and only secondarily as something to be used.

There is some truth in all this, I think, though obviously the actual physical requirements of living and doing things place some restrictions on how far we can transform things into their images.

reply
caseyy
10 months ago
[-]
"Fake it till you make it." has turned into "fake it." recently, and it seems to be working disturbingly well in society.
reply
devwastaken
10 months ago
[-]
Intellectual property is the entire point of modern tech. It doesn't matter if it doesn't work. They want the IP and sit on it; that way, if someone else actually does the work, they can claim they own it.

Repeal IP laws and regulate tech.

reply
trentnix
10 months ago
[-]
Looks like the end of the world has been delayed.
reply
anonym29
10 months ago
[-]
Microsoft's finest, ladies and gentlemen.
reply
m101
10 months ago
[-]
I bet you quantum computing will go the way of Newtonian physics: wrong, and only proven so by our improving ability to measure things.

It's as if Newton had insisted that he was right despite the orbit of Mercury being weird, and blamed his telescope.

Physics is just a description of reality. If your business depends on reality being the same as the model, then you're going to have an expensive time.

reply
staunton
10 months ago
[-]
We still use Newton's equations for building rockets (and a lot of other things). A theory being replaced by a better one does not mean devices motivated by the obsoleted theory stop working...
reply
m101
10 months ago
[-]
We use general relativity for anything fast-moving, I think. Not sure, but pretty sure. GPS wouldn't work with Newton alone. But that's the point: Newton mostly works, to within an error of measurement.
reply
staunton
10 months ago
[-]
> anything fast moving

Examples of using GR are few, GPS being the most prominent one and astronomy another. For making this point, SR would be a better example.

Maybe a better description would be "we use GR to describe extremely precise time measurements, or the movement of extremely large things". The latter has not yet led to any devices being built, as far as I know.

reply
m101
10 months ago
[-]
My point with these examples is that we're expecting our quantum models to be true at orders of magnitude more precision than we have currently tested, and that precision is necessary for a quantum computer. It's as if no one even considers that the models might hold only to within the errors of what we have currently tested. Proof is in the pudding.
reply
staunton
10 months ago
[-]
Arguably, if building quantum computers leads to quantum mechanics being falsified, that completely justifies the effort of trying to build them.

The "disappointing" scenarios would be either that the engineering task turns out to be so hard that it stays "20 years away" for another century, or that quantum computers get built but nobody can find a really impactful and useful application.

reply
satori99
10 months ago
[-]
Indeed

"It turns out that GPS must account for both special relativity and general relativity to deliver position at 1-meter level and time at 100-nanosecond level to its users."

https://www.gpsworld.com/inside-the-box-gps-and-relativity/
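
The numbers behind that make a nice worked example. Here's a quick sanity check with standard textbook values (my arithmetic, not taken from the linked article):

    # Daily clock drift of a GPS satellite relative to the ground, in microseconds.
    c = 2.998e8                          # speed of light, m/s
    v = 3.874e3                          # GPS orbital speed, m/s
    GM = 3.986e14                        # Earth's gravitational parameter, m^3/s^2
    r_sat, r_earth = 2.656e7, 6.357e6    # orbit radius and Earth radius, m
    day = 86400

    sr = -0.5 * (v / c) ** 2 * day * 1e6                        # SR: moving clock runs slow
    gr = (GM / c ** 2) * (1 / r_earth - 1 / r_sat) * day * 1e6  # GR: higher clock runs fast
    print(f"SR: {sr:+.1f} us/day, GR: {gr:+.1f} us/day, net: {sr + gr:+.1f} us/day")
    # Roughly -7 + 46 = +38 microseconds/day; uncorrected, that's
    # c * 38e-6 s ~ 11 km of accumulated ranging error per day.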

reply
thrance
10 months ago
[-]
Uuh no? Quantum computing relies on some aspects of quantum physics that, at this point, have been thoroughly and extensively experimentally verified.

If there are objections to quantum computing, and I believe there are many, those are to be found in questioning the capability of current engineering to build a quantum computer, or the usefulness of one if it could be built.

reply
m101
10 months ago
[-]
As the old saying goes: the proof is in the pudding. And quantum computing has produced zero pudding. All hype, and zero pudding. When they actually do something useful (like the equivalent of general relativity fixing GPS), then we can see it as a useful theory.
reply
Hilift
10 months ago
[-]
From 12 Years Ago: August 26, 2013

"In early May, news reports gushed that a quantum computation device had for the first time outperformed classical computers, solving certain problems thousands of times faster. The media coverage sent ripples of excitement through the technology community. A full-on quantum computer, if ever built, would revolutionize large swathes of computer science, running many algorithms dramatically faster, including one that could crack most encryption protocols in use today.

"Over the following weeks, however, a vigorous controversy surfaced among quantum computation researchers. Experts argued over whether the device, created by D-Wave Systems, in Burnaby, British Columbia, really offers the claimed speedups, whether it works the way the company thinks it does, and even whether it is really harnessing the counterintuitive weirdness of quantum physics, which governs the world of elementary particles such as electrons and photons."

https://metanexus.net/proof-quantum-pudding/

https://spectrum.ieee.org/d-wave-quantum

reply
m101
10 months ago
[-]
And 12 years from now we will be reading the same things. It's just shocking the amount of faith people have in this physics. Physics, to me, is a dying subject. The future is engineering and computing.
reply
m101
10 months ago
[-]
Also, how about thinking about it this way: either there's this magical property of the universe that promises to do these magical things which no one can even dream of achieving otherwise, or there's something wrong with physicists' interpretation of physics. Oh, and they haven't proved it yet, but they promise they will, very soon.
reply