Initialize the coded bit, using a 59-qubit repetition code that corrects bit flips but not phase errors, in IPython:
In [1]: A = 59 * (False,)
Write a decoder:

In [2]: def decode(physbits):
   ...:     return sum(physbits) > len(physbits) / 2.0
Wait two hours [0]. I'll be lazy and only decode at the end of the two hours, but if I wanted the full advantage of error correction, I would periodically run the error-correction algorithm and fix detected errors along the way. Now decode it:

In [3]: decode(A)
Out[3]: False
Holy cow, it worked! I'm being rather tongue-in-cheek here, of course. But it's genuinely impressive that my laptop can stick 59 bits into DRAM cells containing a handful of electrons each, and all of them are just fine after several hours. And it's really, really impressive that this research group got their superconducting qubits to store classical states well enough that their rather fancy error-correcting device could keep up and preserve the logical state for two hours. [1]
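To see why running correction periodically matters, here's a toy classical simulation (entirely my own illustration, with a made-up flip probability; nothing here is from the paper). Physical bits flip independently each round, and majority-vote correction either runs every round or only once at the end:

    import random

    # Toy model: each physical bit flips independently with probability p per
    # round. Compare correcting every round vs. decoding only once at the end.
    def run(rounds, n=59, p=0.01, correct_each_round=True):
        bits = [False] * n
        for _ in range(rounds):
            bits = [b ^ (random.random() < p) for b in bits]
            if correct_each_round:
                majority = sum(bits) > n / 2
                bits = [majority] * n          # rewrite the corrected codeword
        return sum(bits) > n / 2               # decoded value (should stay False)

    trials = 200
    for periodic in (True, False):
        failures = sum(run(1000, correct_each_round=periodic) for _ in range(trials))
        print("periodic" if periodic else "end-only", f"{failures}/{trials} logical failures")

With correction every round, a logical failure requires 30 of the 59 bits to flip within a single round, which essentially never happens. Decode only at the end and each bit has drifted to a coin flip, so the logical value is lost about half the time.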
But this isn't quantum error correction going FOOM, per se. It's classical. A bit-flip-corrected but not phase-flip-corrected qubit is precisely a classical bit, no more, no less.
The authors also demonstrated that they could do the same trick correcting phase flips and not bit flips, but that's a tiny bit like turning the experiment on its side and getting the same result. Combining both demonstrations is impressive, though -- regardless of whether you treat the level in my laptop's DRAM cells as the Z basis or the X basis, they only work in one single basis. You cannot swap the roles of level and phase in DRAM and have it still work. But the researchers did pull that off on their two-hour-half-life device, and I find that quite impressive; the fact that it worked strongly suggests that their device is genuinely 59 qubits, whereas no one could credibly argue that my laptop contains giga-qubits of DRAM. Fundamentally, you can protect classical bits with a repetition code, but you cannot do quantum computation with it. You need fancier, and more error-sensitive, codes for that, and that's what the second half of the article is about.
[0] I didn't actually wait two hours. But I could have waited a week and gotten the same result.
[1] The researchers' qubits are nowhere near as good as my DRAM. They had to run their error correction a billion times or so during the course of their two hours. (My DRAM refreshes maybe ten thousand times over the course of two hours, and one can look at DRAM refreshes as correcting something a bit like a repetition code.)
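To make the "one single basis" point concrete, here's a minimal statevector sketch (my own illustration, shrunk to a 3-qubit bit-flip code for brevity): a phase flip on any one physical qubit silently flips the logical phase, and the bit-flip syndrome sees nothing.

    import numpy as np

    # 3-qubit bit-flip repetition code; basis state |b2 b1 b0> is index b2*4+b1*2+b0.
    ket000 = np.zeros(8); ket000[0] = 1.0
    ket111 = np.zeros(8); ket111[7] = 1.0
    plus_L  = (ket000 + ket111) / np.sqrt(2)   # logical |+>
    minus_L = (ket000 - ket111) / np.sqrt(2)   # logical |->

    def z_on(qubit, state):
        # Phase flip on one physical qubit: negate amplitudes where that bit is 1.
        signs = np.array([-1.0 if (i >> qubit) & 1 else 1.0 for i in range(8)])
        return signs * state

    def syndrome(state):
        # Bit-flip checks: expectation values of Z0Z1 and Z1Z2.
        def zz(a, b):
            signs = np.array([(-1.0) ** (((i >> a) & 1) ^ ((i >> b) & 1)) for i in range(8)])
            return float(np.vdot(state, signs * state).real)
        return zz(0, 1), zz(1, 2)

    damaged = z_on(0, plus_L)                  # one phase flip, anywhere
    print(syndrome(plus_L), syndrome(damaged)) # (1.0, 1.0) both: nothing detected
    print(abs(np.vdot(minus_L, damaged))**2)   # 1.0: the logical state became |->

Run the same code with the roles of X and Z swapped and you get the mirror-image result: a code that shrugs off bit flips is distance-1 against phase flips, which is why a bit-flip-only code is classical.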
This is perhaps not clear enough, but the title refers to a pattern. For classical bits on a quantum computer this pattern is already playing out (as shown in the cited experiments), and for quantum bits I think it's about to play out.
Classical storage of classical bits is still far more reliable, of course. Hell, a rock chucked into one bucket or another is still more reliable. We'll never beat the classical computer at storing classical bits... but the rock in a bucket has some harsh competition coming.
I should maybe also mention that arbitrarily good qubits are a step on the road, not the end. I've seen a few Twitter takes making that incorrect extrapolation. We'll still need hundreds of these logical qubits. It's conceivable that quantity also jumps suddenly... but that would require even more complex block codes to start working (not just surface codes), and I'm way less sure whether that will happen in the next five years.
Not GP but yes. I'm reasonably confident that we will have quantum computers that are large and stable enough to have a real quantum advantage, but that's mostly because I believe Moore's law is truly dead and we will see a plateau in 'classical' CPU advancement and memory densities.
> I feel like it’s going to get exponentially harder and expensive to get very small incremental gains and that actually beating a classical computer isn’t necessarily feasible (because of all the error correction involved and difficulty in manufacturing a computer with large number of qbits)
I don't think people appreciate or realize that a good chunk of the innovations necessary to "get there" with quantum are traditional (albeit specialized) engineering problems, not new research (though breakthroughs can speed things up). I'm a much bigger fan of the "poking lasers at atoms" style of quantum computer than the superconducting ones for this reason: the engineering is more like building cleaner lasers and better AOMs [0] than figuring out how to supercool vats of silicon and copper. It's outside my area of expertise, but I would expect innovations that support better lithography to also benefit these types of systems, though less directly than superconducting ones.
Source: I worked on hard-realtime control systems for quantum computers in the past. Left because the academic culture can be quite toxic.
Classical CPUs have slowed but not stopped. More importantly, quantum machines haven't even been built yet, let alone been proven possible to scale up arbitrarily. They haven't even demonstrated they can factor 17 faster than a classical computer.
A factor of 1000 may well be the difference between destroying Shor's-algorithm-prone cryptography now and destroying it later, though.
Quantum computers are different. It seems quite unlikely that anyone will build the equivalent of, say, an XOR gate that takes two single physical qubits in and spits out two physical qubits (this is quantum land -- the standard gates neither create nor destroy qubits, so the number of inputs and outputs is the same) that works well enough to actually represent that particular operation in whatever software is being run. Instead, each logical operation will turn into multiple physical operations that work like an error-correcting code. The easy classical trick, where your transistor is janky at 0.9V so you run it at 1.0V, amounts to moving more electrons around per operation; that approach is analogous to correcting bit flips but not phase errors, and it makes your quantum computer stop being quantum.
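For concreteness, here's what the reversible two-in/two-out cousin of XOR looks like -- the standard CNOT gate (the gate is textbook material; the code is just my illustration):

    import numpy as np

    # CNOT on basis states |a b>: (a, b) -> (a, a XOR b). Two qubits in, two out.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    assert np.array_equal(CNOT @ CNOT, np.eye(4, dtype=int))  # its own inverse

    for a in (0, 1):
        for b in (0, 1):
            state = np.zeros(4, dtype=int); state[2 * a + b] = 1
            idx = int(np.argmax(CNOT @ state))
            print(f"({a},{b}) -> ({idx >> 1},{idx & 1})")  # second output is a XOR b

The point above is that nobody expects this little 4x4 matrix to run reliably on two bare physical qubits; each logical CNOT gets compiled into many physical operations inside a code.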
And here's where it gets messy. The physical qubit technologies that are best for longish-term data storage may not be the same as the technologies that are good for computation, and those may not be the same technologies that are good for communication at a distance. (For example, photons are pretty good for transmitting quantum states, but transferring a qubit from a different technology to a photon state and back is not so easy, and demonstrations of computation with photons have been pretty limited.) As an extreme example, one can, in principle, store quantum states quite robustly and even compute with them if one can find the correct kind of unobtanium (materials with the appropriate type of non-Abelian anyon surface states), but, last I heard, no one had much of an idea how to get the qubit states off the chip even if such a chip existed.
So it is possible that we'll end up with a quantum computer that doesn't scale, at least for a while. There might be 20k physical qubits, and some code rate, and some number of logical quantum operations you can do on the logical qubits before they decay or you get bored, and very little ability to scale out to multiple computers that can split a computation between them. In that case, the code rate is a big deal.
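A back-of-the-envelope on why the code rate dominates in that scenario (the ~2*d^2 overhead is the usual rough surface-code estimate of physical qubits per logical qubit at code distance d; the distances are just examples I picked):

    # Rough surface-code overhead: ~2*d^2 physical qubits per logical qubit
    # at code distance d (data qubits plus measurement qubits).
    physical = 20_000
    for d in (9, 17, 25):
        per_logical = 2 * d * d
        print(f"d={d:2d}: ~{per_logical:4d} physical/logical -> "
              f"~{physical // per_logical} logical qubits")

Same hardware, roughly a factor of 8 difference in logical qubit count (123 vs 16 here), depending on how much distance -- that is, how low a logical error rate -- the computation demands.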
I would have thought a wide enough array of qubits could functionally do "anything" in one shot.
A well-studied example is that it's impossible to parallelize the steps in Grover's algorithm. To find a preimage amongst N possibilities, with only black box access, you need Ω(sqrt(N)) sequential steps on the quantum computer [1].
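For scale, Grover's optimal iteration count is about (π/4)·sqrt(N) sequential oracle calls; the numbers below are just that formula evaluated (my own illustration):

    import math

    # Grover needs ~ (pi/4) * sqrt(N) *sequential* oracle calls; the lower
    # bound says no amount of parallel hardware removes the dependency chain.
    def grover_iterations(n_bits):
        N = 2 ** n_bits
        return math.floor((math.pi / 4) * math.sqrt(N))

    for n in (32, 64, 128):
        print(f"{n}-bit search space: ~{grover_iterations(n):.3g} sequential steps")

So a 128-bit preimage search needs on the order of 10^19 dependent steps; splitting the work across k machines only reduces each machine's chain to roughly sqrt(N/k).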
Another well-known case is that there's no known way to execute a fault tolerant quantum circuit faster than its reaction depth (other than finding a rewrite that reduces the depth, such as replacing a ripple carry adder with a carry lookahead adder) [2]. There's no known way to make the reaction depth small in general.
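To put toy numbers on the adder example (depth counted in dependent carry stages; real circuits have constants I'm ignoring):

    import math

    # Depth = longest chain of dependent carry computations, which is what
    # the fault-tolerant machine must execute serially at reaction-time speed.
    def ripple_carry_depth(n):     # bit i's carry waits on bit i-1's carry
        return n

    def carry_lookahead_depth(n):  # parallel-prefix carries: logarithmic chain
        return max(1, math.ceil(math.log2(n)))

    for n in (64, 1024, 4096):
        print(f"{n}-bit add: ripple {ripple_carry_depth(n)}, "
              f"lookahead {carry_lookahead_depth(n)}")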
Another example is GCD (greatest common divisor). It's conjectured to be an inherently sequential problem (no polylog depth classical circuit) and there's no known quantum circuit for GCD with lower depth than the classical circuits.
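The sequential structure is easy to see in Euclid's algorithm itself (ordinary classical code; each iteration's inputs are the previous iteration's outputs):

    # Euclid's GCD: step i cannot begin until step i-1 has produced its
    # remainder, and the chain can be ~linear in the bit length of the inputs.
    def gcd(a, b):
        while b:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))  # 21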
https://www.lesswrong.com/posts/tjH8XPxAnr6JRbh7k/hard-takeo...