We're Entering a Golden Era of Quantum Computing Research (asmarterplanet.com)
68 points by jonbaer on April 29, 2015 | 19 comments


As an amateur trying to read up on quantum computing, I can say these papers are very accessible. The first is the talk, referred to in the article, that launched the field, and the second starts with a great literature review:

Feynman, Richard. "Simulating Physics with Computers," International Journal of Theoretical Physics, vol. 21, Nos. 6/7, 1982.

Shor, Peter W. "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer." 1994.

I'm also nearly done with Scott Aaronson's Quantum Computing Since Democritus, which is a popularizing treatment derived from graduate-level lectures. It is demanding but super fun.

So one risk to quantum computing I wonder about is error correction. There have been lots of proposed classical machines that purport to give the same exponential-to-polynomial speedup, but on closer inspection they just "hide" the exponential cost, e.g., in the need for exponentially increasing instrument accuracy. Are there any experts here who can give a reason why error correction in quantum computing will scale polynomially or better? It seems like if we are hiding the exponential in quantum computers somewhere, that's where it will be.

EDIT: I think there are practical risks also, e.g. superconductors that only operate at near-absolute-zero, but I'm asking about a theoretical risk that just kills the idea.


> Are there any experts here who can give a reason why error correction in quantum computing will scale polynomially or better?

Because it takes a finite number of Toffoli/Hadamard/whatever gates to error-correct local errors.

That is to say, it's like check (or ECC) bits in traditional RAM: a couple of extra transistors per memory location. Error-correcting codes in general tend to be very lightweight and can be made to work locally, and the quantum world is the same.
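For intuition on the classical side, here's a minimal Python sketch of a Hamming(7,4) code (my illustration, not anything from the comment above): 3 check bits protect 4 data bits and correct any single bit flip, so the overhead is a constant factor per block.

    # Hamming(7,4): codeword layout [p1, p2, d1, p3, d2, d3, d4],
    # with parity bits at positions 1, 2, 4 (1-indexed).
    def encode(d):                     # d = [d1, d2, d3, d4]
        p1 = d[0] ^ d[1] ^ d[3]        # covers positions 1, 3, 5, 7
        p2 = d[0] ^ d[2] ^ d[3]        # covers positions 2, 3, 6, 7
        p3 = d[1] ^ d[2] ^ d[3]        # covers positions 4, 5, 6, 7
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]

    def decode(c):                     # c = 7 received bits
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        pos = s1 + 2 * s2 + 4 * s3     # syndrome = 1-indexed error position
        if pos:
            c[pos - 1] ^= 1            # flip the corrupted bit back
        return [c[2], c[4], c[5], c[6]]

    word = encode([1, 0, 1, 1])
    word[4] ^= 1                       # corrupt one bit in transit
    assert decode(word) == [1, 0, 1, 1]

Quantum codes can't copy state outright (no-cloning), but the same syndrome-measurement idea carries over, with the overhead still a constant factor per encoded block.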


> Are there any experts here who can give a reason why error correction in quantum computing will scale polynomially or better?

I'm not sure what kind of scaling you mean in this context, but let me try to answer this.

I think the most important result in error correction is the threshold theorem, which (roughly) says that there exists an N such that if you have qubits that are basically error-free for N gate operations, you can implement fault-tolerant quantum computation. The particular threshold N depends on the specifics of the architecture and error encoding that you use. What also depends on the encoding is the number M of physical qubits per logical qubit.

As far as I know, once you've chosen an encoding, the number of physical qubits needed to realize Q logical qubits should just be MQ, so in that sense the scaling is linear. However, different encodings have wildly different thresholds. Shor's original code, which could handle bit-flip and phase errors, had M=9, I believe, but I don't think the threshold was very good. For a long time, a popular figure quoted for the N you needed for error correction was ~10^4.

I think there are state-of-the-art codes ("surface codes") that can get the required N down to 10^2 or so, which is close to what's achievable in systems with small numbers of superconducting qubits (which is what IBM and Google are looking at). But the number M of physical qubits per logical qubit for a surface code is very, very high, I think on the order of 10^5. (It's been a while. EDIT: I got curious and looked up numbers from a 2012 paper; 10^4 is a better number here. That's still three orders of magnitude more qubits than is typical in state-of-the-art circuits.) So in that sense, there is still a scaling problem.
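To make the scaling concrete, here's a back-of-envelope Python sketch using the commonly quoted surface-code rule of thumb that the logical error rate falls off like (p/p_th)^((d+1)/2) at code distance d, with roughly 2d^2 physical qubits per logical qubit. The 1e-15 target and the prefactors are my illustrative assumptions, not numbers from the 2012 paper:

    from math import ceil, log

    def distance_needed(p, p_th, p_target):
        # Smallest odd code distance d with (p/p_th)^((d+1)/2) <= p_target.
        d = ceil(2 * log(p_target) / log(p / p_th) - 1)
        return d if d % 2 else d + 1

    p_th, p_target = 1e-2, 1e-15          # assumed threshold and target
    for p in (1e-3, 5e-3):                # well below vs. near threshold
        d = distance_needed(p, p_th, p_target)
        print(p, d, 2 * d * d)            # 1e-3 -> d=29, ~1.7e3 qubits
                                          # 5e-3 -> d=99, ~2e4  qubits

Note how operating near the threshold (5e-3 against an assumed 1e-2 threshold) blows the overhead up to roughly the ~10^4 physical-per-logical figure quoted above.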


I've had a difficult time getting a real answer on this, and the HN crowd seems pretty well-educated, so I'll pose the question here: do we even have a high degree of certainty that quantum computing is possible? I thought that Bohmian mechanics was still on the table as a possible description of the physical world.


Bohmian mechanics is a non-local hidden-variable theory of quantum mechanics. Any result from traditional QM should hold in Bohmian mechanics, including the ability to perform quantum computations. We have a reasonable certainty that QM is possible thanks to a threshold theorem (http://arxiv.org/abs/quant-ph/9702029). Additionally, it is quite easy to run algorithms with a small number of qubits (5-10) that show the theory holds and should continue to hold as more qubits are added. It has become a scaling problem, which hopefully can be solved with time.


"We have a reasonable certainty that QM is possible", sorry I meant to say QC(Quantum Computing)


It depends on what you mean by possible.

We've run very tiny quantum programs already, such as factoring 15 into 3 and 5, but it's been difficult to scale this upwards and work on larger numbers of bits at a time.
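For flavor, here's a tiny classical stand-in for that computation (mine, purely illustrative): find the order r of a random base a mod N by brute force, which is exactly the step the quantum Fourier transform accelerates, then read the factors off gcd(a^(r/2) ± 1, N):

    from math import gcd
    from random import randrange

    def find_order(a, n):
        # Smallest r > 0 with a^r = 1 (mod n); brute force stands in
        # for the quantum period-finding subroutine.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical(n):
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:
                return d, n // d       # lucky guess shares a factor with n
            r = find_order(a, n)
            if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                p = gcd(pow(a, r // 2) - 1, n)
                if 1 < p < n:
                    return p, n // p

    print(shor_classical(15))          # (3, 5) or (5, 3)

The quantum win is entirely in find_order: classically it takes time exponential in the bit length of n, while the quantum subroutine does it in polynomial time.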

So, possible? Yes, for sure. Practical... not quite yet, maybe never, maybe soon.


We don't see why it's not possible, we're just not sure exactly what it's gonna look like. Martinis believes it's gonna be surface codes. My group believes it's gonna be cat codes. Or something. Who knows? It's a hugely difficult experimental physics problem, math problem, and engineering problem all balled up into one. Really cool time to be in this field.

And nobody I know really cares about Bohmian mechanics. Only outsiders seem to, since it seems edgy. I have some philosophical objections to it as well.


It sounds very cool. I'm just trying to figure out whether it's exciting the way computer engineering was in 1980, or more the way fusion energy was in 1960.


Could anyone hope to tell the difference at the time?


There is certainly no consensus among physicists that QC is possible. But I think at this point it is more about reducing the cost of a real implementation down to something manageable. (I.e., current schemes would cost tens of billions of dollars or more to build.)

And as for Bohmian Mechanics, I think this is just a re-interpretation of quantum physics, and not actually a competing theory.


If QC is not possible because of some new underlying principle (not a lack of engineering capability), then it would mean our understanding of QM is fundamentally wrong. This could be the case (we've been wrong in physics many times), even though QM has been verified to more digits of precision than any other physical theory.

In that case, trying to build quantum computers and diagnosing the errors is likely to give us the data necessary for the discovery of a new, better theory.

Now, if it is possible in principle but not practical to engineer, that's what we need a golden age of engineering research to find out.


I was interested in it in college, probably 10 years ago. Took courses, learned about quantum circuits, bras and kets, Bloch spheres (even wrote a 3D Java applet to represent one). It seemed like there was going to be a revolution based on quantum computing. So compared to the excitement and the hype, it hasn't quite happened.


Physics models of quantum phenomena are fantastic and interesting, but there can be a bit of a lag between the theoretical developments underpinning a technology and the materials science and general technological advances that make those developments real.


Anyway, quantum computing seems to be much closer than fusion: a few years away vs. 30. (Personally, though, I'm betting on fusion, since we have experimental verification of fusion, whereas quantum superposition is supposedly confirmed by Bell-inequality experiments; not having the equipment myself, I've looked into the papers on those experiments, and I see pretty big holes in them, and thus "classical" interpretations of the results.)


Yeah, I think one can check the status of quantum computing every 5-10 years and still stay informed. The field has not taken any huge leaps.


The thresholds needed to achieve quantum error correction are extremely close to being achieved nowadays (props in particular to Martinis's group, and to Google for hiring him). It appears that D-Wave is doing something quantum, though nobody knows what. Furthermore, DARPA's and the Obama administration's initiatives to get probabilistic programming adopted by industry will create both software that can run natively on quantum computers and programmers who are able to think in the terms necessary for writing quantum software. Lockheed Martin-funded research has discovered algorithms for quantum computers that could (not without technical challenges) solve the systems of equations needed to create fighter-jet stealth profiles that are an order of magnitude more effective.

The trajectory is looking pretty good.


> The thresholds needed to achieve quantum error correction are extremely close to being achieved nowadays

Yes, they are quite close to surface code thresholds. However, being close to the thresholds means that you need to be on the pessimistic side of how many physical qubits you need per logical qubit. As I mentioned in my other comment, even Martinis (whose group is, as you say, very good) is still several orders of magnitude short in terms of the number of physical qubits needed to implement one logical qubit, let alone the hundreds to thousands of logical qubits necessary to do interesting computations.

Packing more qubits in seems to me like it's going to be a very challenging problem. Looking at Martinis's recent Nature paper, 9 qubits are taking up something like 2 x 4 mm. Making these smaller would be nontrivial for many reasons: they each have their own microwave coupling line (which requires a certain amount of length); the coherence properties of superconducting qubits seem to care about how much surface you have relative to bulk metal, which is a loser for shrinking the devices; presumably jamming them together presents crosstalk issues; etc. Realistically, you also need to add a bunch of other types of electronics down there to handle multiplexing a la D-Wave. I don't know that these obstacles are insuperable, but I'm definitely a little skeptical that this road leads to useful technology.
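To put a rough number on the packing problem (my arithmetic, assuming a target of 100 logical qubits):

    area_per_qubit_mm2 = (2 * 4) / 9        # 9 qubits in 2 x 4 mm
    physical_per_logical = 1e4              # figure from my other comment
    logical_qubits = 100                    # assumption for illustration
    total_mm2 = area_per_qubit_mm2 * physical_per_logical * logical_qubits
    print(total_mm2 / 1e6, "m^2")           # ~0.9 m^2 at today's density

That's roughly a square meter of coherent superconducting circuitry at current densities, which is why shrinking the qubits (with all the attendant problems above) seems unavoidable.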


Will OllyDbg still work on those CPUs? ;)



