Google plans to reach a Quantum Computing milestone before the year is out (technologyreview.com)

I was excited to read this article up to the last sentence, a quote from the program manager: “We’re trying to get support within Google, and this experiment has been very good to get other engineers talking to us.” To me, this statement negates much of the positive spin of the article. That said, the title does say 'stepping stone.'
My primary professional interest is AI, so quantum computing pushing the field forward seems almost inevitable to me.
I did not bother to read the article. The title looks suspicious to me. For a scientific breakthrough, you don't plan it ahead. Instead, you work hard for a long time and all of a sudden you hit that milestone. It is unpredictable and certainly cannot be planned easily.
The LHC was planned 25 years before it was a reality. In fact, one of the first pictures put on the WWW when it was created at CERN was a schematic of the LHC:
https://home.cern/sites/home.web.cern.ch/files/image/update-...
The LHC wasn't a scientific milestone, it was an engineering one. Using the LHC has led to milestones.
In many fields, for the past several decades, scientific progress has been inextricably bound with technical progress. New tools yield new experiments/discoveries.
But this article also talks about an engineering milestone; there is no scientific milestone in going from a 6 qubit computer to a 49 qubit one.
Thinking about this demonstration as an experiment could be a little misleading. It's somewhere between an experiment and a benchmark. Nobody has any reason to expect new physics between 10 and 50 qubits, but actually building and controlling the system is complicated. Quantum computing is in an interesting place right now where engineering milestones and scientific milestones are happening together because the technology is so fundamental.
That doesn't happen that often and it's one of the most exciting things about being in the field. If that interests you, then you might want to come work at Rigetti Computing: "The Tiny Startup Racing Google to Build a Quantum Computing Chip" https://www.technologyreview.com/s/600711/the-tiny-startup-r... We're hiring.
I get the impression that quantum computing is already at a point where they understand the science, and the challenge now is how to implement it in a useful and viable way. If that's the case it's fine to say we can't manufacture it now but we'll be there in X months, give or take.
It's been in that state for many years at this point.
I think a lot of computer scientists have a hard time believing quantum computers are possible. Not saying I agree, but I've heard this viewpoint a number of times.
Like who?
HN, slashdot, reddit, etc. etc :)
I'm happy that this project is coming along. I never had any contact with the physical/chip team but I did attempt to contribute something to the theory group in a 20% project. It was absolutely the best and most challenging work I had during my stint at Google, and I can definitely say the quantum theory group were the brightest minds I met there.
I think they could be on to something with their approaches, both physical and theoretical. I could definitely see a QPU in a few years serving a role in Google Cloud for specific computations. But I don't think the groups are currently getting the headcount and engineering power they need; the worst bottleneck I saw on the theoretical side was a lack of full-time engineers to test and implement the researchers' awesome ideas. I'm hoping that gets more attention and resources, because I think Google might actually be sitting on another goldmine here.
If you want to read the paper laying out the plan the article is referring to, it's here: https://arxiv.org/abs/1608.00263
When "reach a milestone in computing history" is written 4 times before getting to the content (hn title, article title, article summary, first sentence of article), the article feels a lot like click bait, which really discourages me from wanting to give it any of my time. Anyone want to write a tldr?
Martinis' group has laid out a plan to build a 49 qubit quantum computer that can solve a problem that is intractable for classical supercomputers with any known algorithm. If they pull it off we'll have the first provisional evidence that we can build quantum computers that can solve a problem classical computers cannot in a reasonable amount of time.
Caveats: it is possible there exists a superior classical algorithm that can solve the problem efficiently enough, and that we just haven't found it yet. It is also still within the realm of possibility that there exists an algorithm that can simulate any quantum computer with only polynomial slowdown, which would show quantum computers are not drastically faster for any problem.
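To put a rough number on "intractable": brute-force classical simulation stores one complex amplitude per basis state, so memory doubles with every qubit. A quick back-of-the-envelope sketch in Python (the 16 bytes/amplitude figure assumes complex128 precision; purely illustrative, not the paper's method):

    # Full state-vector simulation stores 2**n complex amplitudes,
    # at 16 bytes each (complex128). Memory doubles with every qubit.
    for n in (10, 20, 30, 40, 49):
        size = float(2 ** n * 16)
        for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
            if size < 1024:
                break
            size /= 1024
        print(f"{n:2d} qubits -> {size:g} {unit}")
    # 49 qubits -> 8 PiB, more RAM than any classical supercomputer has.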
I'm surprised there is no mention of D-Wave Systems - the quantum computer that Google bought a while ago [1] - or of how/why they decided to build one themselves from scratch.
[1] https://www.quora.com/How-much-did-Google-pay-for-the-quantu...
The D-Wave is not a quantum computer in the meaning that term usually has.
I'm under the impression that Google determined there was nothing quantum about it but is unable to speak up due to a forest of NDA's.
TLDR: DWave doesn't provide a speed-up over classical computers.
Maybe because IIRC D-Wave is still unproven technology.
Wake me up when it's ready. For two decades we have heard announcements that quantum computers are almost ready.
Yet besides bluffs and prototypes that need superconductors cooled to a very cold -230 °C in order to do very little, none of them has resurfaced later.
Quantum computers can be exciting and devastating. What happens if the first company/state keeps it a trade secret and uses quantum computers without telling the rest of the world? (what if...) With quantum computers, all the "secure digital stuff" of today can be broken; say goodbye to your cloud online security. A good/bad actor can read everything. We should think about how to mitigate this issue, shouldn't we?
Plenty of people have been thinking about it for years and developing encryption algorithms which should remain secure even if someone manages to build a working quantum computer.
Good link. Nevertheless, HTTPS/HTTP2/TLS/bcrypt/MD5/SHA-256/AES/etc. need an upgrade or replacement to be ready for day X, so the internet isn't really prepared yet. Also mind that all the previously leaked encrypted data has already been harvested by bad actors and is waiting in archives to be easily readable on day Y.
There is a plan. There already are post-quantum encryption schemes (and drafts to use them), and we can wait a while longer on signatures. NIST will be making recommendations soon.
Are quantum chips supposed to replace the current CPUs and GPUs? Or are they supposed to be "just" another component that we connect to the CPU, the same way we connect GPUs?
Almost certainly the latter. QC is very good at highly specific jobs, like factoring or searching. Similarly to GPUs, getting the data from the CPU to the GPU is costly; for now, and presumably for a very long time, QC has that problem but way worse. (And it wouldn't give you speed-ups yet even if the transfer were free -- but Martinis asserts that won't take long.)
Going a little further:
For a very select set of problems (factoring and discrete log), quantum computers are exponentially faster than classical computers. For a few others (including NP-complete problems), they are quadratically faster. For everything else, they're no faster. (When I say "faster", I really mean the runtime of the best known quantum algorithms is better.)
For the foreseeable future, quantum computers will be much smaller than classical computers -- the article is about Google building a 49-qubit QC and how that would be a breakthrough. So for the foreseeable future, they'll be separate components, used for special cases.
A little further: factoring and discrete log aren't a complete list (and it also depends on algorithm development). There are a few academic problems that are also exponentially faster, and, generalizing factoring and discrete log, there are the hidden subgroup problems (which are what killed non-supersingular isogeny Diffie-Hellman as a post-quantum key exchange).
We are so far from talking about this that your question is nearly meaningless. Quantum computing has not even been demonstrated yet. This "quantum supremacy" is for a highly contrived algorithm and setup that has very few, if any, real-world use cases.
A generalized quantum computer able to run standard computing algorithms is very far in the future, and so much basic research in computer science has to happen before it can be talked about meaningfully.
> Quantum computing has not even been demonstrated yet.
That's quite an uninformed comment. Anybody thinking this can start at Wikipedia:
https://en.wikipedia.org/wiki/Quantum_computing#Timeline
That list has completely functional computers with up to 4 qubits, with results to show.
What about quantum machine learning? Assuming quantum computers become powerful tomorrow, won't they be useful in that field?
Depends on what you mean by powerful and how that can translate to a meaningful calculation.
Pretty sure the only thing quantum computers are currently better at than classical computers is simulating quantum mechanics.
We actually have a few examples of an absolute speed-up for quantum computers, though none of them is exponential and some of them are problems nobody cares to solve. For example, Grover's algorithm is superior (O(sqrt(n)) vs O(n)) to the best possible classical algorithm for the same problem.
If you are willing to accept comparisons of best known classical algorithms to best known quantum algorithms (without a guarantee that the existing algorithms are ideal), you can add others like factoring to the list, with exponential speed-ups.
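If anyone wants to see the quadratic speed-up concretely, here is a minimal state-vector simulation of Grover's algorithm in Python/numpy. It's a toy sketch of the textbook algorithm (3 qubits and the marked index are arbitrary choices), not production code:

    import numpy as np

    n = 3                     # number of qubits
    N = 2 ** n                # size of the search space
    marked = 5                # arbitrary index of the "needle"

    # Start in the uniform superposition over all N basis states.
    state = np.full(N, 1 / np.sqrt(N))

    # Oracle: flip the sign of the marked item's amplitude.
    oracle = np.eye(N)
    oracle[marked, marked] = -1

    # Diffusion operator: reflect all amplitudes about their mean.
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

    # ~ pi/4 * sqrt(N) iterations, vs ~N classical queries.
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)

    print(f"{iterations} iterations; P(marked) = {state[marked] ** 2:.3f}")

After just 2 iterations here the marked item is measured with probability ~0.945, whereas a classical search expects N/2 = 4 queries; the gap is what grows as sqrt(N) vs N.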
There are only a handful of useful algorithms (currently) that exploit quantum computers (see https://en.wikipedia.org/wiki/Quantum_algorithm). Shor's algorithm for integer factorization is the one that would radically change the current crypto situation (goodbye, RSA).
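To sketch why Shor's algorithm is the crypto-killer: the quantum part only finds the period r of a^x mod N, and turning that period into factors is classical number theory. A toy version with the period found by brute force (N = 15 and a = 7 are just illustrative values):

    from math import gcd

    N = 15        # toy number to factor
    a = 7         # base with gcd(a, N) == 1

    # Brute-force the period r: smallest r > 0 with a^r = 1 (mod N).
    # This is the only step a quantum computer speeds up (via the QFT).
    r = 1
    while pow(a, r, N) != 1:
        r += 1

    # If r is even and a^(r/2) != -1 (mod N), gcds give nontrivial factors.
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        print(gcd(pow(a, r // 2) - 1, N), "x", gcd(pow(a, r // 2) + 1, N))
    # -> 3 x 5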
QPU perhaps?
Ah, now I can finally digitally simultaneously love and hate Google.
Just don't open the box, and we will all be fine.
This is still just an annealer, right? Way cool, just nothing beyond what DWave is offering.
No, it is more than an annealer. I think they are aiming for a 7 x 7 lattice of qubits with nearest-neighbor two-qubit gates and one-qubit operations that are enough for computational universality.
(The noise rates will still be relatively high, and they might not be able to measure [and reinitialize] qubits at intermediate steps, feeding results back adaptively. Practically, that would limit using fault tolerance.)
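For a concrete picture of that layout, here's a tiny sketch enumerating the couplings of a 7 x 7 nearest-neighbor lattice (row-major qubit indexing is my assumption, not necessarily theirs):

    # 7x7 grid of qubits, indexed row-major; two-qubit gates act on edges.
    SIZE = 7
    edges = []
    for row in range(SIZE):
        for col in range(SIZE):
            q = row * SIZE + col
            if col + 1 < SIZE:
                edges.append((q, q + 1))     # right neighbor
            if row + 1 < SIZE:
                edges.append((q, q + SIZE))  # down neighbor

    print(f"{SIZE * SIZE} qubits, {len(edges)} nearest-neighbor couplings")
    # -> 49 qubits, 84 nearest-neighbor couplings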
No, this is the real thing they are trying to achieve.
Is there a definitive book explaining Quantum Computing from the ground up?
The best book out there is still Quantum Computation and Quantum Information by Nielsen and Chuang. It isn't quite current, as it lacks both the advances in hardware (superconducting qubits) and algorithms (quantum/classical hybrids & the sampling benchmark that this article is talking about). It's still the best way to get started as it introduces everything from the linear algebra all the way up.
I work in quantum computing and it's the book I always recommend.
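For a taste of the linear algebra the book builds up from: a qubit state is a 2-vector of amplitudes, a gate is a unitary matrix, and measurement probabilities are the squared magnitudes. A minimal numpy example applying a Hadamard gate to |0>:

    import numpy as np

    ket0 = np.array([1.0, 0.0])                    # |0> as a 2-vector
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate (unitary)

    state = H @ ket0
    print(state)               # [0.707 0.707] -- equal superposition
    print(np.abs(state) ** 2)  # [0.5 0.5] -- measurement probabilities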
I have heard good things about Quantum Computing Since Democritus, by Scott Aaronson. I'm not sure how rigorous it is, though.
I found this blog post very informative. Obviously it doesn't have the content of a book, but it conveys a good first intuition of what quantum computing can and cannot do.
http://twistedoakstudios.com/blog/Post2644_grovers-quantum-s...
"Algorithms" by Sanjoy Dasgupta has a fantastic description of Shor's algorithm that uses a quantum Fourier transform to factor integers. I love this book. A great place to start.
All wasted on finding the best advert to match your search terms...
I hope this doesn't end up as a "quantum bubble".