Google plans to reach a Quantum Computing milestone before the year is out

technologyreview.com

147 points by recurser 9 years ago · 51 comments
mark_l_watson 9 years ago

I was excited to read this article up to the last sentence, a quote from the program manager: “We’re trying to get support within Google, and this experiment has been very good to get other engineers talking to us.” To me, this statement negates much of the positive spin of the article. That said, the title does say 'stepping stone.'

My primary professional interest is AI and quantum computing, so pushing the field forward seems almost inevitable.

  • nullnilvoid 9 years ago

    I did not bother to read the article. The title looks suspicious to me. You don't plan a scientific breakthrough ahead of time. Instead, you work hard for a long time and all of a sudden you hit that milestone. It is unpredictable and certainly cannot be planned easily.

    • 21 9 years ago

      The LHC was planned 25 years before it was a reality. In fact, one of the first pictures put on the WWW when it was created at CERN was a schematic of the LHC:

      https://home.cern/sites/home.web.cern.ch/files/image/update-...

      • delecti 9 years ago

        The LHC wasn't a scientific milestone, it was an engineering one. Using the LHC has led to milestones.

        • mr_overalls 9 years ago

          In many fields, for the past several decades, scientific progress has been inextricably bound with technical progress. New tools yield new experiments/discoveries.

        • 21 9 years ago

          But this article also talks about an engineering milestone; there is no scientific milestone in going from a 6-qubit computer to a 49-qubit one.

    • wzeng 9 years ago

      Thinking about this demonstration as an experiment could be a little misleading. It's somewhere between an experiment and a benchmark. Nobody has any reason to expect new physics between 10 and 50 qubits, but actually building and controlling the system is complicated. Quantum computing is in an interesting place right now where engineering milestones and scientific milestones are happening together because the technology is so fundamental.

      That doesn't happen very often, and it's one of the most exciting things about being in the field. If that interests you, then you might want to come work at Rigetti Computing: "The Tiny Startup Racing Google to Build a Quantum Computing Chip" https://www.technologyreview.com/s/600711/the-tiny-startup-r... We're hiring.

    • 13of40 9 years ago

      I get the impression that quantum computing is already at a point where they understand the science, and the challenge now is how to implement it in a useful and viable way. If that's the case it's fine to say we can't manufacture it now but we'll be there in X months, give or take.

  • RushAndAPush 9 years ago

    I think a lot of computer scientists have a hard time believing quantum computers are possible. Not saying I agree, but I've heard this viewpoint a number of times.

jboggan 9 years ago

I'm happy that this project is coming along. I never had any contact with the physical/chip team but I did attempt to contribute something to the theory group in a 20% project. It was absolutely the best and most challenging work I had during my stint at Google, and I can definitely say the quantum theory group were the brightest minds I met there.

I think they could be on to something with their approaches, both physical and theoretical. I could definitely see a QPU in a few years serving a role in Google Cloud for specific computations. But I don't think the groups are currently getting the headcount and engineering power they need. The worst bottleneck I saw on the theoretical side was a lack of fulltime engineers to test and implement the researchers' awesome ideas. I'm hoping that gets more attention and resources, because I think Google might actually be sitting on another goldmine here.

johncolanduoni 9 years ago

If you want to read the paper laying out the plan the article is referring to, it's here: https://arxiv.org/abs/1608.00263

smichel17 9 years ago

When "reach a milestone in computing history" is written 4 times before getting to the content (hn title, article title, article summary, first sentence of article), the article feels a lot like click bait, which really discourages me from wanting to give it any of my time. Anyone want to write a tldr?

  • johncolanduoni 9 years ago

    Martinis' group has laid out a plan to build a 49 qubit quantum computer that can solve a problem that is intractable for classical supercomputers with any known algorithm. If they pull it off we'll have the first provisional evidence that we can build quantum computers that can solve a problem classical computers cannot in a reasonable amount of time.

    Caveats: it is possible there exists a superior classical algorithm that can solve the problem efficiently enough, and that we just haven't found it yet. It is also still within the realm of possibility that there exists an algorithm that can simulate any quantum computer with only polynomial slowdown, which would show quantum computers are not drastically faster for any problem.

mck- 9 years ago

I'm surprised there is no mention of D-Wave systems - the quantum computer that Google bought a while ago [1] and how/why they decided to build it themselves from scratch.

[1] https://www.quora.com/How-much-did-Google-pay-for-the-quantu...

frik 9 years ago

Wake me up when it's ready. For two decades we have heard announcements that Quantum Computers are almost ready.

Yet besides bluffs and prototypes that require superconductors cooled to around -230 °C to do very little, none of them resurfaced later.

Quantum Computers can be exciting and devastating. What happens if the first company/state keeps it a trade secret and uses Quantum Computers without telling the rest of the world? (what if...) With Quantum Computers all the "secure digital stuff" of today can be broken, so say goodbye to your cloud online security. A good/bad actor can read everything. Shouldn't we think about how to mitigate this issue?

  • nradov 9 years ago

    Plenty of people have been thinking about it for years and developing encryption algorithms which should remain secure even if someone manages to build a working quantum computer.

    https://en.m.wikipedia.org/wiki/Post-quantum_cryptography

    • frik 9 years ago

      Good link. Nevertheless, HTTPS/HTTP2/TLS/bcrypt/MD5/SHA-256/AES/etc. need an upgrade or replacement to be ready for day X, so the internet isn't really prepared yet. Also, mind that all the previously leaked encrypted data harvested by bad actors is already sitting in archives, waiting to become easily readable on day Y.

      • wbl 9 years ago

        There is a plan. There already are postquantum encryption schemes (and drafts to use them), and we can wait with signatures for a while. NIST will be making recommendations soon.

skdotdan 9 years ago

Are quantum chips supposed to replace the current CPUs and GPUs? Or are they supposed to be "just" another component that we connect to the CPU, the same way we connect GPUs?

  • lvh 9 years ago

    Almost certainly the latter. QC is very good at highly specific jobs, like factoring or searching. As with GPUs, getting the data from the CPU to the device is costly; for now, and presumably for a very long time, QC has that problem but far worse. (And it wouldn't give you speedups yet even if the transfer were free -- but Martinis asserts that won't take long.)

    • justinpombrio 9 years ago

      Going a little further:

      For a very select set of problems (factoring and discrete log), quantum computers are exponentially faster than classical computers. For a few others (including NP-complete problems), they are quadratically faster. For everything else, they're no faster. (When I say "faster", I really mean the runtime of the best known quantum algorithms is better.)

      For the foreseeable future, quantum computers will be much smaller than classical computers -- the article is about Google building a 49-qubit QC and how that would be a breakthrough. So for the foreseeable future, they'll be separate components, used for special cases.

      • lvh 9 years ago

        A little further: factoring and discrete log aren't a complete set (and the set depends on algorithm development). There are a few academic problems that also get exponential speedups, and, generalizing factoring and discrete log, there are hidden subgroup problems (which are what killed non-supersingular isogeny Diffie-Hellman as a post-quantum key exchange).

  • manyoso 9 years ago

    We are so far from talking about this that your question is nearly meaningless. Quantum computing has not even been demonstrated yet. This "quantum supremacy" is for a highly contrived algorithm and setup that has very few, if any, real-world use cases.

    A generalized quantum computer able to run standard computing algorithms is very far in the future and so much basic research in computing science has to happen before it can be talked about meaningfully.

    • marcosdumay 9 years ago

      > Quantum computing has not even been demonstrated yet.

      That's quite an uninformed comment. Anybody thinking this can start at Wikipedia:

      https://en.wikipedia.org/wiki/Quantum_computing#Timeline

      That list has fully functional computers with up to 4 qubits, with results to show.

    • petra 9 years ago

      What about quantum machine learning? Assuming quantum computers become powerful tomorrow, won't they be useful in that field?

      • kneel 9 years ago

        Depends on what you mean by powerful and how that can translate to a meaningful calculation.

    • typon 9 years ago

      Pretty sure the only thing quantum computers are better at than classical computers currently is simulating quantum mechanics

      • johncolanduoni 9 years ago

        We actually have a few examples of an absolute speed up for quantum computers, though none of them are exponential and some of them are problems nobody cares to solve. For example, Grover's algorithm is superior (O(sqrt(n)) vs O(n)) to the best possible classical algorithm for the same problem.

        If you are willing to accept comparisons of best known classical algorithms to best known quantum algorithms (without a guarantee that the existing algorithms are ideal) you can add others like factoring to the list, with exponential speed ups.
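        A toy classical simulation makes the O(sqrt(n))-vs-O(n) gap concrete. This is a sketch of my own, not anything from the article or Google's setup: it tracks the full statevector (so it's classically exponential in qubits), but the marked item is found after only ~(pi/4)*sqrt(N) oracle calls instead of the O(N) a linear scan needs.

```python
# Minimal statevector simulation of Grover's search.
# We search an unsorted space of N = 2**n_qubits items for one marked
# index using roughly (pi/4) * sqrt(N) iterations.
import math
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    N = 2 ** n_qubits
    # Start in the uniform superposition over all N basis states.
    state = np.full(N, 1 / math.sqrt(N))
    iterations = round(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked state's amplitude.
        state[marked] *= -1
        # Diffusion: reflect every amplitude about the mean amplitude.
        state = 2 * state.mean() - state
    # "Measure": the marked state now carries almost all probability.
    return int(np.argmax(state ** 2))

print(grover_search(8, marked=199))  # → 199 (13 iterations for N=256)
```

        Note the oracle is queried only 13 times for N = 256, versus up to 256 probes classically; that square-root saving is the whole speedup.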

  • waynecochran 9 years ago

    There are only a handful of useful algorithms (currently) that exploit quantum computers (see https://en.wikipedia.org/wiki/Quantum_algorithm). Shor's algorithm for integer factorization is the one that would radically change the current crypto situation (goodbye RSA).
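    For illustration, here is the classical scaffolding around Shor's algorithm, with the order-finding step brute-forced. The quantum Fourier transform only accelerates that one step; everything else (the reduction from factoring to order finding) is ordinary number theory. Function names are my own, and the brute-force `order` is exponential, so this is a sketch of the reduction, not a fast factoring routine.

```python
# Factoring N reduces to finding the multiplicative order r of a random
# base a (mod N). A quantum computer finds r quickly via the QFT; here
# we find it by brute force to show how the reduction works.
import math
import random

def order(a: int, N: int) -> int:
    # Smallest r > 0 with a**r = 1 (mod N); assumes gcd(a, N) == 1.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N: int) -> int:
    # Returns a nontrivial factor of an odd composite N (not a prime power).
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g               # lucky draw: a already shares a factor
        r = order(a, N)            # <-- the step Shor's algorithm speeds up
        if r % 2 == 0:
            y = pow(a, r // 2, N)  # y*y = 1 (mod N), y != 1
            if y != N - 1:
                # N divides (y-1)(y+1) but neither factor alone,
                # so gcd(y+1, N) is a nontrivial factor.
                return math.gcd(y + 1, N)
        # odd r, or y = -1 (mod N): retry with a fresh base a

print(shor_factor(15))  # prints 3 or 5
```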

  • aquarin 9 years ago

    QPU perhaps?

apexalpha 9 years ago

Ah, now I can finally digitally simultaneously love and hate Google.

indolering 9 years ago

This is still just an annealer, right? Way cool, just nothing beyond what DWave is offering.

  • greeneggs 9 years ago

    No, it is more than an annealer. I think they are aiming for a 7 x 7 lattice of qubits with nearest-neighbor two-qubit gates and one-qubit operations that are enough for computational universality.

    (The noise rates will still be relatively high, and they might not be able to measure [and reinitialize] qubits at intermediate steps, feeding results back adaptively. Practically, that would limit using fault tolerance.)

  • manyoso 9 years ago

    No, this is the real thing they are trying to achieve.

protomyth 9 years ago

Is there a definitive book explaining Quantum Computing from the ground up?

  • wzeng 9 years ago

    The best book out there is still Quantum Computation and Quantum Information by Nielsen and Chuang. It isn't quite current, as it lacks both the advances in hardware (superconducting qubits) and algorithms (quantum/classical hybrids & the sampling benchmark that this article is talking about). It's still the best way to get started as it introduces everything from the linear algebra all the way up.

    I work in quantum computing and it's the book I always recommend.

  • farresito 9 years ago

    I have heard good things about Quantum Computing Since Democritus, by Scott Aaronson. I'm not sure how rigorous it is, though.

  • Valodim 9 years ago

    I found this blog post very informative. Obviously it doesn't have the content of a book, but it conveys a good first intuition of what quantum computing can and cannot do.

    http://twistedoakstudios.com/blog/Post2644_grovers-quantum-s...

  • waynecochran 9 years ago

    "Algorithms" by Sanjoy Dasgupta has a fantastic description of Shor's algorithm that uses a quantum Fourier transform to factor integers. I love this book. A great place to start.

sanguy 9 years ago

All wasted on finding the best advert to match to your search terms....

faragon 9 years ago

I hope this doesn't end up as a "quantum bubble".
