A new quantum algorithm for classical mechanics with an exponential speedup

blog.research.google

123 points by ernesto95 2 years ago · 66 comments

abdullahkhalids 2 years ago

One of the most important insights you take away from a physics undergrad is that you can model many physical phenomena as harmonic oscillators. The reason for this is quite simple:

1. Every closed system has a fixed total energy, so many systems just settle into an oscillating state, where kinetic energy converts into potential and back.

2. Most real world systems are approximately closed, so they leak energy till they have low total energy (this also follows from the second law).

3. An oscillating system with low total energy can have its potential energy accurately approximated by a quadratic function; in other words, it is a harmonic oscillator.

So, while I can't say if there are many interesting/useful coupled classical oscillator systems that need an exponential speedup for us to study, it is nevertheless exciting to hear that such systems do admit a quantum speedup.

  • pa7x1 2 years ago

    This is overly complicated. The reason harmonic oscillators pop up everywhere is even simpler and more general than that.

    It models first order perturbations over a stable equilibrium. For sufficiently small perturbations around a stable equilibrium everything is an harmonic oscillator.

    It's basically taking the first order perturbation of a Taylor expansion around a local minima.

    • robertlagrant 2 years ago

      > It's basically taking the first order perturbation of a Taylor expansion around a local minima.

      I'm not sure that's less complicated, at least at my level of understanding.

      • arbitrandomuser 2 years ago

        > > It's basically taking the first order perturbation of a Taylor expansion around a local minima.

        > I'm not sure that's less complicated, at least at my level of understanding.

        You know Taylor expansions? Any complicated function of x can be written as some expansion in infinite powers of x: a0 + a1 (x) + a2 (x^2) ... and so on. Any complicated force can then be written as such a function of the displacement x. Now, if x is small, ignore the square and higher-order terms, and what you get is the linear part, which looks something like F = a1x + a0. The a0 part only shifts the default position (you can rewrite x as x - c so that a0 goes to zero), and lo and behold you have F = Kx: everything is like your familiar Hooke's-law spring, as long as x is small enough.

        • ridiculous_fish 2 years ago

          This is on the right track but not entirely right. In your F = a1x + a0, here a0 is a force, not a position, so you cannot neglect it on the basis of "default position." Instead you set it to zero because the system is at a stable equilibrium.

          Here's the presentation I've seen. Usually we like to work in potentials, not forces, because potentials are nice scalar functions, while force is an ugly vector function. So say you have a potential V which is at a local minimum at position x.

          Expand the potential at x for a small displacement dx: V(x + dx). This gives us the Taylor series V(x+dx) = V(x) + V'(x) dx + (1/2) V''(x) dx^2 + (1/6) V'''(x) dx^3 + ...

          We can neglect V(x) since it's just a constant, and adding a constant to the potential does not affect the physics. And (the crux) we can neglect V'(x) because the potential is at a minimum, so the derivative is zero.

          That leaves the quadratic and higher-order terms. Neglecting the higher order terms on the basis that dx is small, we get the harmonic potential, or Hooke's Law in the language of forces.
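The derivation above is easy to check symbolically. A minimal sketch with sympy, using the pendulum potential V(x) = 1 - cos(x) as a hypothetical example (my own choice, not from the thread), which has a stable minimum at x = 0:

```python
import sympy as sp

x = sp.symbols('x')
V = 1 - sp.cos(x)  # example potential with a stable minimum at x = 0

# The linear term vanishes because x = 0 is a minimum of V.
assert sp.diff(V, x).subs(x, 0) == 0

# So the leading behavior is the harmonic term (1/2) V''(0) x^2.
k = sp.diff(V, x, 2).subs(x, 0)
harmonic = sp.Rational(1, 2) * k * x**2

# The full Taylor series agrees with the harmonic potential up to O(x^4).
series = sp.series(V, x, 0, 4).removeO()
assert sp.simplify(series - harmonic) == 0
```

Swapping in any other smooth potential with a stable minimum gives the same structure: the constant is irrelevant, the linear term dies at the minimum, and the quadratic term is what survives.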

      • thfuran 2 years ago

        Most things jiggle for a while if you poke them a bit?

        • godelski 2 years ago

          Most things jiggle even if you don't poke them. In fact, for them to not jiggle you'd have to be at absolute zero (not technically correct, but good enough. Damn you quantum and your jigglyness)

        • taneq 2 years ago

          Yep, pretty much. Everything's jiggling a bit; if you poke them they jiggle, if you poke them twice as hard they jiggle twice as much, and if you don't they gradually slow down.

    • abdullahkhalids 2 years ago

      That's what I said, without using the word Taylor expansion, as this audience might not necessarily know it. I also explained in points 1 and 2 why many systems can be modeled via small perturbations (i.e. Taylor approximation) around a stable equilibrium - they need to have low energy.

      Also, just a reminder that it is the second-order term in the Taylor expansion that is relevant for harmonic oscillators. Zeroth-order term (constant) does not determine the dynamics. The first-order term (linear) is zero only for low energies, as any potential well at sufficiently low energies will be symmetric. The second-order term (quadratic) is what provides the restoring force towards the equilibrium.

      • pa7x1 2 years ago

        The basic difference is that I'm pointing out something more fundamental, and it doesn't have to do with physics. It has to do with math, with calculus, and with approximating complex functions via perturbative expansions. This is very general and applies to other kinds of systems, not necessarily physical ones. That's the key insight.

        The first relevant term is quadratic in the potential around a stable equilibrium, but linear in the force. That's why they are called linear harmonic oscillators; I thought that was obvious in my description.

      • godelski 2 years ago

        Honestly I kinda laughed when I saw the response, because it reminded me of the classic nerd flexing (which happens especially when conveying something to laymen) where two nerds go at it saying the same thing, using heavier and heavier jargon as a means to flex (increasingly losing the laymen) rather than to clarify, or to admit that language requires inference. The whole show is ironically just a demonstration of accurate communication, or else one would not have been able to "um acktually" the other. (I think a lot of us have been guilty of this, often unintentionally, but we can still laugh at ourselves.)

    • lagrange77 2 years ago

      > It models first order perturbations over a stable equilibrium. For sufficiently small perturbations around a stable equilibrium everything is an harmonic oscillator.

      That's the same reason why we linearize nonlinear systems around the equilibria to apply linear control theory, right?

      While this makes sense to me in control, since the goal is often to stabilize the system, how does this help with modeling the whole system in general (far away from any equilibrium point)?

      • pa7x1 2 years ago

        Yes, near a stable equilibrium you can always linearize a system to model its behavior quite accurately for small perturbations. That's the crux of what was pointed out above.

        Outside of equilibrium things get more complicated. In principle, you can still do a first order expansion to understand the dynamics in a vicinity of that regime, the problem is that outside equilibrium you are not going to stay near the point of the solution space you started at. You will keep drifting, at least until you reach another stable equilibrium if there is one.

        Systems outside equilibrium are much harder to study because we cannot linearize. Basically.

    • dustingetz 2 years ago

      seems circular - if you limit the world to small perturbations over a stable equilibrium, of course you get simple harmonic motion (push a spring down a bit, it pushes back and starts bouncing). the interesting insight is why you can limit the world to that.

      i suppose it’s because most interesting forces are conservative, i.e. motion doesn’t dissipate energy (unlike friction; electromagnetism, say, is conservative), so energy is conserved and perturbations must bounce. OP contains the key insight: dissipative systems dissipate in proportion to total energy, so low energies are approximately stable, IIUC.

    • godelski 2 years ago

      I don't think fewer words means less complicated. I know we're on HN but I don't think most of the crowd here is going to be familiar with dynamic systems to parse your words and certainly not DEs or PDEs. I mean you won't get to this till what, Classical Mech in 3rd year of a physics undergrad? Maybe I'm too pessimistic but I feel like having sufficient background to parse your statement implies sufficient background to already have this knowledge. I definitely think OP's version is far clearer to the layman.

  • stared 2 years ago

    There is a saying that "physics is a subset of reality that can be approximated by coupled harmonic oscillators".

    If they are purely linear, it is simple: just write a matrix of their spring coefficients and diagonalize it. If there are non-linear terms, well, then it isn't. This is the case in Quantum Field Theory (and its instance, the Standard Model), the most fundamental theory describing particles. One take on Feynman diagrams is that they are a series approximation of the non-linear interactions in the QFT.
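The purely linear case mentioned above fits in a few lines of numpy. A toy sketch (my own example, not from the post): three unit masses between fixed walls, joined by unit springs, so the equations of motion are x'' = -K x for a stiffness matrix K.

```python
import numpy as np

# Stiffness matrix for three unit masses between fixed walls,
# coupled by unit springs (a made-up toy system for illustration).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# Diagonalizing the spring matrix decouples the system into normal modes:
# eigenvalues are the squared mode frequencies, eigenvectors the mode shapes.
omega_sq, mode_shapes = np.linalg.eigh(K)
omega = np.sqrt(omega_sq)

# Each decoupled mode evolves as an independent harmonic oscillator.
print(omega)  # three normal-mode frequencies, in ascending order
```

For this particular K the eigenvalues are 2 - sqrt(2), 2, and 2 + sqrt(2), so the three mode frequencies are their square roots; with non-linear terms added, no such clean decoupling exists.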

  • mjburgess 2 years ago

    Well that's an illusion of physics education -- you can model almost nothing in the world this way. Rather we engineer circumstances (devices, experiments, etc.) where nature is forced to imitate this otherwise useless approximation.

    And mostly this fails. You can't really analyse the world using a simple closed-form formula -- all the stuff that this worked for has been studied and reported in books.

    • ndriscoll 2 years ago

      You can model any conservative system with stable equilibria this way (i.e. anything with local minima in the potential). As another poster pointed out, just do a Taylor expansion around the equilibrium state, and you have a locally quadratic potential, i.e. a harmonic oscillator.

      Explicitly, WLOG the potential is `V(x) = V_0 + V_1 (x-x_0) + V_2 (x-x_0)^2 + ...`. But the first derivative is 0 near a minimum, so V_1 = 0. The constant V_0 term doesn't affect dynamics, so choose it to be 0. Then the higher order terms go to 0 as x->x_0, so you have `V(x) = V_2 x^2` near any stable point.

      And it's pretty obvious that lots of real-world phenomena 1. do have stable states and 2. do oscillate around them before settling, so the model is pretty good for lots of real-world systems (with the error being that the system is generally not conservative, so eventually it settles into the equilibrium because friction is stealing the energy).
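The expansion written out above can be checked numerically. A minimal sketch (again using the pendulum potential V(x) = 1 - cos(x) as a hypothetical example, integrated with semi-implicit Euler): for a small displacement, the exact dynamics track the harmonic prediction x(t) = x0 cos(t) very closely.

```python
import numpy as np

# Integrate x'' = -V'(x) = -sin(x), starting from a small displacement at rest.
x0, dt, steps = 0.01, 1e-3, 20000
x, v = x0, 0.0
xs = []
for _ in range(steps):
    v -= np.sin(x) * dt  # semi-implicit Euler: update velocity first
    x += v * dt
    xs.append(x)
xs = np.array(xs)

# Harmonic prediction: omega^2 = V''(0) = 1, so x(t) = x0 * cos(t).
t = dt * np.arange(1, steps + 1)
max_err = np.max(np.abs(xs - x0 * np.cos(t)))
assert max_err < 1e-4  # the quadratic approximation is excellent for small x0
```

Pushing x0 up toward pi noticeably slows the oscillation relative to the harmonic prediction, which is exactly the higher-order terms re-entering.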

      • mjburgess 2 years ago

        How much of reality is a stable system under equilibrium?

        • ndriscoll 2 years ago

          Look around you. Is the building you're in falling down? Or is it standing still? Are the things on your desk bouncing about, or are they just sitting there? Are the atoms in your body fissioning, or are you still here? Stable systems are extremely common. The world would be a much more violent place to live in if they weren't.

          And it makes sense that stable points exist. That means there's a well somewhere in the potential (and the stable point is at the bottom of the well). It seems reasonable that if you didn't know anything about the potential of a system, you might suppose it has at least one squiggle somewhere, and in that squiggle you'll find your equilibrium.

          Edit: I guess actually it's not so obvious in the higher dimensional case since you can have saddles. I'd have to go back and look at my nonlinear diffeq book to see, but there might be some topological result (similar to the hairy ball theorem) that explains it, at least for compact configuration spaces.

          • smallnamespace 2 years ago

            I think your explanation only seems clear because you're eliding some quite relevant detail.

            > Look around you. Is the building you're in falling down?

            You can say it’s stable, or that the building is in the act of slowly falling down without human input (maintenance). It’s a bit misleading to sneak in time scale.

            > Are the things on your desk bouncing about, or are they just sitting there?

            At a small enough scale bits of the desk are bouncing around due to thermal energy. Also same as the house, your desk will eventually decay into dust due to that bouncing.

            > Are the atoms in your body fissioning, or are you still there?

            This one is actually stable.

            The other two systems are better characterized as metastable - they appear stable at certain time and length scales. In fact metastability can be pretty tricky to explain, and sometimes requires a detour into thermodynamics or other fields (e.g. biology if you’re asking why a person’s body seems rather stable).

            • ndriscoll 2 years ago

              Sure but the point is in a closed system, thermodynamic fluctuations alone are going to take eons to make your house fall down. What will actually get it is that the system is not fully closed, and eventually something from outside the system will give it the activation energy needed to escape the well it's in (e.g. a windstorm causing a tree to fall on it). But it's useful and practical to think of that larger system as decomposing into a product of pretty much independent systems (until it isn't).

              The point is wells exist and are common in day-to-day life, including ones that are deep enough that things don't randomly thermodynamically pop out of them. If something is sitting on your desk when you go to bed, you can be pretty certain it'll be there in the morning too. The scale of the well is orders of magnitude larger than the thermal fluctuations, even if eventually a fluctuation could excite your computer monitor to suddenly vaporize. The atoms in your body and all matter will eventually decay to nothing too, but it's not useful to think about that.

              And even if it's metastable, as long as the system stays inside the well, the potential is locally quadratic. So you see oscillations around the equilibrium in the meantime.

              • mjburgess 2 years ago

                Almost all of reality isn't day-to-day life --- indeed, what that describes is, in many ways, exactly the sort of illusion of stability that admits cute mathematical analysis.

                You're weighting parts of reality by their relevance to us at a particular place and time -- without such prejudice you find that very little admits of this sort of cute mathematical description.

                And that which does is now pretty exhausted as far as research goes. Describing beds is not a pressing theoretical question.

                Describing organic systems, say, is: chaotic organic development across trillions of cells. There are no SHOs there.

                • hughesjj 2 years ago

                  Really? What about the Krebs cycle?

                  • mjburgess 2 years ago

                    There is no first-order Taylor series expansion of the Krebs cycle -- indeed not.

                    As soon as you have three of anything, physics, as applied math, breaks down.

            • zopa 2 years ago

              This is where we need to decide if we're doing physics or philosophy. Physics is about answering specific, fairly practical questions, and the questions tell you what parts of reality you can afford to ignore. If you want to get at the true nature of things in a metaphysically satisfying sense, physics will always disappoint you.

              Is it part of the essential nature of a building to collapse? Sure, I suppose, as with all things. But I'm only going to be in this coffee shop for another hour: to me, for my purposes, it's stable. If instead I were buying the building I'd want a much more detailed model that considered termites and the risk of earthquakes and all sorts of things, but I still wouldn't care about the date of the next ice age that will scour the landscape clean.

              One thing to keep in mind is that our mathematical methods might as well be approximate, because our measurements always will be. Precise calculations on fuzzy data are a waste of time. Not that there's anything wrong with philosophy! But it's not super relevant to understanding where harmonic oscillators are a useful model.

        • godelski 2 years ago

          The vast majority of it. By definition. Unstable equilibria are unstable to perturbations. By definition.

  • noduerme 2 years ago

    Would a 3-or-more body gravitational problem be one of these that could use a speed up?

    • AnimalMuppet 2 years ago

      Probably not (at least by this breakthrough). You can't model a 3-or-more body gravitational problem as coupled harmonic oscillators.

    • dchftcs 2 years ago

      For that, accuracy seems to be a stronger limiting factor than speed.

      • noduerme 2 years ago

        My lay understanding of the problem with classical algorithms is basically that a lack of resolution means you need to monte carlo the thing millions of times... which is why it's slow. If you could model it as a set of quantum states of similar inaccuracy, wouldn't that by definition be just as (in)accurate but faster?

        [edit] this reminds me of something I read about how NASA doesn't predict solar eclipses by trying to keep an exact model of the solar system, but rather uses pattern matching algorithms.

        • valine 2 years ago

          I don’t think there are many three-body problems in the real world where compute is the limiting factor. If you’re projecting so far out into the future that simulation speed is a problem, your initial measurement error will have compounded to the point that your solution is meaningless.

          We struggle to predict the exact path of asteroids because of measurement errors, not because computing is slow. Minuscule changes to the initial condition manifest as massive differences in the outcome.
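That sensitivity is easy to demonstrate even in a toy system. Here the chaotic logistic map stands in for the orbital dynamics (an illustration of the principle only, not an asteroid model):

```python
# Iterate the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.400000)
b = trajectory(0.400001)  # a "measurement error" of one part in a million

# The gap starts tiny but roughly doubles every step, so within a few
# dozen steps the two predicted futures are completely different.
early_gap = abs(a[1] - b[1])
late_gap = max(abs(u - v) for u, v in zip(a[30:], b[30:]))
assert early_gap < 1e-5
assert late_gap > 0.1  # trajectories have decorrelated
```

Faster computation doesn't help here: shrinking the initial error, not grinding through more steps, is what extends the prediction horizon.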

        • fgoesbrrr 2 years ago

          People were predicting solar eclipses thousands of years ago with no calculators or even a modern understanding of math. Can't be that hard.

          • lippel82 2 years ago

            The reason this worked is that the back-action of the moon on the sun is very insignificant. For a "real" chaotic three-body system, you need three bodies that interact with each other on comparable scales.

          • noduerme 2 years ago

            I found the article I was referencing[0]

            >> In summary, it is clear ancient people could predict timings for lunar eclipses and partial solar eclipses, but there is no convincing evidence of people predicting the times and locations of total solar eclipses.

            >> Today, we don’t rely on calculating the orbits of the whole Solar System to predict eclipses. For example, NASA uses a highly advanced form of an ancient technique – pattern recognition. Using some 38,000 repeating mathematical terms, NASA can predict both solar and lunar eclipses for 1,000 years into the future. Beyond that, the Moon’s wobble and Earth’s changing rotation make eclipse prediction less accurate.

            [0] https://www.astronomy.com/observing/humans-have-been-predict...

marktangotango 2 years ago

Very interesting indeed:

> discovery of a new quantum algorithm that offers an exponential advantage for simulating coupled classical harmonic oscillators.

> To enable the simulation of a large number of coupled harmonic oscillators, we came up with a mapping that encodes the positions and velocities of all masses and springs into the quantum wavefunction of a system of qubits. Since the number of parameters describing the wavefunction of a system of qubits grows exponentially with the number of qubits, we can encode the information of N balls into a quantum mechanical system of only about log(N) qubits.
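The quoted encoding idea can be sketched in a few lines. This is my own simplified illustration, not the paper's exact mapping (which weights the coordinates by masses and spring constants so the state's norm tracks the system's energy): pack all 2N phase-space coordinates into the amplitudes of one normalized vector, which fits on log2(2N) qubits.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8  # number of oscillators (toy size)

positions = rng.standard_normal(N)
velocities = rng.standard_normal(N)

# Pack all 2N phase-space coordinates into one normalized state vector:
# a vector of length 2N lives on log2(2N) qubits.
state = np.concatenate([positions, velocities]).astype(complex)
state /= np.linalg.norm(state)  # quantum states must be unit vectors

n_qubits = int(np.log2(state.size))
print(n_qubits)  # 4 qubits carry the full phase-space data of 8 oscillators
```

Doubling N adds only one qubit, which is where the exponential compression (and, with an efficient way to evolve the state, the speedup) comes from.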

cvoss 2 years ago

The approach maps the classical mechanics of coupled harmonic oscillators to a particular quantum system. I'm curious if there is some (unrelated) classical system whose quantization is that same quantum system. In other words, does this quantum system have a natural interpretation as the quantum mechanical version of some classical system? If so, it's presumably very different from (e.g., a lot smaller than?) the oscillator system which motivated the study of the quantum system.

vimax 2 years ago

The thing that stands out to me is the described proof of BQP-completeness, where they say they prove any quantum system can be simulated as balls and springs, but later they say you may need an exponential number of springs for a classical simulation. That sounds like the BQP reduction would be exponential; guess I'll have to read the paper to see what I'm missing.

  • eigenket 2 years ago

    They have exponentially many classical oscillators, but then they simulate them with their quantum algorithm with exponential speed up. The two exponential factors cancel and you end up with a quantum algorithm for a BQP complete problem which runs in polynomial time.

  • chermi 2 years ago

    Just a gut feeling but it could be related to the ability to map so much stuff to path integrals with harmonic fields. I too need to read it better.

IIAOPSW 2 years ago

I'm skeptical. If you have an exponential speedup for simulations of coupled oscillators, you can rig a system of coupled oscillators into a general purpose computer [0] and therefore have an exponential speedup for any computation. That seems too good to be true.

[0] https://www.zyvex.com/nanotech/mechano.html

  • eigenket 2 years ago

    The logic gates described in that link are more complicated than just coupled harmonic oscillators. They have things like ratchets, or they block each other, or there is buckling.

    The quantum algorithm couldn't simulate these things.

  • jksk61 2 years ago

    If you need an exponential number of coupled oscillators to construct a general purpose computer, then you don't have the exponential speedup.

  • naasking 2 years ago

    You would need an exponential number of coupled oscillators to achieve that exponential speedup. Doesn't seem too good to be true to me.

  • somat 2 years ago

    I am probably misunderstanding something fundamental, but isn't that exactly what a qubit is: a coupled harmonic oscillator?

inasio 2 years ago

This could be pretty exciting. For all the talk about quantum advantage, the number of quantum algorithms that have exponential speedup is super small; as the blog mentions it's essentially Shor's prime factoring and quantum simulations.

fgoesbrrr 2 years ago

> Further, we use this mapping to prove that any problem efficiently solvable by a quantum algorithm can be recast as a problem involving a network of coupled oscillators, albeit exponentially many of them.

Is this a new result, given that quantum field theory is described in terms of quantum harmonic oscillators?

  • eigenket 2 years ago

    It's a network of exponentially many coupled classical oscillators, not quantum harmonic oscillators. This is a new result.

11101010001100 2 years ago

Need to take a closer look at the paper, but a classical approach to this problem is to map it to an eigenvalue problem, so could we say that they have found a quantum speedup for solving eigenvalue problems?

shollos 2 years ago

I suspect an analog computer would work just as well for modeling coupled harmonic oscillators.

  • eigenket 2 years ago

    That would be very surprising. They show in this work that their problem is complete for the complexity class BQP. That means that if you can solve it on a classical computer (analog or not) in polynomial time you get (for free) classical polynomial-time algorithms for solving a bunch of problems we don't currently have classical poly-time algorithms for.

    Most surprisingly, this would include the hidden subgroup problem and hence give you a classical poly-time algorithm for integer factorization.

xinayder 2 years ago

Is this similar to Ed Gerck's "finding" of a QC algorithm that can break RSA-2048?

swiftlyTyped 2 years ago

Usually these types of articles are total nonsense, but this is legit.

A very cool result!

It'd be interesting to see how many other systems can be approximated by the system they've solved for (without incurring an exponential penalty in the translation).

latenightcoding 2 years ago

I'll read it when I'm home. But I want to say that the fact that this is from Google "Quantum AI" makes me doubt the legitimacy. They are really ruining their reputation with all the absurd quantum stuff they have been publishing, e.g. their wormhole stuff and a lot of quantum neural network BS.

  • moab 2 years ago

    It would be a good idea to read the blog post before making comments.

    The result looks very interesting, and the blog post is well written (e.g., I did not know about the prior work re. Grover's algorithm and pendulum systems).

    The blog post is also based on a recent FOCS paper, and the authors are reputable people in CS theory, if that convinces anyone to take a closer look.

  • ko27 2 years ago

    Stop basing your opinions on titles alone. Most of their blog posts on AI and quantum are exceptionally well written and researched, even the one you referenced: "Making a Dual of a Traversable Wormhole with a Quantum Computer" [1]

    It's like saying quantum teleportation [2] is BS, just because you don't like the SF sounding word "teleportation".

    https://blog.research.google/2022/11/making-traversable-worm...

    https://en.wikipedia.org/wiki/Quantum_teleportation

  • eigenket 2 years ago

    I sort of agree. Quantum computing is interesting, AI is interesting, "quantum AI" is a meme and almost certainly not interesting. On the other hand the Google team do do a lot of cool research, and I assume the scientists weren't consulted by the MBA marketing idiots who came up with the name.

    The "wormhole" thing was scientifically interesting as a quantum simulation of a non-trivial gravitational thing. A lot of the media stuff was garbage, but the actual science they did was quite cool.

  • affgrff2 2 years ago

    Anyone have an interesting link to some quantum neural network bs? Sounds interesting...
