Irreversibility and Heat Generation in the Computing Process (1961) [pdf] (pitt.edu)
There's a plethora of air-gap malware studies from Ben-Gurion University [1]; the abused channels range from thermal, acoustic, and optical to classic TEMPEST.
The interesting thing is that there's no escape from such covert attacks, since machines are bound by the laws of physics that demand energy dissipation and generation of noise, heat, etc. If it leaks, then it computes.
Shouldn't that be the other way around? Plenty of things leak heat (like cooking dinner) that don't do any computation?
The quantum of action, which a layperson can think of as the smallest possible state change, is Planck's constant.
https://en.wikipedia.org/wiki/Planck_constant
All energy refers to is an amount of state change that occurs every second that some quantity of energy exists. That's why action has units of Energy x Time: divide by time and you get Energy = Actions per Second, kinda like APM in StarCraft.
https://en.wikipedia.org/wiki/Action_(physics)
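To put a toy number on the "actions per second" framing (a back-of-envelope sketch in Python; the framing is just the analogy above, not standard physics terminology):

    h = 6.62607015e-34  # Planck's constant, J*s (exact in SI since 2019)

    # Action has units of energy x time, so dividing an energy by h gives
    # a rate: how many quanta of action "tick by" each second.
    def actions_per_second(energy_joules):
        return energy_joules / h

    # One joule works out to roughly 1.5e33 quanta of action per second.
    print(f"{actions_per_second(1.0):.3e}")  # ~1.509e+33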
Principle of least action is really the principle of "least change."
In summary, computation is happening, but most of the computation that energy does while held as mass/matter is cyclic processes (aging). Not anything interesting, at least to me or you. Unless you like https://en.wikipedia.org/wiki/Radiometric_dating
Maybe. Or maybe they do but we don't know it ;)
An interesting paper in somewhat the opposite direction, but still building on Landauer's work, is "Ultimate physical limits to computation":
https://arxiv.org/abs/quant-ph/9908043
The end of the abstract:
> [...] quantitative bounds are put to the computational power of an `ultimate laptop' with a mass of one kilogram confined to a volume of one liter.
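If I remember the paper right, the headline number follows from the Margolus-Levitin bound (at most 2E/(pi*hbar) operations per second for energy E) applied to the rest energy of one kilogram. A quick check in Python:

    import math

    hbar = 1.054571817e-34  # reduced Planck constant, J*s
    c = 2.99792458e8        # speed of light, m/s
    m = 1.0                 # the "ultimate laptop" mass, kg

    E = m * c**2            # rest energy of 1 kg, ~9e16 J
    ops_per_sec = 2 * E / (math.pi * hbar)
    print(f"{ops_per_sec:.2e}")  # ~5.43e+50 operations per second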
That, along with "Computational capacity of the universe", was the paper that stopped me from ever taking Seth Lloyd seriously again. Later finding out he was a creepy Epstein crony... well, sometimes the universe makes a lot of sense.
So, here's a question:
If heat dissipation is so fundamental to computation, why aren't we using the amount of required heat dissipation as the fundamental measure of complexity in quantum computing?
In particular, say I construct a box with a uniform mixture of all quantum states on n qubits, and all the unitaries that can operate on those n qubits. How much heat do I have to dissipate in order to refine the state in my box to a "particular" quantum state and a "particular" unitary? There are some interesting, and recent, results about how quickly I can dissipate heat. In particular, the temperature T(t) is bounded from below by k*T(0)/t^7, where k is some constant.
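To illustrate what that bound implies (a quick sketch in Python; k = 1 and T(0) = 1 K are arbitrary values picked for illustration, since the actual constant depends on the system):

    k, T0 = 1.0, 1.0  # illustrative constant and starting temperature (K)

    # The quoted result: after cooling for time t, the temperature is
    # still bounded from below by k * T(0) / t^7.
    def temperature_lower_bound(t):
        return k * T0 / t**7

    for t in (1, 2, 5, 10):
        print(f"t = {t:>2}: T >= {temperature_lower_bound(t):.3e} K")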
Since the number of quantum computers you could want to build in that box grows very quickly with the size of your input, I suspect that your inability to throw away states in the process of constructing your computer very rapidly becomes the dominant effect in how long it takes to go from "which number do I want to factor" to actually getting the factors of that number.
One of the things they don't tell you about quantum computing is it's supposed to be reversible. Which, ceteris paribus, probably means they're not physical. Nobody likes to talk about that as it's terrible for funding.
You can build totally reversible computers using ordinary classical physics which, in principle, can be arranged to dissipate no heat (in practice they'll always dissipate some). The problem is you're effectively dissipating the "heat" into a memory system which rapidly becomes practically infinite [1]. Imagine keeping around all the bits that got AND-gated away from... I dunno, fitting GPT-3. Or even just inverting some big matrix. That's what you've got to do for reversible computing, at the individual bit level mind you: many of the fundamental floating-point operations are not themselves reversible, so they throw off more "heat", aka fill up memory cells with the bits that would allow you to reverse them.
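To make the "keep the garbage around" point concrete, here's a toy sketch in Python of the Toffoli (CCNOT) gate used as a reversible AND; the price of reversibility is carrying the inputs along as extra output bits:

    # Toffoli (CCNOT): flips the target bit iff both control bits are 1.
    # With the target initialized to 0, the target output is a AND b, but
    # the inputs a and b must be kept as "garbage" so the step can be undone.
    def toffoli(a, b, target):
        return a, b, target ^ (a & b)

    a, b, out = toffoli(1, 1, 0)  # out == 1 == (1 AND 1)

    # The gate is its own inverse: applying it again restores the inputs.
    # An ordinary AND gate destroys this information, which is exactly
    # what Landauer's principle charges you heat for.
    assert toffoli(a, b, out) == (1, 1, 0)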
Landauer, who is an underappreciated genius, either wasn't aware of the reversible computing idea or had too much sense to bother with it. There are others who attempt to defeat his very common-sense idea with hand-wavy adiabatic relaxation schemes, but I think they're all baloney. All of this is a barrel of monkeys to think about (not QC, which is dumb; the general reversible computing stuff); I recommend the seminal papers listed in the wiki article below if you have an afternoon to burn. Bennett, Toffoli and Vitanyi in particular are real fun to read.
These are interesting reads, and it seems that reversible computers suffer the same problem, physically. If I want to actually make a reversible computer, I still have to make the blank tape that it uses.
Precisely: and that issue (among many others) is the one the ding dongs in "quantum information" sweep under the carpet and hope you don't notice. It's HUGELY OBVIOUS if you try to do it for an HP42 calculator tier computer. Somehow the mystification of adding entanglement to the mess pushes it off into "Hilbert space" and nobody notices.
As a practicing physicist in the field, I think I'm qualified to reply. The question is a valid one, and I've heard it a few times before from other researchers.
In brief, the irreversible heat dissipation, calculated from the change in the entropy (S) of some sub-system of the quantum computer, does not yield a quantity that has much practical relevance. But it does lead to interesting considerations, some of which I've written out below.
Following the basic thinking behind Landauer's principle, if one considers the state of the qubits themselves, the dissipation is always zero or even negative. In the usual universal gate-based model, any computation starts with a known pure quantum state, for which S = 0. The gates are unitary transformations and do not change the entropy. Non-coherent interactions generally lead to a state with nonzero positive entropy. Amusingly, this could be interpreted as using the quantum register to cool its surroundings by a tiny amount, at the cost of randomizing the output of the "computation".
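A toy numerical version of that argument (a sketch with numpy; the full-dephasing step at the end stands in for a generic non-coherent interaction):

    import numpy as np

    def von_neumann_entropy(rho):
        # S = -Tr(rho log2 rho), in bits, from the eigenvalues of rho.
        eigs = np.linalg.eigvalsh(rho)
        eigs = eigs[eigs > 1e-12]  # drop numerical zeros
        return float(-np.sum(eigs * np.log2(eigs)))

    rho = np.array([[1, 0], [0, 0]], dtype=complex)  # pure state |0><0|
    print(von_neumann_entropy(rho))                  # ~0 bits

    # A unitary gate (here a Hadamard) leaves the entropy unchanged.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    rho_u = H @ rho @ H.conj().T
    print(von_neumann_entropy(rho_u))                # still ~0 bits

    # Full dephasing kills the off-diagonal terms, leaving a maximally
    # mixed state: the entropy jumps to 1 bit.
    print(von_neumann_entropy(np.diag(np.diag(rho_u))))  # ~1 bit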
A more fruitful approach (which you also alluded to) relates to the "cost" of the unitary transformations themselves. Implementing the high-fidelity gates necessary for useful quantum computing requires very precise time-varying external control fields. It is correct to think that there is a thermodynamic cost to ensuring that the noise in the fields experienced by the qubits is small.
For a concrete example, consider the fact that superconducting qubits are controlled with microwave signals. The output of a room-temperature microwave source has thermal noise superimposed on the desired, synthesized signal. For simplicity, we can take the noise temperature to be 300 K. (It is much higher in practice.) To use this output to drive a superconducting qubit at T = 30 mK, the power needs to be attenuated by at least a factor of 10^4. (Again, the real factor is much higher.) Hence, a lot of power is seemingly wasted in the synthesis of the control signals. I think a similar argument can be made about the lasers used to control some other kinds of qubits.
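Putting rough numbers on that (a sketch in Python using the simplified figures above; real systems use considerably more attenuation, spread across several temperature stages):

    import math

    T_source = 300.0   # noise temperature of room-temperature source, K
    T_qubit = 0.030    # temperature of the qubit stage, K

    # To push the source's thermal noise down to the qubit's temperature
    # scale, the signal must be attenuated by at least T_source / T_qubit.
    attenuation = T_source / T_qubit
    print(f"factor: {attenuation:.0e}")                      # 1e+04
    print(f"in dB:  {10 * math.log10(attenuation):.0f} dB")  # 40 dB

    # Nearly all of the synthesized signal power is dumped as heat in the
    # attenuators, which the dilution refrigerator then has to remove.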
This type of dissipation is relevant for hypothesized large-scale quantum computers. It doesn't lead to new deep "quantum" insights, however. First, the dissipation is independent of the state of the qubits, and simply scales linearly with the number of operations. Second, the dissipation takes place outside of the delicate qubit system. Removing any amount of entropy from (i.e., cooling) the environment surrounding the qubits is merely a difficult classical engineering problem.
A lighter, non-quantitative starting point: https://en.wikipedia.org/wiki/Reversible_computing
I thought of this when I learned that Rust 'moving' values could, on reversible computing hardware, have a zero thermodynamic lower bound.