IBM Scientists Measure the Heat Emitted From Erasing a Single Bit 111
ananyo writes "In 1961, IBM physicist Rolf Landauer argued that resetting one bit of information — say, setting a binary digit to zero in a computer memory regardless of whether it is initially 1 or 0 — must release a certain minimum amount of heat, proportional to the ambient temperature. New work has now finally confirmed that Landauer was right. To test the principle, the researchers created a simple two-state bit: a single microscopic silica bead held in a 'light trap' by a laser beam. (Abstract) The trap contains two 'valleys' where the particle can rest, one representing a 1 and the other a 0. It could jump between the two if the energy 'hill' separating them is not too high. The researchers could control this height by changing the power of the laser, and could 'tilt' the two valleys to tip the bead into one of them by moving the physical cell containing the bead slightly out of the laser's focus. By monitoring the position and speed of the particle during a cycle of switching and resetting the bit, they could calculate how much energy was dissipated."
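For readers who want to play with the idea, here is a toy stochastic simulation of an erasure cycle like the one described above: an overdamped bead in a double-well potential whose barrier is lowered, tilted toward the '0' well, and then restored. The ramp shapes, time scales, and units below are invented for illustration (the paper's actual protocol and analysis differ); the work is tallied with the standard stochastic-thermodynamics bookkeeping and should land above the Landauer bound of kT ln 2.

```python
import numpy as np

# Reduced units: energies in kT, friction coefficient = 1.
kT = 1.0
gamma = 1.0
dt = 1e-4
steps = 20_000          # steps per erasure cycle
cycles = 100            # independent erasure cycles to average over

def U(x, barrier, tilt):
    """Double-well potential with minima near x = -1 and x = +1."""
    return barrier * (x**2 - 1.0)**2 + tilt * x

def dUdx(x, barrier, tilt):
    return 4.0 * barrier * x * (x**2 - 1.0) + tilt

def protocol(s):
    """Erasure protocol vs. progress s in [0, 1]: lower the barrier,
    tilt toward x = -1 (the '0' well), then restore. The ramp shapes
    are guesses for illustration, not the paper's."""
    if s < 0.25:
        return 8.0 * (1.0 - 3.2 * s), 0.0          # barrier 8 -> 1.6
    elif s < 0.75:
        return 1.6, 8.0 * (s - 0.25)               # tilt 0 -> 4 (favors x = -1)
    else:
        return 1.6 + 25.6 * (s - 0.75), 4.0 * (1.0 - (s - 0.75) / 0.25)

rng = np.random.default_rng(0)
works = []
for _ in range(cycles):
    x = rng.choice([-1.0, 1.0])    # the bit starts as a random 0 or 1
    W = 0.0
    for i in range(steps):
        b0, f0 = protocol(i / steps)
        b1, f1 = protocol((i + 1) / steps)
        # Work done ON the bead = potential change while the control moves, x held fixed
        W += U(x, b1, f1) - U(x, b0, f0)
        # Overdamped Langevin step in the current potential
        x += -dUdx(x, b1, f1) / gamma * dt + np.sqrt(2.0 * kT * dt / gamma) * rng.normal()
    works.append(W)

print(f"mean work per erasure: {np.mean(works):.2f} kT "
      f"(Landauer bound: ln 2 = {np.log(2):.2f} kT)")
```

Because the protocol runs at finite speed, the mean work comes out noticeably above ln 2; only an infinitely slow protocol would approach the bound.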
Spirit (Score:1)
Was that really the spirit of what Landauer was considering?
Why not measure the computer memory such as he envisioned?
-AI
Re:Spirit (Score:5, Interesting)
Yeah, it's kind of like a piece of armor being considered arrow-proof, and then you fire an arrow out of a railgun.
I wonder how you would even measure it, though, and distinguish the heat from a bit changing from the ambient heat of normal drive operation.
Re:Spirit (Score:5, Insightful)
It probably reflects the spirit of Landauer's claims. Claims such as this rest on an understanding of physics, which was much more common in computing back when developing new hardware demanded it. You also have to consider that a variety of different techniques were used to make computer memories back then, so his claims had to be based on the underlying physics rather than on any particular memory technology. So it is fair game to apply different physical models to test his claims.
Re: (Score:2)
I can appreciate that. But I question the actual relevance of the results, given that the "memory technology" used doesn't resemble anything I've ever heard of being used in a production computer in 30+ years.
The fact that energy would be needed to force a state change should have been intuitively obvious to anyone with even a Grade 12 physics education.
Re: (Score:3)
I've seen serious claims that "reversible computation" can be done with no energy input at all. What this doesn't cover, of course, is setting up the initial conditions, or extracting the results of the computation. One requirement is that at the end of the computation, the state of the system should be identical to the initial state.
I must admit that I don't understand either the utility, or the feasibility, of such a system. But there have been serious claims that computation does not, itself, require energy.
Re: (Score:2)
I must admit that I don't understand either the utility, or the feasibility, of such a system.
Wikipedia gives an answer [wikipedia.org]:
"Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.
Probably the largest motivation for the study of technologies aimed at actually implementing reversible computing ...
Re: (Score:2)
The thing is, reversible computation doesn't appear to allow the answer to be extracted. So it's not clear what use it is. And it hasn't actually been built, so I'm not convinced that it's feasible.
Re: (Score:2)
No kidding? You DO WORK and ENERGY IS RELEASED? Is anybody surprised to see that Landauer was right? Nobody?
What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.
Science publishing is not what it used to be.
Re: (Score:2)
You are absolutely right. And that's why we have modern technology and, in fact, physics itself: because people began verifying obvious "facts".
Re:Spirit (Score:5, Insightful)
Without precisely controlling the change in information and precisely measuring the change in heat, the result is much less clear. That's why they used this methodology and equipment. Moreover, as this is empirical evidence for a very general identity between heat and information, the result will hold for computer memory as well.
Re: (Score:2)
In that case, I could have used a mechanical switch to represent 0 and 1 and told you that heat was dissipated. There needs to be a little more to draw a parallel between a random experiment and computer memory.
Infophysics will be the new physics (Score:2)
I'm going out on a limb here, not having had the time to study this stuff enough,
but my intuition says that the unification of information theory and physics will yield a great breakthrough in physics.
I take the view that thermodynamics and Shannon information theory are literally about exactly the same thing, not just related by a weak analogy.
Related factoids:
1. All information is embodied mutual information.
a. It must be embodied in some local configuration of matter/energy.
b. It must be mutual in that the informa
Re: (Score:2)
>>entropy in information theory is identical to the entropy in thermodynamics
Is there a name for this law?
Also, what does this say about the reality of information itself?
Re:Spirit (Score:5, Insightful)
It says that information is disorder. And thermodynamic entropy is (for some definitions of order) a measure of order as well. If you have all of the air molecules in a room compressed into the corner, maybe that's ordered? But that's one small lump of air, and a whole lot of vacuum. Evenly distributed air is more ordered because it is uniform.

If you let a system starting in any arbitrary corner-gas configuration (and there are a lot, since each molecule can have any number of different values describing it) progress for X amount of time, you find that almost certainly you have ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and progress for X amount of time, you will almost certainly still be in an even-gas configuration. This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are like frictionless billiard balls, as physicists are wont to do). But it's not. If you take some specific corner-gas start A, and run it for X time, you will (probably) have an even-gas configuration B. If you take B, reverse the velocity of all molecules, and run it for X time again, you will be back at A (again, assuming molecules are frictionless billiard balls).

But, with discrete space and velocity, you can count the possible velocity and position vectors. There are a LOT more even-gas configurations than there are corner-gas configurations. So, with a tiny room and only a few molecules, you can establish the chance that after X time starting at even-gas, you end up at corner-gas. And even for very small systems it is basically 0. Entropy is the concept of changes to a system that are not reversible, not because of laws of PHYSICS but because of laws of STATISTICS. The second law is the observation that, by statistics, you will tend toward a uniform (ordered) system because there are a lot of ways to go in that direction, and very few ways to go the other way.
Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, I refer you to the fact that for our purposes, subatomic particles are frictionless billiard balls, so even things like the atom trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you consider how many possibilities there are for ordered sequences of X bit flips, it's 32^X. And if you start at 0, almost all of those flip sequences will take you to a pretty chaotic state. But if you start from a random state, almost none of those same flip sequences will get you to 0. So, treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes.

What Landauer did is establish a maximum circuit temperature T for your memory/CPU, and observe that you won't want Brownian motion breaking your system, so 0 and 1 need a minimum separation for the system to be useful at temperature T. This puts a lower bound on the state counts, and lets traditional thermodynamics establish a minimum energy dissipation to go from a high-entropy state to a low one (like a zeroed-out register). What information entropy does is take the same thing and say that the disordered information therefore has intrinsic entropy, since regardless of system design it requires a certain minimum entropy to store that information.

It's avoidable if your system is reversible, which is possible if you have more ways to represent a bit pattern the more ordered that bit pattern is — so fewer ways to store 10010101 than ways to store 00000000. It's also beatable if you find a way to store information non-physically. But good luck on that front.
Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org], which is somewhat related, and pretty cool.
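To make the counting argument above concrete, here is a minimal sketch. The discretization (8 sub-volumes, 100 molecules) is an arbitrary choice, picked only to show the scale of the asymmetry:

```python
# Toy version of the state-counting argument: N molecules, each independently
# in one of 8 equal sub-volumes of the room.
N = 100
total_states = 8 ** N          # equally likely microstates
corner_states = 8              # "corner-gas": everyone crammed into the same sub-volume

print(f"P(corner-gas) = {corner_states / total_states:.3e}")   # ~4e-90

# The same counting asymmetry applies to a register: of the 2**32 states
# of a 32-bit register, exactly one is the all-zeros "low entropy" state.
print(f"P(random 32-bit register is all zeros) = {1 / 2**32:.3e}")
```

Even for this tiny toy system, the chance of stumbling into the "ordered" configuration is effectively zero, which is the whole statistical point.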
Re: (Score:2)
Fantastic. Thanks for writing this up.
Re: (Score:2)
Excellent response, thanks.
Pretty far afield followup question: every time Work is performed, Entropy increases. Using the Landauer Principle, it seems like you could consider information processing to be a sort of Work being done, leading to a similar increase in entropy. If our conscious minds are a form of information processing engine, could consciousness be a byproduct of the Work being conducted by the information processing, which manifests itself simply as extra heat being radiated by the system?
Re: (Score:2)
It's also beatable if you find a way to store information non-physically.
I think this is what throws everyone when they think about the physics of knowledge. The vast majority of people don't realize that the physical embodiment of information must obey the laws of physics, and even many who do seem to believe knowledge ought to have some form of "soul" not shackled by physical constraints.
Re: (Score:2)
It is in the spirit of what Landauer was considering. The larger question is whether information entropy and thermodynamic entropy are related.
What am I missing? (Score:2)
To store information, you need the ability to set something into at least two possible states, one of which can be the intrinsic state. No matter what you use for storage, you'll always need energy to reach the non-intrinsic state(s), since the intrinsic state is, essentially by definition, the state achieved with no external energy applied.
If you must add energy to enter a non-intrinsic state, it makes perfect sense that the energy would need to be dissipated to return to the intrinsic state (which equates
Re: (Score:3, Informative)
Say you have two valleys named 0 and 1, and a mountain between them. Setting our bit by rolling a ball from 0 to 1 would require energy expenditure, but once the ball is in the valley it is stable and won't roll out again without further input. 0 and 1 may be at different heights relative to each other, but need not be. They might even be at the same altitude. But if 1 were higher than 0, then yes, you would be storing energy in some sort of potential-energy form, and may be able to recover that energy when the bit is flipped back.
Re:What am I missing? (Score:5, Informative)
It's theoretically possible to change the state of a bit without spending energy. Here's a dumb example: think of a closed system (so no energy is being gained or lost) consisting of a box filled with oxygen and only one molecule of water. Divide the box in two halves and say a bit is "0" if the molecule of water is in the left half and "1" if it's in the right half. If you wait a while, eventually the bit will flip with absolutely no change in energy. That's a dumb example, but it shows that there's nothing that requires an "intrinsic state" and energy loss when you move away from it, like you described.
The only time energy dissipation is unavoidable (in theory) is when you erase information. That's a strange concept because, usually, we don't think about "conservation of information" in the same sense of conservation of energy, but there's a relation [wikipedia.org]. A little more discussion with more relevance to computing can be found here: http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org].
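A quick worked version of that erasure cost, using only standard constants (the "bit holds 0 or 1 with equal probability" setup is the textbook assumption, not anything specific to this experiment):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # roughly room temperature, K

# Before a reset, the bit is 0 or 1 with equal probability: 1 bit of Shannon
# entropy. After reset-to-zero the state is certain: 0 bits. The memory's
# thermodynamic entropy therefore drops by k_B * ln 2, and the second law says
# the environment must pick up at least that much, as heat Q >= T * k_B * ln 2.
dS = k_B * math.log(2)
Q_min = T * dS
print(f"entropy lost by the memory: {dS:.3e} J/K")
print(f"minimum heat dumped per erased bit at {T:.0f} K: {Q_min:.3e} J")  # ~2.9e-21 J
```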
Re: (Score:2)
Yes, because of the zero point energy, since we're using a molecule. The bond has a minimum vibrational energy of 1/2 h*nu when the vibrational quantum number is 0 (ground state), so even when the temperature is 0 K, the bond still has energy and the molecule will still move around.
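For a rough sense of scale, here is that ½·h·ν worked out for the O-H stretch of a water molecule; the 3657 cm⁻¹ wavenumber is an approximate textbook value for the symmetric stretch:

```python
# Zero-point energy of one O-H stretch in a water molecule, E0 = (1/2) h * nu.
h = 6.62607015e-34       # Planck constant, J*s
c = 2.99792458e10        # speed of light, cm/s
k_B = 1.380649e-23       # Boltzmann constant, J/K

wavenumber = 3657.0      # cm^-1, approximate symmetric O-H stretch of water
nu = c * wavenumber      # frequency in Hz, ~1.1e14

E0 = 0.5 * h * nu
print(f"zero-point energy: {E0:.2e} J")          # ~3.6e-20 J
print(f"k_B * T at 300 K:  {k_B * 300:.2e} J")   # ~4.1e-21 J
# The zero-point energy is roughly 9x the room-temperature thermal energy
# scale: the bond keeps vibrating even at 0 K, exactly as the parent says.
```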
Re: (Score:2)
I was going to post something about reversible computing. I found it an interesting concept when I read that Richard Feynman did some work in computation and was a proponent of it. As far as I can tell, the idea was largely ignored.
I think reversible computing would not only be more energy efficient, but from what I understand might make for some interesting debugging, because I think you could run the program counter backward to an error.
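A tiny sketch of that backward-running idea, using the Toffoli gate (a reversible AND) rather than anything from Feynman's actual constructions:

```python
def cnot(a, b):
    """Controlled-NOT: flips b iff a is 1. Its own inverse."""
    return a, b ^ a

def toffoli(a, b, c):
    """Toffoli (CCNOT): flips c iff a and b are 1 - a reversible AND. Self-inverse."""
    return a, b, c ^ (a & b)

# Run a tiny "program" forward: compute AND(1, 1) onto the third wire...
state = (1, 1, 0)
state = toffoli(*state)
print("after forward run:", state)    # (1, 1, 1) - inputs preserved, nothing erased

# ...and "debug" by stepping the program counter backward: replay the same
# gates in reverse order (here each gate undoes itself).
state = toffoli(*state)
print("after reverse run:", state)    # (1, 1, 0) - back to the initial state
```

Because no input is overwritten, no information is erased along the way, which is exactly why such circuits can in principle dodge the Landauer cost.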
Re: (Score:2)
Changing the state of a bit is not necessarily the same as storing information. To be used for information storage, the system can only move between valid states through external stimuli. If it changes to a different state without external stimuli, then it either doesn't store information or the states are not defined correctly.
The whole point of storing something is to have it maintain its state. If an item is not maintaining a single state, then it's not storing information. And if the item is maintaining
Re: (Score:2)
You're thinking in terms of storing information the way a normal (irreversible) computer does. Not all computation must be done that way; I was describing a specific way that's not like that. Imagine that in the system of my (dumb, as I said) example the problem being calculated was, conveniently, the equivalent of "in which side of the box the water molecule will be after 3 days". In this case, I have to spend no energy at all to compute that, assuming the box is perfectly isolated from the environment.
Re: (Score:2)
Since we're discussing information storage rather than calculations (certainly the two are related but not the same), then per your example the information storage act would require energy to place the water molecule into the box in the first place. If you ignore that by assuming the molecule is already there, then you haven't stored anything and are simply in the intrinsic state of the box like I discussed originally. A computation with no controlled inputs yields no information, it's just nature running its course.
Re: (Score:2)
Perhaps you are thinking of this in a purely theoretical sense. In that case then yes, if you can harvest 100% of the energy stored when changing a value, then no additional energy is required.
That's the whole point of IBM's experiment and Landauer's principle: even in a purely theoretical sense, if you erase information when changing the state of a bit, you necessarily spend a minimum amount of energy. You can't, even theoretically, harvest 100% of the energy back. I was showing that there are other useful ways to change the state of a bit (e.g. in reversible computing) that do not incur this purely theoretical energy cost, where you could theoretically harvest 100% of the energy back.
Re: (Score:2)
You don't need to keep checking whether the bit has flipped. In fact, look at the most "quantum world" example possible: the usual way to define a quantum computer uses reversible computing (because quantum logic gates are reversible [wikipedia.org]).
Re: (Score:2)
Sure. But the whole point is that the *computation* itself doesn't need to spend energy[1]. The only time you have to spend energy, in principle, is when preparing the computation and when measuring -- the computation itself could run for years and no energy would be necessary (if the system is sufficiently isolated from the environment). In contrast, with "normal" irreversible computation, every time you irreversibly flip a bit (e.g., when you apply an AND gate) you must spend energy.
[1] in the "quantum wo
This is just entropy, right? (Score:5, Informative)
Re: (Score:1)
It is confirmation that scientists and anyone working to promote information are enemies of the universe. Processing information brings forward the end of the universe.
Re:This is just entropy, right? (Score:5, Funny)
Quick, you'd better stop thinking to slow the process down.
Re: (Score:2)
Way ahead of you. Stopped this some time ago. Also took out a couple of scientists - and burnt them, so that will help.
(note contents of message may contain unexamined irony)
Re: (Score:2)
too many people of this planet believe the sentences are true.
FTFY. I can't see how thinking has anything to do with it at all. Too many people simply refuse to do it.
Re: (Score:1)
too many people of this planet believe the sentences are true.
FTFY. I can't see how thinking has anything to do with it at all. Too many people simply refuse to do it.
And hence they are saving the universe ... Must try to remember this when next debating with the unreasoning - they are doing it for all of us...
Re:Yes, it's the entropy (Score:5, Interesting)
Specifically in the calculation of the Landauer limit, E = kT ln 2, the minimal energy needed to erase a single bit. The interesting thing is that at this limit, a watt pays for on the order of 10^20 bit operations per second. This means that the efficiency of today's computers is just 0.00001%. More details at http://tikalon.com/blog/blog.php?article=2011/Landauer.
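A quick numeric check of those figures (standard constants, room temperature assumed):

```python
import math

k_B = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                             # K (room temperature; the limit scales with T)
E_bit = k_B * T * math.log(2)         # Landauer limit per erased bit

print(f"Landauer limit at {T:.0f} K: {E_bit:.2e} J/bit")        # ~2.87e-21 J
print(f"erasures per second on one watt: {1.0 / E_bit:.1e}")    # ~3.5e20
print(f"power for 1e20 erasures/s: {1e20 * E_bit:.2f} W")       # ~0.29 W
```

So 10^20 erasures per second works out to roughly a third of a watt, consistent with the parent's order of magnitude.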
Re:Yes, it's the entropy (Score:4, Insightful)
Not really that surprising: a silicon atom is about 0.11 nm across and the lattice constant of a silicon crystal about 0.54 nm, which is still way smaller than the 32 nm processors he's talking about. I don't know how many electrons flow down each 32 nm path, but they're between 0.1 nm and 0.000006 nm in diameter depending on what model you use - quantum mechanics makes a mess of this anyway - so it's way more than one. If you want single-electron calculations you'll have single-electron signals: one quantum event and your signal is lost. So the limit is likely to remain a very theoretical limit.
The other thing is that this only includes the operation itself, no clock cycle, no instruction pointer, no caching, prefetching, branching, this is the ideal you could get out of a fixed-function ASIC that only does one thing, not even as programmable as a GPU shader. We already know that there's a significant gain to that, but even supercomputers aren't built that specifically to the task. Formulas must be tweaked, models adjusted, parts must be able to be used in many computers. We've already seen that a GPGPU can beat a CPU by far on some tasks, but even they aren't close to such an ideal.
If you think about this in encryption terms it's not that much... it says you can at most improve by 23-24 bits, and in encryption most have used the Landauer limit to "prove" there's not enough energy to break a 256-bit cipher by brute force. In some places I don't think it's that relevant either; in mobile, for example, I think the energy involved in bandwidth use will be more significant. Want to stream an HD movie? It's not the decoding that kills the battery, it's the 3G/4G data connection. Just like cameras get better, but good optics still aren't small, light or cheap.
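The brute-force arithmetic behind that, as a sketch (the one-bit-flip-per-key accounting is deliberately generous to the attacker, and the solar figures are round numbers):

```python
import math

k_B = 1.380649e-23
T = 300.0
E_bit = k_B * T * math.log(2)     # minimum energy per irreversible bit operation

# Energy to merely step through all 2**256 keys, charging one bit flip per key:
E_brute = 2.0**256 * E_bit
print(f"energy to enumerate 2^256 keys: {E_brute:.2e} J")       # ~3e56 J

# For scale: the Sun radiates ~3.8e26 W, i.e. ~1.2e34 J per year.
sun_year = 3.8e26 * 3.156e7
print(f"sun-years of total solar output: {E_brute / sun_year:.1e}")  # ~3e22

# The parent's "23-24 bits": closing a 1e7x efficiency gap to reach the
# Landauer limit only buys log2(1e7) extra brute-forceable key bits.
print(f"log2(1e7) = {math.log2(1e7):.1f} bits")
```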
It isn't working (Score:1)
I'm trying to post some whitespace to decrease the temperature in here, but the lameness filter keeps getting in the way!
Re: (Score:1)
Are you sure you weren't trying to post some lameness, but the whitespace filter kept getting in your way?
Re: (Score:2)
What? How will we decrease global temperatures by making prime-time television more steamy?
What if I store bits as heat? (Score:1, Interesting)
Let 0s be room temperature and let 1s be somewhat below room temperature. Then to erase the memory I expose it to the room. As it erases the memory will absorb some heat from the room instead of releasing heat.
Not really a practical form of computer memory, but seems sufficient to disprove Landauer.
Re: (Score:1)
Interesting.
That the mechanism you select absorbs heat does not mean that no heat was released. Possibly a small amount of heat was released by the state change whilst the mechanism also absorbed heat. It is not the net effect (overall heat absorbed) that is the critical point here.
I'm not convinced that the thought experiment disproves Landauer (but it's very interesting - thank you, AC). I'm not an expert in this area, though - I'd be very interested if someone with deeper specific knowledge could enlighten us.
Re: (Score:3)
Yes, the memory will absorb heat, but that heat comes at the expense of the hotter room. You have to consider the total energy of a closed system, and in your naïve approach the best you can get is a net-neutral energy balance. The argument is primarily about the fundamental increase of entropy associated with erasing a bit, and thermal equilibration (between a hot and a cold object) definitely represents an increase in entropy.
Re:What if I store bits as heat? (Score:4, Insightful)
Your example only erases in one direction.
To be correct, your experiment must complete erasure in both directions (0 to 1, 1 to 0).
As such, I think you'll find going the other direction is radically more difficult to get energy neutral since you'll be trying to keep thermal bleed from happening whilst flipping your bit.
This gets to the core of the subject (Score:1)
Re: (Score:2)
So to say that it releases heat is ridiculous, it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.
You're misunderstanding the statement. Actually flipping the bit releases heat. Doing the work required to flip the bit also involves the generation of heat, but that heat isn't flipping the bit, and therefore it's not CONSUMED in the process of flipping the bit, just WASTED.
Re: (Score:1)
it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.
What? Scientific models of how our Sun works exist, as do measurements of the heat from it. They are big. I strongly doubt any human-made computing system, however old or inefficient, has yet gotten anywhere near our Sun.
Could you be writing from some amazingly dangerous alternative universe with a massive energy cost on information? Or am I the victim of a sophisticated troll employing meaningless hyperbole?
answer (Score:1)
A little bit of heat
Slow erasure? (Score:3)
Does this suggest that energy could be saved and the Landauer limit approached by saving up erasures to be done more slowly, perhaps by flipping bits to 0 near the time when they are flipped to 1? Also, are there architectures in which flipping a bit in one direction uses less power, or in which blocks of bits can be deselected by some pointer instead of actually erased, trading memory hardware space for power usage?
Unpossible! (Score:3)
By monitoring the position [AND] speed of the particle...
Unpossible! Measure one or the other, but not both...
Re: (Score:2)
>>Unpossible! Measure one or the other, but not both...
Well, kinda.
Hot smoke? (Score:1)
"IBM Scientists Measure the Heat Emitted From Erasing a Single Bit"
All of this seems like a bunch of hot smoke to me. Can't these scientists find something better to do with their pay?
Finally! Some evidence that Dark Energy = entropy (Score:2)
There are rational objections to this proposal. Landauer's principle is really an expression of entropy in information systems -- which can be mathematically modeled as though they were thermodynamic systems. It's a bold claim to say ...
designing circuits around this theory? (Score:2)
Back in the Uni library, I once had an old ('60s?) book in my hands which stated that for every logical AND circuit, combining two '1' bits would also produce heat. The author suggested designing AND circuits so that they would have two outputs: the logical outcome and the overflow 'exhaust', both connected to the rest of the circuitry. This would be used to keep the processor from generating heat, but might also have more practical, logical uses. (He probably said similar things for other kinds of circuits.)
Boil the oceans (Score:2)
I remember reading in Bunnie Huang's book on Hacking the Xbox that a computer just enumerating 2**256 (let alone doing anything useful) would require enough power to boil the oceans.
Maybe it wasn't 256, but it was related to cryptography.
Quantum ramifications... (Score:1)
I thought the energy to flip a bit had already been measured in quantum computing devices, since it tends to cause decoherence?
If not, then it should :-)
Re:one thing we know for sure (Score:4, Funny)
Wonder how much heat is dissipated when you mod a post down?
Re: (Score:3)
Wonder how much heat is dissipated when you mod a post down?
Less than the heat that is saved by not displaying the down-modded post in millions of basements all over the world.
Re:What a very very stupid test (Score:5, Informative)
Re: (Score:2, Insightful)
It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the total energy of the Earth, our solar system, or our galaxy. Using that estimate, we can put an upper bound on the maximum amount of computational power we have at our disposal: if a certain problem is shown to require X calculational complexity, and X exceeds the amount of disposable energy in our solar system, then X is incalculable given current technology.
Now,
Re: (Score:1)
It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the total energy of the Earth, our solar system, or our galaxy. Using that estimate, we can put an upper bound on the maximum amount of computational power we have at our disposal: if a certain problem is shown to require X calculational complexity, and X exceeds the amount of disposable energy in our solar system, then X is incalculable given current technology.
Now, let X be some sort of encryption complexity. Now do you see how it could be useful?
All the more reason to buy a five dollar wrench [xkcd.com].
Re: (Score:2)
Or, in reverse:
Cool down the disk to a point where you can measure the temperature changes really well. Now start the encryption. How much information does the change in temperature of the disk (or SSD, or RAM) give you? Could be interesting.
Re: (Score:2)
Which require you to put strictly more energy to prepare reagents for the reaction than would be consumed by the reaction.
Re: (Score:2, Insightful)
Oh, it has a law on Wikipedia, so it must be a waste of time to test or verify it then! Seriously, have a read about how science works before attempting to comment again. A "law" in science is not like a legal law - i.e., it is not a fact merely by self-assertion (a legal law is a law because lawmakers say so). Scientific "laws" require testing and proof; they often require refinement in the details. Scientific "laws" do not exist as abstract facts about the universe - they are human attempts to model the universe from observation.
Re: (Score:2)
I'm not attempting to challenge the "laws" of thermodynamics - my guess would be that we have the broad picture right (we have a lot of evidence in favour), but again, given the history of science I would be surprised if every detail of taught theory in that area survives the next few hundred years without some modification.
Having the broad picture right just means you have a working model, though. It doesn't mean you've actually discovered how the universe works, just that you can make accurate predictions. Maybe later it turns out that what happens, happens for a totally different reason than what you thought.
Re: (Score:2)
Having the broad picture right just means you have a working model, though. It doesn't mean you've actually discovered how the universe works, just that you can make accurate predictions. Maybe later it turns out that what happens, happens for a totally different reason than what you thought.
Science is all about making predictions, and not about discovering how anything works (formally, anyhow). Or as a physics professor put it: "There are no particles, only clicks in my Geiger counter".
Re: (Score:2)
You are thinking of scientific theories or hypotheses. Scientific laws are based on observations, but they are not proven. In fact, they are the assumptions and axioms upon which proofs are built.