IBM Scientists Measure the Heat Emitted From Erasing a Single Bit

ananyo writes "In 1961, IBM physicist Rolf Landauer argued that resetting one bit of information — say, setting a binary digit to zero in a computer memory regardless of whether it is initially 1 or 0 — must release a certain minimum amount of heat, proportional to the ambient temperature. New work has now finally confirmed that Landauer was right. To test the principle, the researchers created a simple two-state bit: a single microscopic silica bead held in a 'light trap' by a laser beam. (Abstract) The trap contains two 'valleys' where the particle can rest, one representing a 1 and the other a 0. It could jump between the two if the energy 'hill' separating them is not too high. The researchers could control this height by changing the power of the laser, and could 'tilt' the two valleys to tip the bead into one of them by moving the physical cell containing the bead slightly out of the laser's focus. By monitoring the position and speed of the particle during a cycle of switching and resetting the bit, they could calculate how much energy was dissipated."
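For anyone who wants to play with the idea, here is a toy numerical sketch of such an erasure cycle (an overdamped Langevin model in reduced units; the protocol and parameters are illustrative, not those of the paper):

    # Toy model of the experiment: a Brownian particle in a double-well
    # potential V(x) = b*(x^2 - 1)^2 + s*x, wells near x = -1 ("0") and
    # x = +1 ("1"). Lowering the barrier b while applying a transient
    # tilt s pushes the particle into the "0" well; the work done by the
    # moving potential is dissipated as heat. Reduced units with kT = 1.
    import numpy as np

    rng = np.random.default_rng(0)
    kT, gamma, dt, steps = 1.0, 1.0, 2e-4, 20_000

    def V(x, b, s):
        return b * (x**2 - 1)**2 + s * x

    def erase(x0, b_max=8.0, s_max=1.0):
        x, work, b0, s0 = x0, 0.0, b_max, 0.0
        for i in range(1, steps + 1):
            t = i / steps
            b = b_max * abs(1 - 2 * t)         # barrier down, then back up
            s = s_max * np.sin(np.pi * t)      # transient tilt toward x = -1
            work += V(x, b, s) - V(x, b0, s0)  # work of moving the potential
            f = -(4 * b * x * (x**2 - 1) + s)  # force = -dV/dx
            x += f * dt / gamma + np.sqrt(2 * kT * dt / gamma) * rng.normal()
            b0, s0 = b, s
        return work

    # Average over cycles starting from random bits; slower protocols
    # approach the Landauer bound kT*ln(2) ~ 0.693 from above.
    works = [erase(rng.choice([-1.0, 1.0])) for _ in range(100)]
    print(f"mean work per erasure: {np.mean(works):.3f} (bound: {np.log(2):.3f})")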
  • Was that really the spirit of what Landauer was considering?
    Why not measure it in the kind of computer memory he envisioned?

    -AI

    • Re:Spirit (Score:5, Interesting)

      by Ihmhi ( 1206036 ) <i_have_mental_health_issues@yahoo.com> on Sunday March 11, 2012 @04:37AM (#39316785)

      Yeah, it's kind of like a piece of armor being considered arrow-proof, and then you fire an arrow out of a railgun.

      I wonder how you would even measure it, though, and distinguish the heat from a bit changing from the ambient heat from drive operation.

    • Re:Spirit (Score:5, Insightful)

      by MacTO ( 1161105 ) on Sunday March 11, 2012 @04:45AM (#39316807)

      It probably does reflect the spirit of Landauer's claims. Claims like this rest on an understanding of physics, which was far more common in computing back when developing new hardware demanded it. Consider also that a variety of different techniques were used to build computer memories in those days, so his claim had to rest on the underlying physics rather than on any particular memory technology. It is therefore fair game to test it with a different physical model.

      • by msobkow ( 48369 )

        I can appreciate that. But I question the actual relevance of the results, given that the "memory technology" used doesn't resemble anything I've ever heard of being used in a production computer in 30+ years.

        The fact that energy would be needed to force a state change should have been intuitively obvious to anyone with even a Grade 12 physics education.

        • by HiThere ( 15173 )

          I've seen serious claims that "reversible computation" can be done with no energy input at all. What this doesn't cover, of course, is setting up the initial conditions, or extracting the results of the computation. One requirement is that at the end of the computation, the state of the system should be identical to the initial state.

          I must admit that I don't understand either the utility, or the feasibility, of such a system. But there have been serious claims that computation does not, itself, require energy.

          • by Raenex ( 947668 )

            I must admit that I don't understand either the utility, or the feasibility, of such a system.

            Wikipedia gives an answer [wikipedia.org]:

            "Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.

            Probably the largest motivation for the study of technologies aimed at actually implementing reversible computing is that they offer the only potential way to improve the energy efficiency of computers beyond the Landauer limit."

            • by HiThere ( 15173 )

              The thing is, reversible computation doesn't appear to allow the answer to be extracted. So it's not clear what use it is. And it hasn't actually been built, so I'm not convinced that it's feasible.

      • No kidding? You DO WORK and ENERGY IS RELEASED? Is anybody surprised to see that Landauer was right? Nobody?

        What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.

        Science publishing is not what it used to be.

          >>What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.

          >>Science publishing is not what it used to be.

          You are absolutely right. And that's why we have modern technology and, in fact, physics itself: because people began verifying obvious "facts".

    • Re:Spirit (Score:5, Insightful)

      by jpate ( 1356395 ) on Sunday March 11, 2012 @10:58AM (#39317857) Homepage
      Landauer's claim was about the relationship between entropy as used in information theory and entropy as used in thermodynamics: specifically, that entropy in information theory is identical to the entropy in thermodynamics. The scientists used this set-up so they could measure a change of exactly one bit (the information-theoretic conception of entropy) while controlling outside heat influences (the thermodynamics conception of entropy), and see if the change in information corresponded to the change in heat as predicted by thermodynamics and information theory.

      Without precisely controlling the change in information and precisely measuring the change in heat, the result is much less clear. That's why they used this methodology and equipment. Moreover, as this is empirical evidence for a very general identity between heat and information, the result will hold for computer memory as well.
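      As a back-of-the-envelope check of that identity (standard constants only; nothing here is from the paper):

          # One bit of Shannon entropy corresponds to k_B * ln(2) of
          # thermodynamic entropy, so erasing it at temperature T must
          # dissipate at least T * k_B * ln(2) of heat.
          import math

          k_B = 1.380649e-23          # Boltzmann constant, J/K
          T = 300.0                   # room temperature assumed, K

          def min_heat(bits, T):
              return bits * k_B * T * math.log(2)   # joules

          print(min_heat(1, T))       # ~2.9e-21 J per erased bit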
      • In that case, I could have used a mechanical switch to represent 0 and 1 and told you that heat was dissipated. There needs to be a little more to draw a parallel between a random experiment and computer memory.

        • by jpate ( 1356395 )
          Computer memory is a bunch of mechanical switches. The point is that they have a lot of sources of heat aside from reductions in the information content of the physical system. The researchers built a switch that was as efficient as possible so the vast majority of heat dissipation could be attributed to changes in the information content of the switch. Real computer memory will have heat dissipation due to changes in information content along with heat dissipation from such things as moving read/write heads.
          • by jpate ( 1356395 )
            additionally, the point isn't just that heat was dissipated, but rather that a specific quantity was dissipated as predicted by thermodynamics and information theory.
            • I'm going out on a limb here, not having had the time to study this stuff enough,
              but my intuition says that the unification of information theory and physics will yield a great breakthrough in physics.

              I take the view that thermodynamics and Shannon information theory are literally about the same thing exactly, not just by weak analogy.

              Related factoids:
              1. All information is embodied mutual information.
                a. It must be embodied in some local configuration of matter/energy.
                b. It must be mutual in that the information is information about some other system.

      • >>entropy in information theory is identical to the entropy in thermodynamics

        Is there a name for this law?

        Also, what does this say about the reality of information itself?

        • Re:Spirit (Score:5, Insightful)

          by canajin56 ( 660655 ) on Sunday March 11, 2012 @03:03PM (#39319073)

          It says that information is disorder. And thermodynamic entropy is (for some definitions of order) order as well. If you have all of the air molecules in a room compressed into the corner, maybe that's ordered? But that's one small lump of air and a whole lot of vacuum. Evenly distributed air is more ordered because it is uniform. If you let a system starting in any arbitrary corner-gas configuration (and there are a lot, since each molecule can have any number of different values describing it) progress for X amount of time, you find that you have almost certainly ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and progress for X amount of time, you will almost certainly still be in an even-gas configuration.

          This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are like frictionless billiard balls, as physicists are wont to do). But it's not. If you take some specific corner-gas start A and run it for X time, you will (probably) have an even-gas configuration B. If you take B, reverse the velocity of all molecules, and run it for X time again, you will be back at A (again, assuming molecules are frictionless billiard balls). But with discrete space and velocity you can count the possible velocity and position vectors, and there are a LOT more even-gas configurations than corner-gas configurations. So, with a tiny room and only a few molecules, you can compute the chance that after X time starting at even-gas you end up at corner-gas, and even for very small systems it's basically 0.

          Entropy is the concept of changes to a system that are not reversible, not because of laws of PHYSICS but laws of STATISTICS. The second law is the observation that, by statistics, you will tend to a uniform (ordered) system because there are a lot of ways to go in that direction, and very few ways to go the other way.

          Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, for our purposes subatomic particles are frictionless billiard balls, so even things like the bead-trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you count the possible ordered sequences of X bit flips, there are 32^X of them. If you start at 0, almost all of those flip sequences take you to a pretty chaotic state; but if you start from a random state, almost none of those same flip sequences get you to 0. So treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes.

          What Landauer did is assume a maximum circuit temperature T for your memory/CPU and observe that you won't want Brownian motion breaking your system, so 0 and 1 need a minimum separation for the system to be useful at temperature T. That puts a lower bound on the state counts, and lets traditional thermodynamics establish a minimum energy dissipation to go from a high-entropy state to a low-entropy one (like a zeroed-out register). What information entropy adds is to take the same thing and say that the disordered information therefore has intrinsic entropy, since regardless of system design it requires a certain minimum entropy to store that information. It's avoidable if your system is reversible, which is possible if you have more ways to represent a bit pattern the more ordered that pattern is: fewer ways to store 10010101 than to store 00000000. It's also beatable if you find a way to store information non-physically. But good luck on that front.
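          The counting behind the corner-gas argument is easy to reproduce; a toy version (N and the half-room split are illustrative):

              # Place N distinguishable molecules independently in the left or
              # right half of a room. "All in one corner" is one configuration
              # out of 2^N, while near-even splits dominate overwhelmingly.
              from math import comb

              N = 100
              total = 2 ** N
              print(f"P(all N in left half) = {1 / total:.3e}")              # ~7.9e-31
              print(f"P(exactly 50/50)      = {comb(N, N // 2) / total:.3e}")  # ~8.0e-02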

          Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org], which is somewhat related, and pretty cool.

          • Fantastic. Thanks for writing this up.

          • Excellent response, thanks.

            Pretty far afield followup question: every time Work is performed, Entropy increases. Using the Landauer Principle, it seems like you could consider information processing to be a sort of Work being done, leading to a similar increase in entropy. If our conscious minds are a form of information processing engine, could consciousness be a byproduct of the Work being conducted by the information processing, which manifests itself simply as extra heat being radiated by the system?

          • by Qzukk ( 229616 )

            It's also beatable if you find a way to store information non-physically.

            I think this is what throws everyone when they think about the physics of knowledge. The vast majority of people don't realize that the physical embodiment of information must obey the laws of physics, and even many who do seem to believe knowledge ought to have some form of "soul" not shackled by physical constraints.

    • by blueg3 ( 192743 )

      It is in the spirit of what Landauer was considering. The larger question is whether information entropy and thermodynamic entropy are related.

  • To store information, you need the ability to set something into at least two possible states, one of which can be the intrinsic state. No matter what you use for storage, you'll always need energy to reach the non-intrinsic state(s), since the intrinsic state is, essentially by definition, the state achieved with no external energy applied.

    If you must add energy to enter a non-intrinsic state, it makes perfect sense that the energy would need to be dissipated to return to the intrinsic state (which equates to erasing the stored information).

    • Re: (Score:3, Informative)

      by epte ( 949662 )

      Say you have two valleys named 0 and 1, and a mountain between. Setting our bit by rolling a ball from 0 to 1 would require energy expenditure, but once the ball is in the valley it is stable and won't roll out again without further input. 0 and 1 may be at different heights relative to each other, but need not be. They might even be at the same altitude. But if 1 were higher than 0, then yes, you would be storing energy in some sort of potential energy form, and may be able to recover that energy when resetting the bit to 0.

    • by FrangoAssado ( 561740 ) on Sunday March 11, 2012 @09:13AM (#39317453)

      It's theoretically possible to change the state of a bit without spending energy. Here's a dumb example: think of a closed system (so no energy is being gained or lost) consisting of a box filled with oxygen and only one molecule of water. Divide the box in two halves and say a bit is "0" if the molecule of water is in the left half and "1" if it's in the right half. If you wait a while, eventually the bit will flip with absolutely no change in energy. That's a dumb example, but it shows that there's nothing that requires an "intrinsic state" and energy loss when you move away from it, like you described.

      The only time energy dissipation is unavoidable (in theory) is when you erase information. That's a strange concept because, usually, we don't think about "conservation of information" in the same sense of conservation of energy, but there's a relation [wikipedia.org]. A little more discussion with more relevance to computing can be found here: http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org].
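      One concrete way to see the distinction (a sketch; the helper name preimages is made up for illustration): flipping a bit is a bijection on states, while erasing maps two states onto one, and only the latter is what Landauer's bound charges for.

          # Count preimages of a state transformation: a bijection (NOT)
          # loses nothing and is reversible; "reset to 0" is 2-to-1 and
          # destroys one bit, the case Landauer's bound applies to.
          def preimages(f, states):
              table = {}
              for s in states:
                  table.setdefault(f(s), []).append(s)
              return table

          bits = [0, 1]
          print(preimages(lambda b: 1 - b, bits))  # {1: [0], 0: [1]}: invertible
          print(preimages(lambda b: 0, bits))      # {0: [0, 1]}: two states merge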

      • Would this be true at absolute zero? No? Then the system is probably using some heat.
        • by jo_ham ( 604554 )

          Yes, because of the zero point energy, since we're using a molecule. The bond has a minimum vibrational energy of 1/2 h*nu when the vibrational quantum number is 0 (ground state), so even when the temperature is 0 K, the bond still has energy and the molecule will still move around.
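          For scale, a rough number, assuming a typical molecular vibration around 3600 cm^-1 (roughly an O-H stretch; the mode is an assumption, not from the comment above):

              # Zero-point energy E = (1/2) * h * nu for an assumed 3600 cm^-1 mode.
              h = 6.62607015e-34        # Planck constant, J*s
              c = 2.998e10              # speed of light, cm/s
              nu = 3600 * c             # wavenumber -> frequency, Hz
              print(0.5 * h * nu)       # ~3.6e-20 J, present even at 0 K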

      • by Polo ( 30659 ) *

        I was going to post something about reversible computing. I found it an interesting concept when I read that Richard Feynman did some work in computation and was a proponent of it. As far as I can tell, the idea was largely ignored.

        I think reversible computing would not only be more energy efficient, but from what I understand might make for some interesting debugging, because I think you could run the program counter backward to an error.
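        A minimal sketch of that backward-stepping idea (the instruction set here is hypothetical; XOR-assignment and swap are chosen because each is its own inverse):

            # If every instruction has a known inverse, a failed run can be
            # stepped backward from the crash site toward the bad state.
            def step(regs, op):
                kind, a, b = op
                if kind == "xor":        # regs[a] ^= regs[b]; self-inverse
                    regs[a] ^= regs[b]
                elif kind == "swap":     # self-inverse
                    regs[a], regs[b] = regs[b], regs[a]

            program = [("xor", 0, 1), ("swap", 1, 2), ("xor", 2, 0)]
            regs = [5, 3, 9]
            for op in program:           # run forward
                step(regs, op)
            for op in reversed(program): # run the program "backward"
                step(regs, op)
            print(regs)                  # [5, 3, 9]: initial state recovered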

      • Changing the state of a bit is not necessarily the same as storing information. To be used for information storage, the system can only move between valid states through external stimuli. If it changes to a different state without external stimuli, then it either doesn't store information or the states are not defined correctly.

        The whole point of storing something is to have it maintain its state. If an item is not maintaining a single state, then it's not storing information. And if the item is maintaining a single state, then changing that state requires external stimuli.

        • You're thinking in terms of storing information the way a normal (irreversible) computer does. Not all computation must be done that way; I was describing a specific way that's not like that. Imagine that in the system of my (dumb, as I said) example the problem being calculated was, conveniently, the equivalent of "in which side of the box will the water molecule be after 3 days". In this case, I have to spend no energy at all to compute that, assuming the box is perfectly isolated from the environment.


          • Since we're discussing information storage rather than calculations (certainly the two are related, but not the same), then per your example the information storage act would require energy to place the water molecule into the box in the first place. If you ignore that by assuming the molecule is already there, then you haven't stored anything and are simply in the intrinsic state of the box, like I discussed originally. A computation with no controlled inputs yields no information; it's just nature running its course.

            • Perhaps you are thinking of this in a purely theoretical sense. In that case, yes: if you can harvest 100% of the energy stored when changing a value, then no additional energy is required.

              That's the whole point of IBM's experiment and Landauer's principle: even in a purely theoretical sense, if you erase information when you're changing the state of a bit, you necessarily spend a minimum amount of energy. You can't, even theoretically, harvest 100% of the energy back. I was showing that there are other useful ways to change the state of a bit (e.g. in reversible computing) that do not incur this purely theoretical energy cost, where you could theoretically harvest 100% of the energy back.

      • What you are missing is that in the quantum world one would need to keep checking whether the bit has flipped. That is where the energy would go.
        • You don't need to keep checking whether the bit has flipped. In fact, look at the most "quantum world" example possible: the usual way to define a quantum computer uses reversible computing (because quantum logic gates are reversible [wikipedia.org]).

          • Thank you for the reference. In a quantum computer the energy would be spent during the measurement step, which is required to extract information back into our digital world.
            • Sure. But the whole point is that the *computation* itself doesn't need to spend energy[1]. The only time you have to spend energy, in principle, is when preparing the computation and when measuring -- the computation itself could run for years and no energy would be necessary (if the system is sufficiently isolated from the environment). In contrast, with "normal" irreversible computation, every time you irreversibly flip a bit (e.g., when you apply an AND gate) you must spend energy.

              [1] in the "quantum world" sense, at least.

  • by global_diffusion ( 540737 ) on Sunday March 11, 2012 @04:49AM (#39316815) Homepage
    I mean, this is demanded by Maxwell's demon, right? You need to expend energy to store information in order to not violate the 2nd law of thermodynamics. Awesome that they measured it, for sure.
    • by Anonymous Coward

      It is confirmation that scientists and anyone working to promote information are enemies of the universe. Processing information brings forward the end of the universe.

      • by MaskedSlacker ( 911878 ) on Sunday March 11, 2012 @05:11AM (#39316873)

        Quick, you'd better stop thinking to slow the process down.

        • Way ahead of you. Stopped this sometime ago. Also took out a couple of scientists - and burnt them, so that will help.

          (note contents of message may contain unexamined irony)

    • by Anonymous Coward on Sunday March 11, 2012 @05:27AM (#39316923)

      Specifically, the Landauer limit is E = kT ln 2, the minimal energy needed to erase a single bit. The interesting thing is that 10^20 bit operations per second works out to only about a watt. This means that the efficiency of today's computers is just 0.00001%. More details at http://tikalon.com/blog/blog.php?article=2011/Landauer.
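      Those numbers are easy to check, assuming room temperature:

          # Landauer limit per bit, and the power for 10^20 bit ops per second.
          import math

          k_B, T = 1.380649e-23, 300.0    # J/K, room temperature assumed
          E_bit = k_B * T * math.log(2)
          print(E_bit)                    # ~2.9e-21 J per erased bit
          print(E_bit * 1e20)             # ~0.3 W, i.e. of order a watt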

      • by Kjella ( 173770 ) on Sunday March 11, 2012 @10:11AM (#39317649) Homepage

        Not really that surprising: a silicon atom is about 0.11nm across and the lattice spacing in a silicon crystal is 0.54nm, which is still way smaller than the 32nm processors he's talking about. I don't know how many electrons flow down each 32nm path, but they're between 0.1nm and 0.000006nm in diameter depending on what model you use - quantum mechanics makes a mess of this anyway - so it's way more than one. If you want single-electron calculations you'll have single-electron signals: one quantum event and your signal is lost. So the limit is likely to remain a very theoretical one.

        The other thing is that this only includes the operation itself, no clock cycle, no instruction pointer, no caching, prefetching, branching, this is the ideal you could get out of a fixed-function ASIC that only does one thing, not even as programmable as a GPU shader. We already know that there's a significant gain to that, but even supercomputers aren't built that specifically to the task. Formulas must be tweaked, models adjusted, parts must be able to be used in many computers. We've already seen that a GPGPU can beat a CPU by far on some tasks, but even they aren't close to such an ideal.

        If you think about this in encryption terms it's not that much... it says you can at most improve 23-24 bits. In encryption, many have used the Landauer limit to "prove" there's not enough energy to break a 256-bit cipher by brute force. In some places I don't think it's that relevant either; on mobile, for example, I think the energy involved in bandwidth use will be more significant. Want to stream an HD movie? It's not the decoding that kills the battery, it's the 3G/4G data connection. Just like cameras get better, but good optics still isn't small, light or cheap.
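        The brute-force arithmetic is worth seeing once. Rough figures, and generously charging only one minimal bit operation per key tried:

            # Minimum energy to step a counter through all 2^256 keys at the
            # Landauer limit, compared with the Sun's total output.
            import math

            k_B, T = 1.380649e-23, 300.0            # J/K, room temperature assumed
            E_bit = k_B * T * math.log(2)            # Landauer limit per bit flip
            E_total = E_bit * 2**256                 # lower bound for a full sweep
            sun_power = 3.8e26                       # W, approximate solar luminosity
            years = E_total / sun_power / 3.15e7     # ~3.15e7 seconds per year
            print(f"{years:.1e} years of the Sun's entire output")  # ~3e22 years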

    • Sure, Leonard Susskind talks about computer memory and entropy in a recent YouTube video, as part of a lecture on the holographic principle and the total amount of information in a system. OK, this is sort of a duh moment, but I suppose it's good science to test it anyway.
  • I'm trying to post some whitespace to decrease the temperature in here, but the lameness filter keeps getting in the way!

    • by Anonymous Coward

      Are you sure you weren't trying to post some lameness, but the whitespace filter kept getting in your way?

  • by Anonymous Coward

    Let 0s be room temperature and let 1s be somewhat below room temperature. Then to erase the memory I expose it to the room. As it erases the memory will absorb some heat from the room instead of releasing heat.

    Not really a practical form of computer memory, but seems sufficient to disprove Landauer.

    • Interesting.

      That the mechanism you select absorbs heat does not mean that no heat was released. Possibly a small amount of heat was released by the state change whilst the mechanism also absorbed heat. It is not the net effect (overall heat absorbed) that is the critical point here.

      Not convinced that the thought experiment disproves Landauer (but very interesting - thank you, AC). I am not an expert in this area, though - I'd be very interested if someone with deeper specific knowledge could enlighten us.

    • Let 0s be room temperature and let 1s be somewhat below room temperature.

      Yes, the memory will absorb heat, but that heat comes from the hot room. You have to consider the total energy of a closed system, and in your naïve approach the best you can get is a net-neutral energy balance. The argument is primarily about the fundamental increase of entropy associated with erasing a bit, and thermal equilibration (between a hot and a cold object) definitely represents an increase in entropy.
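      In numbers (values purely illustrative):

          # Letting heat Q flow from a hot room into a colder memory raises
          # total entropy, so the "cold bits absorb heat" trick doesn't beat
          # the second law; it just spends the room/memory temperature gap.
          Q, T_hot, T_cold = 1e-21, 300.0, 250.0   # joules, kelvin
          dS = Q / T_cold - Q / T_hot              # total entropy change, J/K
          print(dS > 0, dS)                        # True: the process is irreversible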

    • by Mr Thinly Sliced ( 73041 ) on Sunday March 11, 2012 @07:48AM (#39317249) Journal

      Your example only erases in one direction.

      To be correct, your experiment must complete erasure in both directions (0 to 1, 1 to 0).

      As such, I think you'll find going the other direction is radically more difficult to get energy neutral since you'll be trying to keep thermal bleed from happening whilst flipping your bit.

    • by jbengt ( 874751 )
      By your own example, it took energy to erase the bit; the energy just came from the pre-erasure difference in temperature between the bits and the environment. And the end result of the erasure in your example is an increase in entropy for the (presumably closed) system of the room plus the bits. So, no, your example does not come close to disproving Landauer.
  • So can we start blaming Google for more global warming yet, switching all those bits?
  • In 1961, resetting a bit involved passing a huge current through the wires surrounding a toroidal core which represented one memory bit. So to say that it releases heat is ridiculous; it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.
    • So to say that it releases heat is ridiculous; it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

      You're misunderstanding the statement. Actually flipping the bit releases heat. Doing the work required to flip the bit also involves the generation of heat, but that heat isn't flipping the bit, and therefore it's not CONSUMED in the process of flipping the bit, just WASTED.

    • it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

      What? Scientific models of how our Sun works exist, and so do measurements of the heat from it. They are big. I strongly doubt any old or inefficient human-made computing system has yet gotten anywhere near our Sun.

      Could you be writing from some amazingly dangerous alternative universe with a massive energy cost on information? Or am I the victim of a sophisticated troll employing meaningless hyperbole?

  • by Anonymous Coward

    A little bit of heat

  • by mattr ( 78516 ) <mattr&telebody,com> on Sunday March 11, 2012 @09:10AM (#39317445) Homepage Journal

    Does this suggest that energy could be saved and the Landauer limit approached by saving up erasures to be done more slowly, perhaps by flipping bits to 0 near the time when they are due to be flipped to 1? Also, are there architectures in which flipping a bit in one direction uses less power, or in which blocks of bits can be deselected by some pointer instead of actually erased, trading memory hardware space for power usage?

  • by Prof.Phreak ( 584152 ) on Sunday March 11, 2012 @11:14AM (#39317935) Homepage

    By monitoring the position [AND] speed of the particle...

    Unpossible! Measure one or the other, but not both...

  • by Anonymous Coward

    "IBM Scientists Measure the Heat Emitted From Erasing a Single Bit"

    All of this seems like a bunch of hot smoke to me. Can't these scientists find something better to do with their pay?

  • So in an expanding universe there is a loss of information -- and by Landauer's principle this loss of information should release dissipated energy -- and Gough claims that this dissipated energy accounts for the dark energy component of the current standard model of the universe.

    There are rational objections to this proposal. Landauer's principle is really an expression of entropy in information systems -- which can be mathematically modeled as though they were thermodynamic systems. It's a bold claim to say that this modeling analogy is physically real.

  • Back in the Uni library, I once had an old ('60s?) book in my hands which stated that for every logical AND circuit, combining two '1' bits would also result in heat. The author suggested designing AND circuits so that they would have two results: the logical outcome, and the overflow 'exhaust', both connected to the rest of the circuitry. This would be used to keep the processor from generating heat, but might also have more practical, logical uses. (He probably said similar things for other kinds of circuits.)

  • I remember reading in Bunnie Huang's book on Hacking the Xbox that a computer just enumerating 2**256 (let alone doing anything useful) would require enough power to boil the oceans.

    Maybe it wasn't 256, but it was related to cryptography.

  • I thought the energy to flip a bit was already measured in Quantum Computing devices as it tends to cause de-coherence?

    If not, then it should :-)
