
IBM Scientists Measure the Heat Emitted From Erasing a Single Bit

ananyo writes "In 1961, IBM physicist Rolf Landauer argued that resetting one bit of information (say, setting a binary digit in a computer memory to zero regardless of whether it starts as 1 or 0) must release a certain minimum amount of heat, proportional to the ambient temperature. New work has now finally confirmed that Landauer was right. To test the principle, the researchers created a simple two-state bit: a single microscopic silica bead held in a 'light trap' by a laser beam. (Abstract) The trap contains two 'valleys' where the particle can rest, one representing a 1 and the other a 0. The bead can jump between the two if the energy 'hill' separating them is not too high. The researchers could control the height of this hill by changing the power of the laser, and could 'tilt' the two valleys to tip the bead into one of them by moving the physical cell containing the bead slightly out of the laser's focus. By monitoring the position and speed of the particle during a cycle of switching and resetting the bit, they could calculate how much energy was dissipated."
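
For scale (a back-of-the-envelope figure, not taken from the article): the bound Landauer predicted, k_B·T·ln 2, works out to only a few zeptojoules per bit at room temperature, which is why the measurement needs such a delicate setup. A minimal sketch, assuming T = 300 K:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed ambient temperature, K

    # Landauer's minimum heat released when erasing one bit: k_B * T * ln(2)
    E_bit = k_B * T * math.log(2)

    print(f"{E_bit:.3e} J per bit")               # ~2.871e-21 J
    print(f"{E_bit / 1e-21:.2f} zJ per bit")      # ~2.87 zJ
    print(f"{E_bit / 1.602e-19:.4f} eV per bit")  # ~0.0179 eV
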
  • Re:Spirit (Score:5, Insightful)

    by MacTO ( 1161105 ) on Sunday March 11, 2012 @04:45AM (#39316807)

    It probably reflects the spirit of Landauer's claims. Claims like this rest on an understanding of physics, which was much more common in computing back when developing new hardware demanded it. You also have to consider that a variety of different techniques were used to build computer memories in those days, so his claims had to be based on the underlying physics rather than on any particular memory technology. So it is fair game to apply different physical models to test his claims.

  • by Avoiderman ( 82105 ) on Sunday March 11, 2012 @06:03AM (#39316983)

    Oh, it has a law on Wikipedia, so it must be a waste of time to test or verify it then! Seriously, have a read about how science works before attempting to comment again. A "law" in science is not like a legal law; it is not a fact merely by assertion (a legal law is a law because lawmakers say so). Scientific "laws" require testing and evidence, and they often need their details refined. Scientific "laws" do not exist as abstract facts about the universe; they are human attempts to model the universe from the knowledge we currently have. Our limited knowledge means the details may be imperfect. A quick survey of the history of science shows that we often get them wrong.

    I'm not attempting to challenge the "laws" of thermodynamics. My guess would be that we have the broad picture right (we have a lot of evidence in favour), but given the history of science I would be surprised if every detail of the theory taught in that area survives the next few hundred years without some modification.

    Yes, the scientists doing this probably expected some heat to be measured. They were more interested in precisely how much. This is science: an ongoing process.

  • by Anonymous Coward on Sunday March 11, 2012 @07:16AM (#39317167)

    It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the Earth as a whole, our solar system, or the galaxy. Combined with a minimum energy cost per operation, that estimate puts an upper bound on the amount of computation we have at our disposal: if a certain problem is shown to require X operations, and the energy needed for X operations exceeds the energy available in our solar system, then the problem is incomputable with current technology.

    Now let X be the work required to brute-force some encryption scheme. Do you see how it could be useful?
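
    To make the point concrete (a rough sketch with illustrative numbers, not anything from the poster): even if every key tried cost only the Landauer minimum of one bit erasure, cycling through a 256-bit keyspace would dwarf the Sun's entire annual energy output. Assuming 300 K:

        import math

        k_B = 1.380649e-23                 # Boltzmann constant, J/K
        T = 300.0                          # assumed temperature, K
        landauer = k_B * T * math.log(2)   # minimum energy per bit erasure, J

        # Sun's luminosity (~3.8e26 W) times the seconds in a year (~3.15e7 s)
        solar_joules_per_year = 3.8e26 * 3.15e7   # ~1.2e34 J

        for bits in (128, 256):
            energy = landauer * 2**bits    # lower bound: one bit erasure per key tried
            years = energy / solar_joules_per_year
            print(f"{bits}-bit brute force >= {energy:.2e} J "
                  f"(~{years:.1e} years of total solar output)")

    With these numbers the 128-bit case is at least physically conceivable, while the 256-bit case is not, which is the sense in which the Landauer limit gets used in arguments about encryption.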

  • by Mr Thinly Sliced ( 73041 ) on Sunday March 11, 2012 @07:48AM (#39317249) Journal

    Your example only erases in one direction.

    To be correct, your experiment must complete erasure in both directions (0 to 1, 1 to 0).

    As such, I think you'll find that going the other direction is radically more difficult to make energy-neutral, since you'll be trying to keep thermal bleed from happening whilst flipping your bit.

  • by Kjella ( 173770 ) on Sunday March 11, 2012 @10:11AM (#39317649) Homepage

    Not really that surprising: a silicon atom is about 0.11 nm and the lattice spacing in a silicon crystal is 0.54 nm, which is still way smaller than the 32 nm processors he's talking about. I don't know how many electrons flow down each 32 nm path, but an electron is somewhere between 0.1 nm and 0.000006 nm in diameter depending on which model you use (quantum mechanics makes a mess of this anyway), so it's way more than one. If you want single-electron calculations you'll have single-electron signals: one quantum event and your signal is lost. So the limit is likely to remain a very theoretical one.

    The other thing is that this only covers the operation itself: no clock cycle, no instruction pointer, no caching, prefetching, or branching. This is the ideal you could get out of a fixed-function ASIC that does only one thing and isn't even as programmable as a GPU shader. We already know there's a significant gain to be had there, but even supercomputers aren't built that specifically for the task: formulas must be tweaked, models adjusted, and parts must be usable in many computers. We've already seen that a GPGPU can beat a CPU by far on some tasks, but even they aren't close to such an ideal.

    If you think about this in encryption terms it's not that much... it says you can improve by at most 23-24 bits, and in encryption most people have used the Landauer limit to "prove" there's not enough energy to break a 256-bit cipher by brute force. In some places I don't think it's that relevant either; in mobile, for example, I think the energy spent on the data connection will matter more. Want to stream an HD movie? It's not the decoding that kills the battery, it's the 3G/4G connection. Just like cameras keep getting better but good optics still aren't small, light, or cheap.
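
    As a sanity check on the "23-24 bits" figure (an illustrative sketch; the per-operation energy below is an assumed round number, not a measured one): the headroom is just the base-2 logarithm of the ratio between what a real chip dissipates per bit operation and the Landauer minimum.

        import math

        k_B = 1.380649e-23
        T = 300.0
        landauer = k_B * T * math.log(2)   # ~2.87e-21 J, the minimum per bit operation

        # Assumed, illustrative energy per bit operation for a circa-2012 chip;
        # the headroom figure scales with whatever value you plug in here.
        energy_per_op = 3e-14              # J (assumption)

        headroom_bits = math.log2(energy_per_op / landauer)
        print(f"~{headroom_bits:.1f} bits of headroom")   # ~23.3 with this assumption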

  • Re:Spirit (Score:5, Insightful)

    by jpate ( 1356395 ) on Sunday March 11, 2012 @10:58AM (#39317857) Homepage
    Landauer's claim was about the relationship between entropy as used in information theory and entropy as used in thermodynamics: specifically, that they are one and the same quantity. The scientists used this set-up so they could measure a change of exactly one bit (the information-theoretic conception of entropy) while controlling outside heat influences (the thermodynamic conception of entropy), and see whether the change in information corresponded to the change in heat predicted by thermodynamics and information theory.

    Without precisely controlling the change in information and precisely measuring the change in heat, the result is much less clear. That's why they used this methodology and equipment. Moreover, as this is empirical evidence for a very general identity between heat and information, the result will hold for computer memory as well.
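
    Spelling the identity out (a one-step note, not from the poster): before erasure the bit is in one of two equally likely states, and afterwards it is in one known state, so its Shannon entropy drops by exactly one bit. In thermodynamic units that is a drop of ΔS = k_B·ln 2, and the second law then requires at least Q = T·ΔS = k_B·T·ln 2 (about 2.9 × 10^-21 J, or roughly 0.018 eV, at 300 K) of heat to be released into the surroundings, which is precisely the quantity the bead experiment set out to measure.
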
  • Re:Spirit (Score:5, Insightful)

    by canajin56 ( 660655 ) on Sunday March 11, 2012 @03:03PM (#39319073)

    It says that information is disorder, and thermodynamic entropy is (for some definitions of order) order as well. If you have all of the air molecules in a room compressed into one corner, maybe that's ordered? But that's one small lump of air and a whole lot of vacuum; evenly distributed air is more ordered because it is uniform. If you let a system start in any arbitrary corner-gas configuration (and there are a lot of them, since each molecule has many different values describing it) and progress for X amount of time, you find that you have almost certainly ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and progress for X amount of time, you will almost certainly still be in an even-gas configuration.

    This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are like frictionless billiard balls, as physicists are wont to do), but it's not. If you take some specific corner-gas start A and run it for X time, you will probably have an even-gas configuration B. If you take B, reverse the velocity of every molecule, and run it for X time again, you will be back at A (again, assuming molecules are frictionless billiard balls). But with discrete space and velocity you can count the possible position and velocity vectors, and there are a LOT more even-gas configurations than corner-gas configurations. So, with a tiny room and only a few molecules, you can work out the chance that after X time starting at even-gas you end up at corner-gas, and even for very small systems it is basically zero. Entropy is the concept of changes to a system that are not reversible, not because of the laws of PHYSICS but because of the laws of STATISTICS. The second law is the observation that, statistically, you will tend toward a uniform (ordered) system, because there are a lot of ways to go in that direction and very few ways to go the other way.

    Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, for our purposes subatomic particles are frictionless billiard balls, so even things like the trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you count the possible ordered sequences of X bit flips, there are 32^X of them. If you start at 0, almost all of those flip sequences will take you to a pretty chaotic state; but if you start from a random state, almost none of those same flip sequences will get you to 0. So, treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes. What Landauer did was assume a maximum circuit temperature T for your memory/CPU and observe that, since you don't want Brownian motion breaking your system, 0 and 1 need a minimum separation for the system to be useful at temperature T. That puts a lower bound on the state counts, and traditional thermodynamics then gives a minimum energy dissipation for going from a high-entropy state to a low one (like a zeroed-out register). What information entropy does is take the same thing and say that the disordered information therefore has intrinsic entropy, since regardless of system design it requires a certain minimum entropy to store that information. It is avoidable if your system is reversible, which is possible if you have more ways to represent a bit pattern the more ordered that bit pattern is: fewer ways to store 10010101 than ways to store 00000000. It is also beatable if you find a way to store information non-physically. But good luck on that front.

    Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org], which is somewhat related, and pretty cool.
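
    A toy illustration of the statistical asymmetry described in the comment above (an illustrative sketch; the register width, flip count, and "near zero" threshold are arbitrary choices): apply the same number of random bit flips to a 32-bit register starting from all zeros and starting from a random state, and count how the trials end up.

        import random

        WIDTH = 32         # register width in bits
        FLIPS = 64         # random single-bit flips per trial (arbitrary)
        TRIALS = 100_000

        def apply_random_flips(state: int) -> int:
            """Flip FLIPS randomly chosen bits of state, one at a time."""
            for _ in range(FLIPS):
                state ^= 1 << random.randrange(WIDTH)
            return state

        def weight(state: int) -> int:
            """Hamming weight: the number of 1 bits."""
            return bin(state).count("1")

        # Direction 1: start ordered (all zeros) and let random flips act.
        end_weights = [weight(apply_random_flips(0)) for _ in range(TRIALS)]
        stayed_near_zero = sum(w <= 2 for w in end_weights)

        # Direction 2: start disordered (random) and hope to land near zero.
        landed_near_zero = sum(
            weight(apply_random_flips(random.getrandbits(WIDTH))) <= 2
            for _ in range(TRIALS)
        )

        print(f"average weight after starting at 0: {sum(end_weights) / TRIALS:.1f}")  # ~16
        print(f"trials that stayed near zero:       {stayed_near_zero}/{TRIALS}")      # ~0
        print(f"random starts that landed near 0:   {landed_near_zero}/{TRIALS}")      # ~0

    Reversing any particular flip sequence would of course take the register straight back to zero; the point is that among all equally likely flip sequences, essentially none do.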

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...