The Ultimate Limit of Moore's Law 418
BuzzSkyline writes "Physicists have found that there is an ultimate limit to the speed of calculations, regardless of any improvements in technology. According to the researchers who found the computation limit, the bound 'poses an absolute law of nature, just like the speed of light.' While many experts expect technological limits to kick in eventually, engineers always seem to find ways around such roadblocks. If the physicists are right, though, no technology could ever beat the ultimate limit they've calculated — which is about 10^16 times faster than today's fastest machines. At the current Moore's Law pace, computational speeds will hit the wall in 75 to 80 years. A paper describing the analysis, which relies on thermodynamics, quantum mechanics, and information theory, appeared in a recent issue of Physical Review Letters (abstract here)."
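The 75-to-80-year figure in the summary is easy to sanity-check: if speed doubles every 18 months (one common statement of Moore's Law; the doubling period here is an assumption), then a 10^16-fold speedup takes about log2(10^16) ≈ 53 doublings. A minimal sketch:

```python
import math

speedup = 1e16            # claimed gap between today's machines and the ultimate limit
doubling_years = 1.5      # assumed Moore's Law doubling period (18 months)

doublings = math.log2(speedup)          # number of doublings needed
years = doublings * doubling_years      # time at the assumed pace

print(f"{doublings:.1f} doublings -> {years:.0f} years")
```

Which lands right inside the 75-to-80-year window quoted in the summary.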
Might Prove A Vinge novel correct? (Score:3, Interesting)
This touches on the nature of computation and lightspeed as explored in the wonderful novel A Fire Upon The Deep (Zones of Thought) [amazon.com], in which the universe has "depth," and that depth determines how fast things can go, including neural tissue, computation, and intergalactic travel. I have long suspected that Earth is towards the shallow end ...
What is the limit? (Score:4, Interesting)
So what is that limit? What units would you express such a limit in? The fundamental unit of information is the bit; what is the fundamental unit of computation? Would you state the rate in "computations per second"? "Computations per second per cm^3"? "Computations per second per gram"?
I checked out the pdf of the paper, and didn't see any numerical limit stated, just equations.
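For a concrete number: the analysis builds on the Margolus-Levitin theorem, which caps a system with energy E at 2E/(πħ) elementary operations per second, so the natural unit is operations per second per joule. A rough sketch of that figure (the 1-joule energy budget is just an illustrative assumption):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
E = 1.0                  # assumed energy budget: 1 joule

# Margolus-Levitin bound: maximum operations per second for energy E
ops_per_second = 2 * E / (math.pi * hbar)

print(f"{ops_per_second:.2e} ops/s per joule")
```

That works out to roughly 6 x 10^33 operations per second per joule, which is the kind of number the paper's equations bottom out in.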
Reminds me of a joke (Score:5, Interesting)
A scientist and an engineer are led into a room. They are asked to stand on one side. On the opposite side is treasure (or delicious cake, if you please).
They are told that they may have the prize if they can reach it; however, each move may cover no more than half the remaining distance between them and it.
The scientist balks, claiming it is obviously impossible since he can NEVER reach the prize, and leaves the room. The engineer shrugs, walks halfway to the prize ten times or so, says "close enough," and takes it.
So I guess we'll just see, eh?
Electricity cost comes first... (Score:4, Interesting)
At the current rate of progress, so to speak, no one will be able to afford to run a computer that is 10^16 times faster than current systems. Even as a gamer, I'm already leery of buying any of the newer video cards and CPU setups after comparing the yearly electricity cost against my existing system: they use somewhere around 4 times the electricity!
I can understand fitting more transistors onto a chip, and more chips into a system, but even with nanotech and similar technologies, I don't see much chance of each transistor using proportionally less electricity so that 10^16 times as many could run at once. You'd have to run a power cable to the sun to get that kind of energy.
Ryan Fenton
Anyone else get the feeling... (Score:3, Interesting)
that the ultimate limit is the processes that the universe itself uses to "compute" its own state? That we can only ever asymptotically approach this limit? Once we hit the limit, our computations cease being simulations and become reality.
Re:Efficiency (Score:3, Interesting)
So we'll have to wait another 75 years before management lets us focus on application efficiency instead of throwing hardware at the performance problems? Sigh...
No, you still won't be doing performance optimizations if that's not what makes the most money...
Re:What is the limit? (Score:5, Interesting)
A more practical question: how many bits does my encryption key need now to make brute force cracking impractical for the fastest computer possible in this Universe (i.e probability of finding the key within my remaining lifespan 0.0001% (1 in a million))?
And not involving a system that reduces my lifespan, such as one failed attempt kills me, smart-ass.
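One back-of-envelope way to frame it (a sketch, not a definitive answer; the guess rate, remaining lifespan, and safety factor are all assumptions): take an absurdly generous guessing rate for the "fastest possible" machine, multiply by the seconds in a remaining lifetime, and ask how many bits the key needs so the attacker can cover at most a one-in-a-million fraction of the keyspace.

```python
import math

ops_per_second = 1e50        # assumed: absurdly generous rate for the fastest possible machine
lifetime_years = 60          # assumed remaining lifespan
max_probability = 1e-6       # want < 1-in-a-million chance of success

seconds = lifetime_years * 365.25 * 24 * 3600
total_guesses = ops_per_second * seconds

# Need 2^bits * max_probability > total_guesses, i.e. the attacker can
# cover at most a 1e-6 fraction of the keyspace in a lifetime.
bits_needed = math.log2(total_guesses / max_probability)

print(f"{bits_needed:.0f} bits")
```

Even under these deliberately extreme assumptions the answer comes out near 217 bits, so a standard 256-bit symmetric key already leaves a comfortable margin.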
Re:No growth can go on forever (Score:3, Interesting)
And no exponential growth can go on for more than a comparatively short time. This should be self-evident, but for some reason people seem to ignore it. Especially people who call themselves journalists or economists.
As far as we know, the expansion of the universe and the increase of entropy will go on forever.
I doubt that is what you mean though...
Self-contained systems do have limits, unless of course they are self-recursive and holographic.
Like fractals and information...
Economies and ecosystems are not self-contained.
Constrained Freedom (Score:3, Interesting)
per TFabstract: "errors that appear as a result of the interaction of the information-carrying system with uncontrolled degrees of freedom must be corrected."
Wouldn't quantum teleportation via entanglement provide a means of distributing computation, including massively parallel computation? Quantum teleportation would impose a constraint that redefines the problem by redefining the environment (i.e., the uncontrolled degrees of freedom). Replace Moore's Law with Bell's Theorem.
And doesn't quantum computing operate on all possible states at once, with the answer inherent in the wave function? Spew out the entangled qubits as needed and let them fight it out as a quantum form of Swarm.
If a result can be obtained this way, you may still have a problem with simultaneity: the answer may arrive "before" the question, making it impossible to decode. But then the problem becomes a limitation of spacetime's ability to pass definitive information, and the limit of computation itself (if such a limit exists, and if it can be measured in this context) becomes moot. Error tracing via backtracking is hampered for the same reason, but would still be possible post hoc.
But if a computational system is devised that operates on such principles, and it is to be used for practical calculations, be aware that defining the arguments will be restricted to the input end, and results for comparison and decision-making may not yet be available when those decisions are made (assuming a reasonable latitude of autonomous action). In which case, make sure you teach it phenomenology *before* putting it to work.
Re:The problem is... (Score:4, Interesting)
As countless such laments throughout recorded history have shown, worries about the intellectual demise of the youth are greatly overblown.
Re:What is the limit? (Score:3, Interesting)
The Von Neumann-Landauer limit [wikipedia.org] says that each bit of information lost requires at least ln(2)*kT of energy, which is released as heat. Assuming that, as a minimum, it is necessary to flip through all the bits in the key to brute-force it, a 512-bit key requires 2^512 - 1 bit flips, corresponding to an energy of about 4*10^133 Joules [google.co.uk] at room temperature, or around 6*10^121 Joules [google.co.uk] at the coldest temperature yet achieved, 450 picokelvin. Given a universe of mass 1.6*10^55 kg [hypertextbook.com], Einstein's relation says that if this were entirely converted to energy, only about 1.4*10^72 Joules [google.co.uk] would be available. So even converting the entire universe to energy and using it to run your computer, you'd still fall short of the energy required to brute-force that 512-bit key, by a factor of about 10^49 even at the colder temperature.
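The arithmetic here checks out; a quick sketch reproducing each figure (the constants are standard, and the 1.6*10^55 kg universe mass is the parent's cited value, taken as an assumption):

```python
import math

k = 1.380649e-23              # Boltzmann constant, J/K
c = 2.998e8                   # speed of light, m/s

flips = 2.0**512 - 1          # bit flips to sweep a 512-bit keyspace

e_room = math.log(2) * k * 300 * flips       # Landauer cost at room temperature (~4e133 J)
e_cold = math.log(2) * k * 450e-12 * flips   # Landauer cost at 450 pK (~6e121 J)
e_universe = 1.6e55 * c**2                   # E = mc^2 for the whole universe (~1.4e72 J)

print(f"room: {e_room:.1e} J, cold: {e_cold:.1e} J, universe: {e_universe:.1e} J")
print(f"shortfall even at 450 pK: {e_cold / e_universe:.0e}x")
```

The ratio at 450 pK comes out around 4*10^49, matching the parent's factor of 10^49.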
Re:Transistors Per IC and Planck Time (Score:4, Interesting)
Planck time lives in just four dimensions. Imagine opening up, say, the sixth dimension, figuring out the vectors for the next WoW move, then plugging them into real time in the four we live in.
Information exists in so many states at once. Von Neumann liked to talk about Hilbert space... I think that's where my data lives.