Progress In Algorithms Beats Moore's Law
Relic of the Future writes "Seen on the blog 'Algorithmic Game Theory': a report to Congress and the President about past and future advances in information technology notes that, while improvements in hardware accounted for an approximately 1,000-fold increase in calculation speed over a 15-year span, improvements in algorithms accounted for an over 43,000-fold increase."
Re:But we made up in ... (Score:5, Insightful)
On the flip side, people tend to be turned off when they see what my screen looks like. It has gotten to the point where I do not mention that this is "Linux," because I do not want new users to get scared. In the end, looking pretty winds up being more important than how efficiently you use computational resources.
Re:Not so much (Score:4, Insightful)
And in fact, the article cites just one specific problem where algorithmic improvements made it about 43,000 times faster, on one specific input.
I think a more general issue is citing a constant as the speedup from an algorithmic improvement. Better algorithms usually improve asymptotic efficiency, i.e. how running time grows with input size, rather than constant factors. If I asked you how much of an improvement switching from selection sort to merge sort would be, you would not tell me "it is 1,000 times faster," because that is only true at one particular input size.
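To make that concrete, here is a minimal Python sketch (mine, not from the article) that compares the two sorts by comparison count, using the standard textbook approximations of n*(n-1)/2 for selection sort and roughly n*log2(n) for merge sort:

import math

def selection_sort_comparisons(n):
    # Selection sort always performs n*(n-1)/2 comparisons,
    # regardless of input order.
    return n * (n - 1) // 2

def merge_sort_comparisons(n):
    # Merge sort performs on the order of n*log2(n) comparisons
    # (textbook upper bound; close enough for this illustration).
    return int(n * math.log2(n)) if n > 1 else 0

for n in (100, 10_000, 1_000_000):
    ratio = selection_sort_comparisons(n) / merge_sort_comparisons(n)
    print(f"n={n:>9,}: selection/merge ratio ~ {ratio:,.0f}x")

The "speedup" comes out to roughly 7x at n=100, about 380x at n=10,000, and over 25,000x at n=1,000,000. The same algorithmic change is worth a couple of Moore's-Law doublings at small inputs and thousands of them at large ones, which is exactly why quoting a single constant is misleading.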
Moral of the story... (Score:3, Insightful)
OPTIMIZE YOUR SHIT.
Because tons of hardware is useless if you can't push it to its limits with efficient code and algorithms.
Game and OS designers, I'm looking right at you.
Re:Not so much (Score:4, Insightful)
The grandparent didn't say "clock speed doubles". It said "processing power doubles". In the last few years those have been very different things; in some cases processing power has increased while clock speeds have actually dropped.