Progress In Algorithms Beats Moore's Law

Relic of the Future writes "Seen on the blog 'Algorithmic Game Theory,' a report to Congress and the President about past and future advances in information technology notes that, while improvements in hardware accounted for an approximately 1,000-fold increase in calculation speed over a 15-year span, improvements in algorithms accounted for a more than 43,000-fold increase."
  • Not so much (Score:5, Insightful)

    by kasperd ( 592156 ) on Friday December 24, 2010 @08:58PM (#34663122) Homepage Journal
    When hardware gets faster, the improvement applies more or less uniformly to every algorithm you run on that hardware. When an algorithm is improved, the gain applies to just one specific problem. For some problems we already knew close-to-optimal algorithms 15 years ago; in those cases there was no way algorithmic improvements could have achieved such a speedup. In fact, the article mentions just one specific problem where algorithmic improvements made it about 43,000 times faster on one specific input. In other words, this number is not nearly as general as the summary makes it sound. There are algorithms that were not sped up nearly as much, and for any algorithm that was bad enough to begin with, there could have been an even larger speedup without anybody making a fuss about it.
  • 45 years later (Score:4, Insightful)

    by zeroRenegade ( 1475839 ) on Friday December 24, 2010 @09:04PM (#34663154)
    Moore's Law is based more on economics than on technology. Moore was just a hands-on executive who had a clear vision of what the future would bring; anybody who has actually read his original paper would realize this. The fact that his theory has held up this long is remarkable.
  • by betterunixthanunix ( 980855 ) on Friday December 24, 2010 @09:20PM (#34663210)
    Well, the real question is: are users better off as a result? Personally, I found that most of the computationally intensive graphics were not actually helping me get my work done any faster. I switched to a more minimal window manager and did all I could to reduce the amount of CPU time spent rendering the GUI (disabling special effects wherever I saw them), and discovered that I was indeed correct: none of those effects were actually helping me get my work done.

    On the flip side, people tend to be turned off when they see what my screen looks like. It has gotten to the point where I do not mention that this is "Linux," because I do not want new users to get scared. In the end, looking pretty winds up being more important than how efficiently you use computational resources.
  • Re:Not so much (Score:4, Insightful)

    by betterunixthanunix ( 980855 ) on Friday December 24, 2010 @09:34PM (#34663268)

    In fact, the article mentions just one specific problem where algorithmic improvements made it about 43,000 times faster on one specific input.

    I think a more general issue is citing a constant as the speedup for an algorithm. Improvements in algorithms often mean improvements in asymptotic efficiency rather than in constant factors. If I asked you how much of an improvement switching from selection sort to merge sort would be, you would not tell me "it is 1,000 times faster," because that figure would only hold at one particular input size.
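    A minimal sketch of that point (my own illustration, not from the article), assuming textbook implementations of both sorts: the measured ratio between selection sort and merge sort grows with the input size n, so no single constant describes the speedup.

    # Rough, self-contained timing experiment (assumed names, not from the
    # thread): the selection-sort / merge-sort time ratio depends on n.
    import random
    import time

    def selection_sort(a):
        a = list(a)
        for i in range(len(a)):
            m = min(range(i, len(a)), key=a.__getitem__)
            a[i], a[m] = a[m], a[i]
        return a

    def merge_sort(a):
        if len(a) <= 1:
            return list(a)
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        out.extend(left[i:])
        out.extend(right[j:])
        return out

    for n in (1_000, 10_000):
        data = [random.random() for _ in range(n)]
        t0 = time.perf_counter()
        selection_sort(data)
        t1 = time.perf_counter()
        merge_sort(data)
        t2 = time.perf_counter()
        # The ratio is noticeably larger at n=10,000 than at n=1,000.
        print(f"n={n}: selection/merge time ratio ~ {(t1 - t0) / (t2 - t1):.0f}x")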

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Friday December 24, 2010 @09:57PM (#34663380) Homepage Journal

    OPTIMIZE YOUR SHIT.

    Because tons of hardware is useless if you can't push it to its limits with efficient code and algorithms.

    Game and OS designers, I'm looking right at you.

  • Re:Not so much (Score:4, Insightful)

    by SLi ( 132609 ) on Friday December 24, 2010 @10:11PM (#34663462)

    The grandparent didn't say "clock speed doubles"; it said "processing power doubles". In the last few years those have been very different things, with processing power increasing in some cases even as clock speeds have gone down.
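    A quick sketch of that distinction (my own illustration, not from the thread), assuming a multi-core machine: a CPU-bound job finishes faster when the work is spread across cores, even though each core's clock is unchanged.

    import math
    import time
    from multiprocessing import Pool, cpu_count

    def busy_work(n):
        # Arbitrary CPU-bound task; the work itself does not matter.
        return sum(math.sqrt(i) for i in range(n))

    if __name__ == "__main__":
        chunks = [2_000_000] * 8

        t0 = time.perf_counter()
        for c in chunks:
            busy_work(c)
        serial = time.perf_counter() - t0

        t0 = time.perf_counter()
        with Pool(cpu_count()) as pool:
            pool.map(busy_work, chunks)
        parallel = time.perf_counter() - t0

        # On a multi-core machine the parallel run is several times faster,
        # i.e. more "processing power" without a higher clock speed.
        print(f"serial: {serial:.2f}s, parallel: {parallel:.2f}s")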
