
 




Progress In Algorithms Beats Moore's Law 166

Posted by timothy
from the when-bottlenecks-swap dept.
Relic of the Future writes "Seen on the blog 'Algorithmic Game Theory,' a report to congress and the president about past and future advances in information technology notes that, while improvements in hardware accounted for an approximate 1,000 fold increase in calculation speed over a 15-year time-span, improvements in algorithms accounted for an over 43,000 fold increase."
  • 1000 fold (Score:5, Interesting)

    by MichaelSmith (789609) on Friday December 24, 2010 @07:56PM (#34663104) Homepage Journal

    That's an interesting figure, because when the Alpha came out DEC were predicting a 1,000-fold improvement in speed over about the same period. They split the improvement into three factors of ten: clock speed, parallelism within the CPU, and multiple cores were each expected to deliver a factor-of-ten improvement.

    Now, while our algorithms might be getting better, our programmers definitely are not. A lot of those improved algorithms must be in our APIs. Like the way it's easy to use a Hashtable in Java now, whereas in the past you might have just searched a linear array.
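    A minimal Java sketch of the contrast the parent describes, linear scan versus Hashtable lookup (class, method, and data names here are illustrative, not from any particular API):

```java
import java.util.Hashtable;

public class LookupDemo {
    // Linear scan over an array of key/value pairs: O(n) per lookup.
    static String findLinear(String[][] pairs, String key) {
        for (String[] p : pairs) {
            if (p[0].equals(key)) return p[1];
        }
        return null;
    }

    public static void main(String[] args) {
        String[][] pairs = {{"ada", "1843"}, {"knuth", "1938"}};

        // Hashtable: expected O(1) per lookup, same answer with no scan.
        Hashtable<String, String> table = new Hashtable<>();
        for (String[] p : pairs) table.put(p[0], p[1]);

        System.out.println(findLinear(pairs, "knuth")); // prints 1938
        System.out.println(table.get("knuth"));         // prints 1938
    }
}
```

    Same result either way; the point is that the better algorithm comes for free once the API makes it the easy default.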

  • Re:1000 fold (Score:5, Interesting)

    by jc42 (318812) on Friday December 24, 2010 @10:26PM (#34663738) Homepage Journal

    Now while our algorithms might be getting better our programmers definitely are not.

    Sometimes this is intentional. One of the more fun software competitions I've run across is to write the slowest sorting algorithm. Trivial programming tricks, such as do-nothing loops that just run a counter to waste time, disqualify you, obviously. The judging is solely on the algorithm, by comparing the big-O growth functions. One of the winners in the early years was a sort that generated a random permutation of the data, tested it for sortedness, and repeated if it wasn't sorted. And it turns out that there are several progressively worse variants of this, depending on how ridiculous your random-permutation algorithm is.
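    The random-permutation sort described above is commonly known as bogosort; a minimal Java sketch (class and method names are my own, not the contest's):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class Bogosort {
    // Shuffle until sorted: expected running time is O(n * n!).
    static void bogosort(List<Integer> list) {
        while (!isSorted(list)) {
            Collections.shuffle(list);
        }
    }

    static boolean isSorted(List<Integer> list) {
        for (int i = 1; i < list.size(); i++) {
            if (list.get(i - 1) > list.get(i)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>(List.of(3, 1, 2, 5, 4));
        bogosort(data);
        System.out.println(data); // prints [1, 2, 3, 4, 5]
    }
}
```

    The "progressively worse variants" swap in ever more wasteful permutation generators while keeping the same check-and-retry structure.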

    I should go find this contest again, and see what sorting atrocities they've come up with in the past couple years ...

    (I did once read a suggestion that such contests not be widely publicised, out of fear that managers will find them and declare the winning algorithms the new company standards. ;-)

  • Re:1000 fold (Score:4, Interesting)

    by sco08y (615665) on Friday December 24, 2010 @11:47PM (#34663990)

    That's an interesting figure, because when the Alpha came out DEC were predicting a 1,000-fold improvement in speed over about the same period. They split the improvement into three factors of ten: clock speed, parallelism within the CPU, and multiple cores were each expected to deliver a factor-of-ten improvement.

    Now, while our algorithms might be getting better, our programmers definitely are not. A lot of those improved algorithms must be in our APIs. Like the way it's easy to use a Hashtable in Java now, whereas in the past you might have just searched a linear array.

    There are more programmers because programming is becoming more accessible, so naturally more people who suck at programming are doing it.

    There's another problem: while gains from Moore's law have historically been realized by a recompile or less, most new algorithms require actually changing the code.
