Computational Origami and David Huffman 122
geeber writes "Here is an article about David Huffman's work in the mathematics of computational origami at the New York Times (soul sucking registration required). According to the article, computational origami, 'also known as technical folding, or origami sekkei, draws on fields that include computational geometry, number theory, coding theory and linear algebra.' David Huffman is also the inventor of the Huffman coding used in MP3s and was mentioned previously here."
Well, more famous for Huffman coding long before (Score:5, Informative)
Huffman coding was one of the first codes used to compress data, a LONG, LONG time ago, in a galaxy far, far away, where MP3s were still billions of years in the future.
It is really cool to see such pioneering people still involved in new things.
Mmmm.... Oragami (Score:5, Informative)
Non-Reg Link (Score:5, Informative)
Re:robot porn (Score:3, Informative)
An excellent explanation of Huffman coding... (Score:5, Informative)
What's especially nice is that the book walks you through the various steps - minimum redundancy coding, adaptive Huffman coding, arithmetic coding... so the improvements are introduced gradually and logically. Good stuff.
Re:Impressive... (Score:4, Informative)
Re:Mmmm.... Oragami (Score:5, Informative)
Re:What I liked best... (Score:5, Informative)
Re:OT: Drop the lame comments (General /. comment) (Score:5, Informative)
google link (Score:3, Informative)
Other Computational Origami Mathematicians (Score:5, Informative)
Re:Computational Folding (Score:3, Informative)
I'm tempted to flame you like hell but well, it's just that I've read TFA. My bad, sorry!
David Huffman (Score:5, Informative)
He also frequently gave credit to Claude Shannon on information coding.
Sadly (or fortunately) I avoided his other class, since the failure rate was 60% for people taking the class a second time. I think first-time takers failed at 90%.
-Aaron
Quick tutorial on Huffman Coding (Score:1, Informative)
Essentially, it breaks down to using your bits in such a way that the most common symbols are represented by the fewest bits. The result is a prefix-free code, meaning that no bit string representing a symbol is a prefix of any other symbol's code. You'll never get both "01" and "010" representing something in a Huffman code.
A Huffman code is optimal, meaning it gives the shortest possible encoding among schemes that map each symbol, one-to-one, to its own whole string of bits. As a result, it's used in a lot of compression methods.
There are better methods, like arithmetic coding. Huffman coding uses an integral number of bits for each symbol, while arithmetic coding produces a scheme where, essentially, fractions of bits get used per symbol. Naturally, that requires a lot more computation. Huffman coding is relatively cheap, computation-wise, and pretty easy to implement, which is why it gets used so much.
The really big deal about Huffman coding, however, is that you can *prove* it's optimal among such symbol-by-symbol prefix codes. That was Huffman's big accomplishment, really. Shannon-Fano coding is similar to (but not the same as) Huffman coding, yet only Huffman's construction comes with that optimality proof.
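To make the idea concrete, here is a minimal sketch of a Huffman coder in Python (the function name and structure are my own illustration, not from any particular library): it repeatedly merges the two least-frequent groups of symbols, prepending a 0 bit to one side's codes and a 1 bit to the other's.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the symbols in `text`.

    Returns a dict mapping each symbol to its bit string. Frequent
    symbols get shorter codes, and the result is prefix-free.
    """
    freq = Counter(text)
    # Each heap entry: (total frequency, tiebreaker, {symbol: code so far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate case: only one distinct symbol; give it a 1-bit code
        (_, _, codes), = heap
        return {s: "0" for s in codes}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # least frequent group
        f2, _, c2 = heapq.heappop(heap)  # second least frequent group
        # Merge: prepend 0 to one group's codes, 1 to the other's
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_code("abracadabra")
# 'a' occurs most often, so it gets a shorter code than 'b'
assert len(codes["a"]) < len(codes["b"])
# Prefix-free: no symbol's code is a prefix of another symbol's code
for s in codes:
    for t in codes:
        if s != t:
            assert not codes[t].startswith(codes[s])
```

Note how the prefix-free property falls out of the construction for free: each symbol's code is the path of merge decisions that led to it, so no complete code can be the start of another.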
Another article (Score:2, Informative)
Re:Impressive... (Score:4, Informative)
http://www.sgi.com/grafica/fold/page001.html
Re:Other Computational Origami Mathematicians (Score:3, Informative)
A good link to Lang's work is here [langorigami.com]