Math

Mathematician Solves 48-Year-Old Problem, Finds New Way To Multiply (popularmechanics.com)

An anonymous reader quotes Popular Mechanics: An associate professor from the University of New South Wales Sydney in Australia has developed a new method for multiplying giant numbers together that's more efficient than the "long multiplication" so many are taught at an early age. "More technically, we have proved a 1971 conjecture of Schönhage and Strassen about the complexity of integer multiplication," associate professor David Harvey says in this video...

Schönhage and Strassen predicted that an algorithm multiplying n-digit numbers using n * log(n) basic operations should exist, Harvey says. His paper is the first known proof that it does...

The [original 1971] Schönhage-Strassen method is very fast, Harvey says. If a computer were to use the squared (long multiplication) method taught in school on a problem where two numbers had a billion digits each, it would take months. A computer using the Schönhage-Strassen method could do so in 30 seconds. But if the numbers keep rising into the trillions and beyond, the algorithm developed by Harvey and collaborator Joris van der Hoeven at École Polytechnique in France could find solutions faster than the 1971 Schönhage-Strassen algorithm.

"It means you can do all sorts of arithmetic more efficiently, for example division and square roots," he says. "You could also calculate digits of pi more efficiently than before. It even has applications to problems involving huge prime numbers.

"The question is, how deep does n have to be for this algorithm to actually be faster than the previous algorithms?" the assistant professor says in the video. "The answer is we don't know.

"It could be billions of digits. It could be trillions. It could be much bigger than that. We really have no idea at this point."

Comments Filter:
  • Donald Knuth tries to work on problems that have some kind of practical application.
    • Where exactly is this not practical?
      • by Jeremy Erwin ( 2054 ) on Sunday October 20, 2019 @01:37PM (#59328334) Journal

        This algorithm is mentioned in Wikipedia's article on Galactic algorithms [wikipedia.org]

        An example of a galactic algorithm is the fastest known way to multiply two numbers,[2] which is based on a 1729-dimensional Fourier transform.[3][4] This means it will not reach its stated efficiency until the numbers have at least 2^1729 digits, which is vastly larger than the number of atoms in the known universe. So this algorithm is never used in practice.

        • How do they know how many atoms are in the universe?
          • by dvice ( 6309704 )
            You take the mass of the universe and divide it by the mass of a single atom.
            • by rossdee ( 243626 )

              "divide it with the mass of a single atom."

              An atom of iron is a lot heavier than an atom of hydrogen.

              And what about dark matter?

              • An atom of iron is a lot heavier than an atom of hydrogen.

                We have a pretty good idea of the ratio of the elements.

                And what about dark matter?

                We don't know what dark matter is, but we know its mass by its gravity.

                Look, we don't know the exact mass of the universe down to the nearest nanogram, but we do have a rough idea of the mass of the universe within the radius we can observe. Not exactly, but within plus or minus a few hundred quintillion solar masses, which is close enough for most purposes.

                • I also know the size of planet Earth from the bit of it I can see through my windows.

                • ... it will not reach its stated efficiency until the numbers have at least 2^1729 digits, which is vastly larger than the number of atoms in the known universe.

                  How do they know how many atoms are in the universe?

                  You take the mass of the universe and divide it by the mass of a single atom

                  An atom of iron is a lot heavier than an atom of hydrogen.

                  Uh, when a number has 2^1729 digits, it is so large that you can multiply it by a million and it still has 2^1729 digits. It doesn't matter whether you're talking about iron or hydrogen; it's the same number.

                  And what about dark matter?

                  Unless the dark matter outweighs the non-dark matter by a factor that's noticeable on the scale of a number with 2^1729 digits, it's irrelevant.

                  • when a number has 2^1729 digits, it is so large that you can multiply it by a million and it still has 2^1729 digits

                    Not if you agree to take into account the decimals of that 1729 :-)

              • "divide it with the mass of a single atom."

                An atom of iron is a lot heavier than an atom of hydrogen.

                And what about dark matter?

                Most matter is dark matter.

                Most mass is dark energy. Dark matter is still provincial, and normal matter a creaky board on an old man's fishing trawler that docks at a port down in the sea province.

          • by lgw ( 121541 )

            When someone says "universe" they almost always mean "observable universe", it's just that the latter is a mouthful.

            • When someone says "universe" they almost always mean "observable universe", it's just that the latter is a mouthful.

              No, we generally include the unobservable parts too. That is what is called dark matter and dark energy. We still have a pretty good idea of what the total mass is, which is how dark matter and dark energy came to be a thing: it is all the mass we can observe the universe react to, but which we can't observe directly.

          • They don't know.

            It is a Scientific Wild Ass Guess [wikipedia.org]

            Basically a number someone pulled out of their ass that sounds "correct."

            • It's like a magician saying, "Is this your card?" and never pulling one out of the deck to show you first.

              • Untrue. It's more akin to a magician saying, "I'm not showing you the card, but I'm showing you evidence that this is another person's card to a confidence level of better-than-guessing based on the data we have."
            • Getting a precise value is impossible, but it isn't hard at all to place an upper and lower bound on these things, with a very high level of certainty.

          • Additionally, given an estimate of the density of the universe you could put some sort of upper bound on its size, due to the Schwarzschild radius. Basically, if space continues on with (any) set density for far enough, eventually there's enough mass for the whole thing to collapse into a black hole (be inside an event horizon already), so you could end up with a contradiction if you assume the universe is bigger than some upper limit.
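
            A rough numerical version of that argument (a sketch, not a cosmology result; it assumes a uniform density around the usual critical-density estimate):

            import math

            G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
            c = 2.998e8      # speed of light, m/s
            rho = 9e-27      # rough critical density of the universe, kg/m^3

            # A uniform ball of radius R holds mass M = (4/3)*pi*R^3*rho, whose
            # Schwarzschild radius is 2*G*M/c^2. Setting that equal to R gives the
            # largest radius such a region can have before it is already inside an
            # event horizon: R = sqrt(3*c^2 / (8*pi*G*rho)).
            R = math.sqrt(3 * c**2 / (8 * math.pi * G * rho))
            print(f"limiting radius ~ {R:.2e} m (~ {R / 9.46e15:.1e} light-years)")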

          • It only takes two facts, both quick Google searches away, to place an upper bound on this.

            Volume of known universe: 4E80 cubic meters
            Volume of hydrogen atom: 6E-31 cubic meters

            Divide the universe up according to the size of an individual hydrogen atom (the smallest atom), and you end up with about 7E110 atoms. That is roughly 2^368. So... 2^1729 is vastly, vastly larger than the number of atoms in the known universe.

            That said, this number is peanuts compared to some of the numbers that are routinely used by math
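
            A quick check of that arithmetic (the two volume figures are the rough estimates quoted above):

            import math

            universe_volume = 4e80     # cubic meters, observable universe (rough)
            hydrogen_volume = 6e-31    # cubic meters, one hydrogen atom (rough)

            atoms_upper_bound = universe_volume / hydrogen_volume
            print(f"~{atoms_upper_bound:.0e} atoms, about 2^{math.log2(atoms_upper_bound):.0f}")
            # ~7e+110 atoms, about 2^368 -- either way, nowhere near 2^1729.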

        • Note that it is significant digits, not just digits.
        • Spock: A mathematical proof that something is not practical. Fascinating.

        • I guess you know that math is used for more things than just counting things.

          Sometimes we just transform data into numbers, because that allows certain kinds of transforms on the data.

          And numbers of that kind are just not so large. That only represents 1729 bits of info, you know.

    • by gtall ( 79522 )

      Damn, you are right. That stupid dumbass Einstein should have been concentrating on more mundane stuff of practical application. And the mathematicians who did the groundwork that allowed him to express his ideas, what the hell were they thinking? Come to think of it, that dumbass Hilbert and his useless Hilbert space, what a waste of time that was. All it did was allow the quantum physicists to monkey around with useless quantum theory when they could have been screwing around with, I don't know, maybe radios or t

    • by godrik ( 1287354 )

      Well, this may not be useful today. But it is a fundamental result that is worthy of praise.
      Strassen's original matrix multiplication was also originally seen as something that would not be immediately useful. But modern linear algebra libraries now use it for large sizes.

      This is a first complexity result. It will take time before the algorithm is made practical. But it will come.
      I remember the first approximation algorithm for the 3D packing problem being ridiculous, a 1000-approximation algorithm in O(N^100) or s

    • Yeah, right, he only works on stuff that has a practical application. And the best example of that is the highly inefficient form of Bubble Sort that he published because he could fit it into six lines of that assembly language he used for examples. Even back when he wrote that, computers had enough memory that nobody ever needed to implement it except as a joke.
      • He didn't publish it, if you are talking about the thesis from Stanford.
        • No, I'm talking about the minimalist sorting algorithm he published in volume three of The Art of Computer Programming, Sorting and Searching.
          • Did you learn from it?
          • by Shaitan ( 22585 )

            Seriously you are nagging on the utility of one algorithm from The Art of Computer Programming?

            For those who don't know, he invented a compact language to express them in and then packed half a bookshelf with dense, algorithm-filled tomes, the entire collection of which is titled "The Art of Computer Programming". I don't know how many algorithms are in the set in its entirety, but this is akin to complaining that there was a less-than-optimal entry on page 234 of volume F of the Encyclopedia Britannica.

            • Seriously you are nagging on the utility of one algorithm from The Art of Computer Programming?

              It's a joke, son, laugh!
      • actually, a lot of embedded hardware is still memory limited - not everything is a desktop/laptop.
    • Thus his division of finite numbers into the realistic and the not so realistic.
      D.E. Knuth "Mathematics and Computer Science: Coping with Finiteness" [sciacchitano.it]. Science 17 December 1976, vol. 194, n. 4271, pp. 1235-1242.

  • by gweihir ( 88907 ) on Sunday October 20, 2019 @12:49PM (#59328192)

    Quite unlike the short-lived throw-away crap "research" happening in CS these days. You will also be hard pressed to find any CS professor actually doing meaningful research today. All they are doing is looking for funding to do "something" with the latest hype....

    • +1 Accurate.

    • That's a good sign you're looking at a second-rate school.

    • Yes, and also there may be deeper implications for understanding number systems and fundamentally how our universe works.

      If it's correct and ludicrously faster, then it's true and our old understanding was a crutch to get us this step closer.

      There's a reason people float back and forth across math, physics, and philosophy. Sean Carroll even said that most physicists who want to do fundamental "but, why?" research are drummed out of Orthodox Physics and go hide out in the Philosophy department.

    • by lgw ( 121541 )

      Oh, dear me, professors are wasting their time doing "research" of practical value to humanity. How very dare they! They should all do research that will never benefit anyone at all, for only that is pure and unsullied.

      • I believe gweihir's point was that most research benefits no one -- long or short term -- and the really deep breakthroughs in fundamentals get even less recognition than practical research.
        • by lgw ( 121541 )

          Yes, but he's just wrong. Most research is reasonably practical, which is why the government funds it. Oh, sure, there are outliers like modern particle physics, but even then: learning how to build the LHC had practical benefits. The experimental side of just about any field is like that, breaking new technological ground in some area just to enable the experiment.

          I think people are confusing research with development. Just because research doesn't result in a product doesn't mean it's not practical.

      • He literally wrote 3 sentences and you completely missed the point.

      • Drumlin: I feel science should offer practical solutions for The People, who, after all, are paying the bill.

        Astronomer: Not unlike my L-band globular clusters.

    • CS researchers can only seek funding from those who want to fund. It may be crap research, but someone is funding grants for them to do that research.

  • As near as I can tell, this is NOT what the smbc comic from a week ago was talking about. This article is talking about the Schönhage–Strassen method of multiplying numbers faster. The comic was talking about improving the Coppersmith–Winograd algorithm to multiply numbers faster. Which was itself an improvement of an algorithm also called the Strassen algorithm that was not the same thing? I think.

    https://www.smbc-comics.com/co... [smbc-comics.com]

    • You're right, though both are named after Volker Strassen. The Coppersmith–Winograd algorithm is for matrix multiplication, and *theoretically* beats Strassen's matrix multiplication algorithm, but only for matrices so huge no existing computer could process them, so it's just theoretical. Strassen's algorithm is still better for *realistically* large matrices.

      Schönhage–Strassen by comparison is for multiplying two large scalar numbers together, not matrices. But similar to the Coppersmith
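
      For reference, a minimal sketch of Strassen's matrix multiplication (the O(n^2.81) recursion mentioned above), assuming numpy and square matrices whose side is a power of two; the trick is doing 7 recursive block products instead of 8:

      import numpy as np

      def strassen(A, B, cutoff=64):
          # Fall back to ordinary multiplication for small blocks.
          n = A.shape[0]
          if n <= cutoff:
              return A @ B
          h = n // 2
          A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
          B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
          # Seven recursive products instead of eight.
          M1 = strassen(A11 + A22, B11 + B22, cutoff)
          M2 = strassen(A21 + A22, B11, cutoff)
          M3 = strassen(A11, B12 - B22, cutoff)
          M4 = strassen(A22, B21 - B11, cutoff)
          M5 = strassen(A11 + A12, B22, cutoff)
          M6 = strassen(A21 - A11, B11 + B12, cutoff)
          M7 = strassen(A12 - A22, B21 + B22, cutoff)
          C = np.empty_like(A)
          C[:h, :h] = M1 + M4 - M5 + M7
          C[:h, h:] = M3 + M5
          C[h:, :h] = M2 + M4
          C[h:, h:] = M1 - M2 + M3 + M6
          return C

      A = np.random.rand(128, 128)
      B = np.random.rand(128, 128)
      assert np.allclose(strassen(A, B), A @ B)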

  • Awesome (Score:5, Informative)

    by JustAnotherOldGuy ( 4145623 ) on Sunday October 20, 2019 @12:51PM (#59328200) Journal

    "It even has applications to problems involving huge prime numbers."

    So...lots of cryptographic uses.

    • by E-Rock ( 84950 )

      Yea, that's where it went from interesting to whoa.

    • I was thinking about that too. But re-reading the summary, this new algo only helps on really big numbers "It could be billions of digits. It could be trillions." Since our current crypto keys aren't in the gigabytes, it seems that this isn't a threat to security.
    • Everything in number theory has applications involving huge prime numbers, and cryptographic uses.
    • "It even has applications to problems involving huge prime numbers."

      So...lots of cryptographic uses.

      Only if you don't mind using keys that are gigabytes in size. For smaller numbers, existing techniques are at least as fast.

    • "It even has applications to problems involving huge prime numbers."

      So...lots of cryptographic uses.

      No, at least not for a couple of centuries at the current rate of progress in computer technology. The numbers used even in cryptology are not large enough for this to be faster.

  • by Viol8 ( 599362 ) on Sunday October 20, 2019 @12:52PM (#59328208) Homepage

    It really rams home the point that no matter how smart I think I am, there are people out there whose intellect is as far beyond me in certain areas of thought as I am beyond my neighbour's dog. I don't even begin to understand the maths behind it, but kudos to the guy for figuring this out.

    • by DogDude ( 805747 )
      Absolutely. I thought the same thing, so I decided to go back to school to study math & genetics. I'm still very, very, very dumb compared to some people.
      • Re: (Score:2, Insightful)

        by Shaitan ( 22585 )

        Studying something can't make you smart. Either you are or you aren't. Studying something can only make you sound smart, usually in the form of doing nothing but regurgitating information already thought out by others which you studied, possibly to someone far more intelligent than you who simply hasn't read or didn't find value in remembering that material. It's a parlor trick; PhDs and other academics love it.

        If you ever meet actual genius you'll find they aren't insecure enough to need to put you down to

        • It's a parlor trick; PhDs and other academics love it.

          I find that people who make such sweeping generalisations about PhDs and academics don't usually know many of either. Especially when the generalisation is a variation on "you suck".

          If you ever meet actual genius you'll find they aren't insecure enough to need to put you down to lift themselves up

          Well, no *true* scotsgenius at any rate.

          I suspect (I can't guarantee that I've met any actual genii) that those personal attributes are probably unrelated to

          • by Shaitan ( 22585 )

            "I find that people who make such sweeping generalisations about PhDs and academics don't usually know many of either."

            "Well, no *true* scot"

            In sequential lines no less. Bravo sir. I'm sure you do find that such people don't know or talk to many and that you make that determination on their behalf in summary fashion as you did here. I'm sure I read something about starting an inquiry with a conclusion being a poor approach somewhere; something about bias. Oh well it is probably safe to disregard whatever no

            • In sequential lines no less. Bravo sir.

              Yes, because that's not how the no true Scotsman fallacy works.

              Academics and PhDs don't generally like regurgitating information (maybe excluding undergrad classes, there are plenty of lazy profs). You do realise that a PhD is a research degree and is only awarded to someone who has done original research.

              Try going to a conference with your parlour trick and see how many PhDs and academics love it.

              Phdii

              the cod Latin joke doesn't work if the word doesn't end in "us".

              • by Shaitan ( 22585 )

                "Yes, because that's not how the no true Scotsman fallacy works."

                Not a literal enough application for your parser? A claim that people who disagree with you aren't true or pure enough examples of a class entitled to an opinion IS an example of the logical principle contained in the fallacy. They aren't computationally compatible codes; they are philosophical principles. In this case you set the stage for an ever-expanding set of criteria or goalposts which will in turn violate another principle of logical re

                • Not a literal enough application for your parser?

                  No that's not how it works.

                  A claim that people who disagree with you

                  I was taking a dig at you specifically. You clearly don't know what you're talking about and are taking random digs at people you don't know much about. You have a massive chip on your shoulder and should probably see a doctor about getting it removed.

                  You seem to be confusing a degree requirement with interaction with others.

                  Yeah and you've never seen many academics and PhDs interact with e

        • by Viol8 ( 599362 )

          "Smart" isn't a binary thing - IQ is a broad spectrum. You need a certain level of intellect to even understand some subjects such as physics, maths, engineering, CS etc never mind come up with new theories. And for the latter its not just a question of IQ - a certain about of creativity and lateral thinking is often required too.

          Sure, for some subjects such as law, languages and other fluffy subjects (though not art and music, they require talent) its little more than memory and parroting what you've learn

          • by Shaitan ( 22585 )

            ""Smart" isn't a binary thing - IQ is a broad spectrum. You need a certain level of intellect to even understand some subjects such as physics, maths, engineering, CS etc never mind come up with new theories"

            My assertion first was that studying those subjects is not what makes for intelligence. That person was intelligent before learning those topics and would be no less intelligent if they dropped out of high school and pursued a career in fast food, bitching about the system, and getting high every night.

        • by iwbcman ( 603788 )

          Not studying something means you don't know what the fuck you are talking about. A smart person who studies something may grasp certain things quicker, may find some things to be quite intuitive, and may not have to struggle as much with certain concepts as some other people do. But without study, "smart" means basically nothing at all.

          Studying something can't make you smart. Either you are or you aren't. Studying something can only make you sound smart, usually in the form of doing nothing but regurgitating information already thought out by others which you studied, possibly to someone far more intelligent than you who simply hasn't read or didn't find value in remembering that material. It's a parlor trick; PhDs and other academics love it.

          If you ever meet actual genius you'll find they aren't insecure enough to need to put you down to lift themselves up and they can extrapolate insights even from the abstract high level thinking of ignorant people around them.

          Now study can take many different forms: some people, usually referred to as autodidacts, study things on their own, absent "formal" study, others study "formally", ie. taking cl

          • by Shaitan ( 22585 )

            "But without study, "smart" means basically nothing at all."

            That is a different topic. Especially as you've defined study further down in your remark. In the hands of a "smart" person all of life is study forever growing their toolbox of material to apply to new and different situations. Even the problem you mention with shallower study can be advantageous, often we find that there are large gains and leaps made by individuals who cross disciplines and marry concepts brought from one to another. The shallow

        • by pjt33 ( 739471 )

          Studying something can't make you smart.

          Nonsense. Taking lessons in how to use an iron can make you much smarter.

    • by Shaitan ( 22585 )

      Smart might be understanding difficult to parse and follow math. That is mostly a function of working memory. Genius is the ability to recognize you'd be dumb to try since there is no practical use for a theoretical speed advantage that only applies when multiplying trillion digit numbers... even for a computer. Or rather there is no practical application. There is a practical use for Knuth, it earns him an obligatory honorable mention or footnote in any serious discussion about computing multiplication.

      • by Wulf2k ( 4703573 )

        Playing around with useless shit is a very useful way to discover useful shit.

        • by Shaitan ( 22585 )

          It is certainly A way to stumble onto useful shit at least. Another way is to chase after things that might be useful in an attempt to find out if it actually is. One could argue the latter path has a higher probability of successfully finding useful shit and an at least equal probability of stumbling onto useful shit.

  • to add this algorithm to Windows calculator.
  • by gnasher719 ( 869701 ) on Sunday October 20, 2019 @01:05PM (#59328248)
    It's basically just an FFT. If you split your numbers into n groups of k digits each, it takes c * n * log n multiplications of (k + log_10 n)-digit numbers. It grows faster than n log n because the numbers you need to multiply grow with n. Still, a 192 x 192 bit product will get you quite far. Probably far enough to cover anything that you can run on any existing computer.

    If he has an O(n log n) algorithm, and we compare the times for one-billion-digit numbers, you need some awfully big numbers to get over any advantages of the FFT algorithm.
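
    A rough sketch of that FFT idea, assuming numpy and a floating-point FFT (whose limited precision is what eventually spoils the pure n log n behaviour, as noted in a reply below): split each number into fixed-size digit chunks, convolve the chunk sequences with an FFT, then propagate carries.

    import numpy as np

    def fft_multiply(a: int, b: int, chunk_digits: int = 4) -> int:
        # Multiply two non-negative integers via an FFT-based convolution.
        base = 10 ** chunk_digits
        def chunks(x):
            # Split into little-endian chunks of `chunk_digits` decimal digits.
            out = []
            while x:
                x, r = divmod(x, base)
                out.append(r)
            return out or [0]
        ca, cb = chunks(a), chunks(b)
        n = 1
        while n < len(ca) + len(cb):
            n *= 2                      # FFT length: power of two that fits the product
        fa = np.fft.rfft(np.array(ca, dtype=float), n)
        fb = np.fft.rfft(np.array(cb, dtype=float), n)
        conv = np.fft.irfft(fa * fb, n) # pointwise product in frequency = convolution
        result, carry = 0, 0
        for i, coeff in enumerate(np.rint(conv)):
            carry += int(coeff)
            result += (carry % base) * base ** i
            carry //= base
        return result + carry * base ** n

    assert fft_multiply(123456789, 987654321) == 123456789 * 987654321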
    • by Mal-2 ( 675116 )

      Awfully big numbers may be required to stay ahead of quantum decryption, at least temporarily.

      • This doesn't have anything to do with quantum computers. Quantum computers give an efficient method of *factoring* https://en.wikipedia.org/wiki/Shor's_algorithm [wikipedia.org], which is conjecturally hard in a classical context. This is about multiplying; the general belief is that multiplying numbers on a quantum computer should be about as tough as on a classical system, but since we can't even prove that there's no O(n) method for classically multiplying two n-digit numbers, anything about the quantum case is strongly conje
        • by Mal-2 ( 675116 )

          I get that. But the bigger the input numbers are, the more qubits it's going to take to disentangle them. Simply keeping the numbers larger than a quantum computer can handle will hold the gate for the moment, although it won't be a long-term fix if everything is being stored for later decryption.

      • Awfully big numbers may be required to stay ahead of quantum decryption, at least temporarily.

        We already have better solutions.

        To get numbers of a size where this method is more useful than existing methods, you'd need keys that are gigabytes in size. We already have quantum resistant algorithms [wikipedia.org] that are far more practical than that. One of the major complaints about today's quantum resistant algorithms is that their keys are too large... but "large" here means tens of kilobytes to megabytes.

    • by DavenH ( 1065780 )
      I was going to bring this up. The FFT has been used to multiply big numbers at approximately n log n since forever.
    • by jmv ( 93421 )

      The more digits you have, the more bits you need to get sufficient accuracy in your FFT. So your FFT isn't *quite* O(N*log(N)) in that context. Once you consider that, I believe your FFT becomes slower than the Schönhage–Strassen algorithm. Still much faster than the naive O(N^2) algorithm of course.

  • What is Pi squared?

  • by UnknownSoldier ( 67820 ) on Sunday October 20, 2019 @03:26PM (#59328604)

    Here is the actual link to the pdf:

    Integer multiplication in time O(n log n) [archives-ouvertes.fr]

    Thanks /. editors. NOT! /s

    • by godrik ( 1287354 )

      I remember having talked about this paper back in March with someone who understands Joris' work much better than I do. Talk about Slashdot being late on the news...

    • Thanks /. editors. NOT! /s

      I'm not sure how to parse this. You have the "NOT!" and then the "/s". Does the /s cancel the NOT!, meaning you're seriously thanking /. editors? Is the /s additive and cumulative, implying even greater sarcasm?

  • You can find the actual paper explaining the new mechanism (not the simple video explaining how we ordinarily multiply and what log(n) means) here [archives-ouvertes.fr]. The actual mechanism is presented in Section 5, but it is interleaved with proofs of its complexity (not easy to follow). Note that the paper is not peer-reviewed.

  • Congrats Joris! Long time no TeXmacs
  • by eric31415927 ( 861917 ) on Sunday October 20, 2019 @09:56PM (#59329432)

    One must simply premultiply all of the integers (all the way to infinity and beyond!) and store them in a database. Then any multiplication call could be translated into a 2-parameter lookup call from the database. How long could that take?

  • There was algorithm A1 that has complexity n^2.
    There is now algorithm A2 that has complexity n*log(n).
    He says that A2 is worse than A1.

    a) That does not compute. Are there some constants at play that make the n*log(n) algorithm worse than the n^2 one?
    b) While this discovery may deepen our understanding of math, he says in the same breath that he has found no boundary where his discovery is actually useful? I.e., his discovery is not useful for anything real?
    c) He says "if the numbers keep rising into the trillions" his algorithm is better. So he _did_ find scenarios where his discovery is useful? Doesn't that contradict b)?

    • He uses python and suddenly it's O(1), for him.
    • by pjt33 ( 739471 )

      a) That does not compute. Are there some constants at play that make the n*log(n) algorithm worse than the n^2 one?

      Correct.

      b) While this discovery may deepen our understanding of math, he says in the same breath that he has found no boundary where his discovery is actually useful? I.e., his discovery is not useful for anything real?

      Correct.

      c) He says "if the numbers keep rising into the trillions" his algorithm is better. So he _did_ found scenarios where his discovery is useful? Doesn't that contradict b) ?

      If we tu
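
      To illustrate the answer to (a): big-O notation hides constant factors, so an n*log(n) algorithm with a large constant can lose to an n^2 algorithm until n gets huge. The constants below are made up purely for illustration:

      import math

      def crossover(c_fast=1e6, c_slow=1.0):
          # Smallest power-of-two n where c_fast * n * log2(n) beats c_slow * n^2.
          n = 2
          while c_fast * n * math.log2(n) >= c_slow * n * n:
              n *= 2
          return n

      print(crossover())   # with these made-up constants: 33554432, i.e. ~3.4e7 digits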
