
A Mighty Number Falls 348

space_in_your_face writes "An international team has broken a long-standing record in an impressive feat of calculation. On March 6, computer clusters from three institutions (the EPFL, the University of Bonn, and NTT in Japan) reached the end of eleven months of strenuous calculation, churning out the prime factors of a well-known, hard-to-factor number — 2^1039 - 1 — that is 307 digits long." The lead researcher believes "the writing is on the wall" for 1024-bit encryption. "Last time, it took nine years for us to generalize from a special to a non-special, hard-to-factor number (155 digits). I won't make predictions, but let's just say it might be a good idea to stay tuned."
  • That's not even the point. The algorithm used to factor 2^k - 1, is generally the SNFS which is a highly optimized variant of the NFS, even faster than the GNFS. To factor RSA numbers you need the GNFS.

    That said, not all 1024-bit numbers are hard to factor; in fact you have about a 1 in 300 chance of pulling a 1024-bit prime out of your ass. The trick here is that RSA numbers are random and have less algebraic structure than Mersenne numbers.

    Of course, with all that said, people should be using ECC anyways.
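    That "about 1 in 300" figure can be sanity-checked with the Prime Number Theorem. A minimal sketch (not part of the original post) estimating the density of primes near 2^1024:

    ```python
    import math

    # Prime Number Theorem: the density of primes near N is roughly 1/ln(N).
    density = 1 / (1024 * math.log(2))    # among all integers near 2^1024
    odd_density = 2 * density             # nobody tests even candidates

    print(round(1 / density))       # roughly 1 in 710 for arbitrary integers
    print(round(1 / odd_density))   # roughly 1 in 355 for odd candidates
    ```

    Restricting to odd candidates gives roughly 1 in 355, the same order as the parent's figure.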

  • -1 author stupidity (Score:5, Informative)

    by tomstdenis ( 446163 ) on Tuesday May 22, 2007 @02:41PM (#19225129) Homepage
    SNFS != GNFS. Factoring specific 1024-bit numbers of that form isn't always super hard.

    That they pulled off an SNFS on a 1024-bit number is cool, but it's not the same amount of work as a GNFS against a 1024-bit RSA key.

  • Re:What are they? (Score:5, Informative)

    by Anonymous Coward on Tuesday May 22, 2007 @02:41PM (#19225147)
    1159420574 0725730643698071 48876894640753899791 70201772498686835353882248385
    9966756608 0006095408005179 47205399326123020487 44028604353028619141014409345
    3512334712 7396798885022630 75752809379166028555 10550042581077117617761009413
    7970787973 8061870084377771 86828680889844712822 00293520180607475545154137071


    5585366661 9936291260 7492046583 1594496864
    6527018488 6376480100 5234631985 3288374753
    2075818194 6442382764 5704813703 5946951629
    3970800739 5209881208 3870379272 9090324679
    3823431438 8414483488 2534053344 7691122230
    2815832769 6525376091 4101891052 4199389933
    4109711624 3589620659 7216748116 1749004803
    6597355734 0925320542 5523689

    (spaces added because of lameness filter)
  • Rather than just digesting using some key, It seems to me that you could set up two 'encryption' agents which talk to each other and form a random proprietary "language" that only each other can understand.

    You mean, like generating an analogous OTP out of a pseudo-random number generator? Not only has that been done before [], but you're still left with a key: the seed which produced the pseudo-random sequence.

    The Navajo code-talkers worked because the encoding was extremely obscure (security through obscurity at its finest!) and cryptography was still in its infancy. I sincerely doubt that the Navajo codes would stand up to a modern cryptographic analysis. []
  • by shofutex ( 986330 ) on Tuesday May 22, 2007 @02:47PM (#19225243)
    Adi Shamir designed one already. [] Instead of 11 months, it takes 12--but it could (in theory) factor any 1024-bit number.
  • by Palmyst ( 1065142 ) on Tuesday May 22, 2007 @02:57PM (#19225441)
    Yes, the RSA algorithm for public-key encryption [] is based on the difficulty of factoring very large numbers. The key size is the number of bits in the number that has to be factored to break the encryption. Many modern security systems, including VeriSign certificates for secure websites, are based on RSA encryption, and 1024 bits is a very common key size in use. Thus the ease of factoring 1024-bit numbers would indeed be a matter of concern.

    RSA 101.
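    The mechanics are easy to see with a toy key. A sketch with tiny illustrative primes (a real 1024-bit key uses two secret ~512-bit primes, which is exactly what makes factoring the modulus matter):

    ```python
    # Toy RSA keygen/encrypt/decrypt -- purely illustrative, not secure.
    p, q = 61, 53                 # the secret primes (kept private)
    n = p * q                     # public modulus: factoring n breaks the key
    phi = (p - 1) * (q - 1)
    e = 17                        # public exponent
    d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

    m = 42                        # a message, encoded as a number < n
    c = pow(m, e, n)              # encrypt with the public key (e, n)
    assert pow(c, d, n) == m      # decrypt with the private key (d, n)
    ```

    Anyone who can factor n into p and q can recompute phi and hence d, which is why the key size is quoted as the size of n.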
  • Re:What are they? (Score:3, Informative)

    by fbjon ( 692006 ) on Tuesday May 22, 2007 @02:59PM (#19225465) Homepage Journal
    If you mean 0x09F911029D74E35BD84156C5635688C0, it's not very difficult to factorise actually.
  • Re:What are they? (Score:2, Informative)

    by Anonymous Coward on Tuesday May 22, 2007 @03:01PM (#19225501)

    1159420574 0725730643698071 48876894640753899791 70201772498686835353882248385
    9966756608 0006095408005179 47205399326123020487 44028604353028619141014409345
    3512334712 7396798885022630 75752809379166028555 10550042581077117617761009413
    7970787973 8061870084377771 86828680889844712822 00293520180607475545154137071
    Um, no it's not - that's somewhere between 2^1016 and 2^1017. Your factorisation is otherwise correct, but these aren't the numbers we're looking for.
  • Re:What are they? (Score:3, Informative)

    by Hatta ( 162192 ) on Tuesday May 22, 2007 @03:03PM (#19225519) Journal
    Odd, the factors you give do multiply to give the product you say, but according to bc: 2^1039-1=

    58906808643168367664473872491774762471193869645981 501775357568993765\
    84320794655559932591384900650140340063891615625817 543763223144510803\
    88584562460719428810761069833174599222153387113189 363201210623862217\
    39214690332885215589978237001371848062018269073686 695341125238207265\
  • by Palmyst ( 1065142 ) on Tuesday May 22, 2007 @03:03PM (#19225525)
    From rld_record_fo.html []: Using the sieve program developed at the University of Bonn, NTT, EPFL, and the University of Bonn respectively provided 84.1 percent, 8.3 percent, and 7.6 percent of the calculation resources, a calculation amount equivalent to 95 years of operation on a 3-GHz Pentium D. PC clusters at NTT and EPFL, consisting of 110 and 36 PCs respectively, were run in parallel for more than two months for the calculations. The results were 47 non-trivial solutions of the simultaneous equations defined by an approximately 70,000,000 x 70,000,000 large sparse linear matrix.
  • Re:What are they? (Score:1, Informative)

    by Anonymous Coward on Tuesday May 22, 2007 @03:07PM (#19225579)
    You are right; this is actually the cofactor of 2^1039 - 1 that was factored. I apologize =)
  • by DemonThing ( 745994 ) on Tuesday May 22, 2007 @03:09PM (#19225623)
    There are actually three prime factors; the two you listed, and the small factor 5080711. Thus:

    2^1039-1 = 5080711 * 55853666619936291260749204658315944968646527018488 637648010052346319853288374753 * 20758181946442382764570481370359469516293970800739 52098812083870379272909032467938234314388414483488 25340533447691122230281583276965253760914101891052 41993899334109711624358962065972167481161749004803 659735573409253205425523689

    is the correct factorization, as can be readily verified.

    Also: []
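    The "readily verified" claim does check out with stock Python big integers. A sketch with the digit runs re-joined (the spaces in the posts above were inserted by the lameness filter):

    ```python
    # The three published prime factors of 2^1039 - 1, digits re-joined.
    p7 = 5080711
    p80 = 55853666619936291260749204658315944968646527018488637648010052346319853288374753
    p227 = 20758181946442382764570481370359469516293970800739520988120838703792729090324679382343143884144834882534053344769112223028158327696525376091410189105241993899334109711624358962065972167481161749004803659735573409253205425523689

    # Multiplying them back together reproduces the original number.
    assert p7 * p80 * p227 == 2 ** 1039 - 1
    print(len(str(p80)), len(str(p227)))   # 80 227
    ```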
  • fixed link (Score:3, Informative)

    by goombah99 ( 560566 ) on Tuesday May 22, 2007 @03:24PM (#19225883)
    Oops, here's a working link []
  • another good link (Score:3, Informative)

    by goombah99 ( 560566 ) on Tuesday May 22, 2007 @03:27PM (#19225931)
    TWINKLE, an older design that preceded TWIRL, has a better Wikipedia entry []
  • Re:Quadruple AES? (Score:4, Informative)

    by Meostro ( 788797 ) on Tuesday May 22, 2007 @03:34PM (#19226031) Homepage Journal
    It depends entirely on how you're doing your QAES.

    The standard 3DES process is 3DES-EDE, which uses 2 keys, thus giving you 112 bits:
    ENCRYPT data with key1
    DECRYPT output with key2
    ENCRYPT output with key1

    Since DES is symmetric, any paired combination of encrypt and decrypt will give you the same result. You can do E(D(data)) to get your result, or you can use D(E(data)) for the same thing. If you used the same key for key1 and key2, this would be the same as doing regular DES, and would just take 3x as long.

    If you used three different keys for your 3DES instead, you would have the 168-bit key length. Thus, you can apply the same concept to 4AES, and depending on which way you do it you will end up with 256-, 512-, 768- or 1024-bit key strength.
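    The "same key for key1 and key2 collapses to single DES" point holds for any invertible cipher. A sketch with a throwaway 8-bit toy function standing in for DES (it is not DES, just an invertible map used for illustration):

    ```python
    def enc(k, x):
        # toy invertible 8-bit "cipher" standing in for DES
        return ((x ^ k) + 3 * k) % 256

    def dec(k, c):
        # exact inverse of enc for the same key
        return ((c - 3 * k) % 256) ^ k

    k = 0x5A
    for x in range(256):
        assert dec(k, enc(k, x)) == x                  # D undoes E
        # EDE with key2 == key1 collapses to a single encryption:
        assert enc(k, dec(k, enc(k, x))) == enc(k, x)
    ```

    That collapse is deliberate: it lets 3DES hardware interoperate with plain DES by setting both keys equal.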
  • Re:What are they? (Score:3, Informative)

    by Deanalator ( 806515 ) on Tuesday May 22, 2007 @03:45PM (#19226227) Homepage
    Someone correct me if I'm wrong, but isn't one of those factors only 80 digits long? (80/3)*10 ≈ 267 bits. From what I understand, factoring a number is only as hard as pulling out its smallest factor. That would make this feat roughly equal to factoring RSA-512, which was done a few years back. 1024-bit RSA uses two 512-bit primes. That is significantly harder than what these guys have done.
  • by YA_Python_dev ( 885173 ) on Tuesday May 22, 2007 @04:00PM (#19226447) Journal

    I don't know where she/he got her/his number, but it's wrong.

    Use Python, Luke:

    Python 2.5.1 (r251:54863, Apr 19 2007, 19:11:47) [GCC 3.3.1] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> print 2**1039-1
    589068086431683676644738724917747624711 93869645981501775357568993765843207946555599325913 84900650140340063891615625817543763223144510803885 84562460719428810761069833174599222153387113189363 20121062386221739214690332885215589978237001371848 06201826907368669534112523820726591354912103343876 844956209126576528293887
    >>> len(str(2**1039-1))
    313

    So the summary is wrong too: the number is 313, not 307 digits long. At least in base 10.

  • by realscrappy ( 1105719 ) on Tuesday May 22, 2007 @04:14PM (#19226655)
    I see 4 spaces plus two newlines and 307+4+2 = 313.
  • Re:What are they? (Score:2, Informative)

    by fatphil ( 181876 ) on Tuesday May 22, 2007 @04:15PM (#19226669) Homepage
    You are missing the fact that there are two types of factoring algorithms.

    Firstly there are factor-finding algorithms, whose difficulty is dependent on properties of the individual factors of the composite. Examples of these are trial-division, Pollard/Brent Rho, Pollard P-1, someone else's P+1, and Lenstra's ECM algorithms.

    Secondly there are splitting algorithms, whose difficulty is dependent on the properties of the composite itself, not its factors.

    Finding an 80-digit factor using the first type of algorithm is more or less an impossibility. It's never been done. The record is something like 70 digits, set just last year. Only about half a dozen people have ever even found 60+ digit factors.

    So with that in mind, when you're pretty sure there are no <60-digit factors (by running Lenstra's ECM many thousands of times), you have pretty much no option but to turn to the second type of algorithm.

    (If you're looking at a 100-digit composite, you would only run ECM until you were fairly sure there were no <30-digit factors; if looking at a 130-digit composite, you'd run ECM until sure there were no <40-digit factors. Approximately - I pulled those figures out of my arse.)
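    A minimal sketch of the first ("factor-finding") type is Pollard's rho: its expected running time scales with the square root of the smallest factor, not with the size of the composite itself.

    ```python
    import math
    import random

    def pollard_rho(n):
        # Returns a nontrivial factor of composite n.
        # Expected cost ~ sqrt(smallest prime factor of n).
        if n % 2 == 0:
            return 2
        while True:
            c = random.randrange(1, n)
            f = lambda x: (x * x + c) % n
            x = y = 2
            d = 1
            while d == 1:
                x = f(x)          # tortoise: one step
                y = f(f(y))       # hare: two steps
                d = math.gcd(abs(x - y), n)
            if d != n:            # cycle found a proper factor; else retry with new c
                return d

    # e.g. pollard_rho(5080711 * 1000003) quickly returns one of the two primes,
    # because the smallest factor is only 7 digits.
    ```

    An 80-digit smallest factor would put sqrt(p) around 10^40 iterations, which is why the splitting algorithms (QS, NFS) take over at that point.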
  • by kju ( 327 ) on Tuesday May 22, 2007 @04:29PM (#19226919)
    Even worse: When your key can be cracked in 10 years, someone can create false signatures in your name dated 10 years back. Think about long-running contracts etc....

    We have in Germany some really brain-fucked law about the requirement of digital signatures (S/MIME based) on electronic invoices, but one idea they actually got right: you will get an invoice which is signed by the vendor. If you are required to keep incoming invoices (businesses), every once in a while you need to take the current file and sign it again with your own (current) key. So document+signature becomes (document+signature)+signature, then ((document+signature)+signature)+signature. You repeatedly sign an older signature with your newer key. This builds a chain of signatures and still proves the integrity of the document and the original signature long after the original key length is broken, provided you have done this renewal regularly.
  • by Michael Woodhams ( 112247 ) on Tuesday May 22, 2007 @05:01PM (#19227589) Journal
    I've put these into Mathematica and confirmed they are correct.
  • by trifish ( 826353 ) on Tuesday May 22, 2007 @05:05PM (#19227647)
    The lead researcher believes "the writing is on the wall" for 1024-bit encryption.

    It should be noted (before people start panicking) that the 1024-bit key size refers to asymmetric or public-key cryptography (software like PGP, GPG, algorithms like DH/DSS, RSA), not to symmetric cryptography (software like TrueCrypt, algorithms like AES, Blowfish, Twofish, etc.).

    A 256-bit symmetric key is completely infeasible to brute-force, and even if quantum computers become available, the complexity of a brute-force attack will still be 2^128 (which is again infeasible).
  • The reason 3DES provides an effective key length of 112 bits, not 168, isn't because only two keys are used instead of three. We only bother using two keys because the effective length of three-key 3DES is still only 112 bits, so there's little reason to bother storing and managing a third.

    The reason the effective length is only 112 bits is something called the "Meet in The Middle" attack. Suppose three keys were used and that the attacker has plaintext and ciphertext to mount a known-plaintext attack. An attacker can apply the first encryption step to the plaintext message using all possible 56-bit keys and then store the results in a big dictionary. Then, the attacker picks a 112-bit key and performs the first two decryption steps on the ciphertext. If the result is in the dictionary, then the attacker has probably found all three keys. If not, he picks another 112-bit key and tries again. So the attacker's work is (a) the effort required to create the dictionary plus (b) the effort required to brute force search a 112-bit keyspace. Since (b) completely dominates (a) we can ignore (a) and use (b) as our estimate of the attack complexity.

    In the case of any quadruple encryption, then, the Meet in the Middle attack would require building a dictionary of all possible encryptions using the first two keys, then brute forcing the space of the last two keys. So, the effective strength is equivalent to the size of two of the four keys. Quintuple encryption is equivalent to three keys. Double encryption is equivalent to one key, which is why 2DES was never used.
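    The meet-in-the-middle trade described above can be sketched end to end on a throwaway 8-bit cipher (this toy is for illustration only; the real attack targets DES-sized keyspaces):

    ```python
    def enc(k, x):
        # toy invertible 8-bit cipher (3 * 171 = 513 = 1 mod 256)
        return ((x ^ k) * 3 + k) % 256

    def dec(k, c):
        return (((c - k) % 256) * 171 % 256) ^ k

    secret = (0x3A, 0xC5)   # the double-encryption keys we "attack"
    pairs = [(p, enc(secret[1], enc(secret[0], p))) for p in (7, 99, 200)]

    # Phase 1: dictionary of every half-encryption of one known plaintext.
    p0, c0 = pairs[0]
    table = {}
    for k1 in range(256):
        table.setdefault(enc(k1, p0), []).append(k1)

    # Phase 2: half-decrypt the ciphertext under every k2, look for a meeting,
    # and confirm candidates against the remaining known pairs.
    found = []
    for k2 in range(256):
        for k1 in table.get(dec(k2, c0), []):
            if all(enc(k2, enc(k1, p)) == c for p, c in pairs[1:]):
                found.append((k1, k2))

    assert secret in found
    # Work: ~2 * 256 cipher operations instead of 256**2 for naive brute force.
    ```

    Scaled up, that 2·2^56 versus 2^112 gap is exactly why double DES was never worth deploying.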

    What does all of this have to do with 1024-bit RSA keys? Not a thing. 1024-bit RSA keys consist of numbers that are the product of two 512-bit prime numbers. That means they're pretty sparse among the set of all possible 1024-bit numbers, and it means they have a particular mathematical structure that can be exploited.

    Symmetric ciphers, like AES, are different. Unless there's something wrong with them, their keyspaces are flat, meaning that if they use n-bit keys, every possible n-bit value is a legitimate key. They have no particular mathematical properties, and there is no way to identify the right one except by trying them all.

    So, assuming that AES doesn't have some weakness that allows attacks faster than brute force to succeed, how long until we need to use keys bigger than 256 bits?

    Forever, basically. Barring weaknesses in the algorithm, 256-bit symmetric keys are safe until, as Bruce Schneier put it "computers are built from something other than matter and occupy something other than space."

    In Applied Cryptography he outlines an interesting little computation to demonstrate why this is. Suppose you had a computer that contained a 256-bit register that was maximally efficient, meaning that toggling a bit required exactly one quantum of energy. Since smaller units of energy don't exist, you can't do better than that[*]. With that assumption, you can calculate how much energy it would take to cycle your 256-bit counter through all possible states. Schneier calculates that if you could capture all of the energy from a typical supernova and run your counter on that, you could count from 0 all the way up through about 2^219. So you'd need about 130 billion supernovas to run your counter through all of its 2^256 possible states.
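    The gap between one supernova-powered counter and the full keyspace is just an exponent subtraction, taking Schneier's 2^219-per-supernova figure quoted above:

    ```python
    # One maximally efficient supernova-powered counter reaches ~2^219 states,
    # so covering all 2^256 keys takes 2^(256 - 219) supernovas.
    supernovas = 2 ** (256 - 219)
    print(supernovas)   # 137438953472, i.e. on the order of 130 billion
    ```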

    That completely ignores the energy you'd need to perform a trial encryption with each of those values, and it also completely ignores how long it would take to perform all of these operations.

    Quantum computers that can somehow handle the complex structures of symmetric ciphers, or some other radical change in computing technology would be required to make 256-bit keys accessible to brute force. A flaw in AES is far more likely, IMO.

    [*] Just because someone will call me on it, I should point out that reversible computing means that in theory you might be able to do better than the theorized maximally-efficient computer. In practice, that probably isn't going to make your energy budget small enough to be reachable, and it certainly isn't going to help with the time factor.

  • by Michael Woodhams ( 112247 ) on Tuesday May 22, 2007 @07:36PM (#19229909) Journal
    Interestingly, the first factor is quite small and trivially easy to find. The following Mathematica code finds it in less than 3.5 seconds on my 4-year-old computer:

    With[{x = 2^1039 - 1}, Prime[Select[Range[1, 360000], (Mod[x, Prime[#]] == 0) &]]]

    Finding the next factor like this (by trial division) should take a mere 10^70 or so times longer.
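    A rough Python equivalent of the parent's Mathematica one-liner, as a sketch: sieve the primes up to a bound that covers the 360,000th prime (a bit above 5.5 million) and trial-divide 2^1039 - 1 by each.

    ```python
    def primes_up_to(limit):
        # simple Sieve of Eratosthenes over a bytearray
        sieve = bytearray([1]) * (limit + 1)
        sieve[0] = sieve[1] = 0
        for i in range(2, int(limit ** 0.5) + 1):
            if sieve[i]:
                sieve[i * i :: i] = bytearray(len(range(i * i, limit + 1, i)))
        return [i for i in range(limit + 1) if sieve[i]]

    x = 2 ** 1039 - 1
    hits = [p for p in primes_up_to(5_600_000) if x % p == 0]
    print(hits)   # [5080711]
    ```

    Only the small factor turns up, as expected: the other two prime factors are 80 and 227 digits long.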

  • by blank axolotl ( 917736 ) on Tuesday May 22, 2007 @07:44PM (#19230005)
    nope, your parent is right. The number starts 589068086... and is 313 digits. The spaces are from slashdot filtering. A post below gives the correct 3 factors, which give 589068086... when multiplied.
  • by Anonymous Coward on Tuesday May 22, 2007 @09:45PM (#19231017)
    The 307 digit number is the correct target of the story. It is one of two cofactors of 2^1039 - 1. The other cofactor, 5080711, was found trivially. So the challenge of factoring the 313 digit number became the challenge of factoring the 307 digit number.
  • by Anonymous Coward on Wednesday May 23, 2007 @07:06AM (#19234049)
    If you know in advance that you're going to start a war, you can do certain things during peacetime that are suspicious but not actually acts of war. For example, hire away the smartest upcoming new scientists in fields which have potential military applications. If they try to return to their home country after the war starts you can imprison them as traitors (don't execute them, that's free propaganda for your enemy), if they remain of their own free will you can use them for your own research programs, either way you deny their talent to your opponent. Or, as in this case, send legitimate researchers to discover things that might be difficult to find out once war is underway. In 1935 a German visitor to England could easily have discovered the location of important military bases, power plants, or heavy industry, drawn them on a map and returned without arousing suspicion. A few years later this would be cause for arrest and imprisonment or perhaps execution on charges of spying.

    After WWI Germany was forbidden by treaty from creating an Air Force, because it was known that air support would be critical in any future war. So the government encouraged its people to learn to fly gliders and other primitive aircraft as an apparently harmless hobby. Of course this training significantly reduced the time needed to create an Air Force once Germany began to ignore the terms of its treaty. On the other hand they made the mistake of outlawing amateur radio, fearing that it would be useful to anti-government activists and, later, to the resistance in occupied countries. This meant that Germany was constantly short of skilled radio people, forcing them to expend more resources on "idiot proof" radar and communications hardware which could be maintained with minimal training and no radio knowledge, whereas Britain was able to easily find experienced amateurs who could operate and maintain their more primitive equipment.
  • Um, AES is elliptic curve? News to me...

    For christsakes, at least read the list; I linked to it. And I did say only the public key algorithms, so AES isn't even relevant.

    NSA Suite B:

    * Advanced Encryption Standard (AES) with key sizes of 128 and 256 bits -- symmetric encryption
    * Secure Hash Algorithm (SHA-256 and SHA-384) -- message digest
    * Elliptic-Curve Menezes-Qu-Vanstone (ECMQV) -- key agreement
    * Elliptic-Curve Diffie-Hellman (ECDH) -- key agreement
    * Elliptic-Curve Digital Signature Algorithm (ECDSA) -- digital signatures

    What I said:
    "...all the PK [public key] systems are based on elliptic curves, and not prime factorization, for the trapdoor function."

    Of the algorithms in Suite B, AES and SHA aren't public-key algorithms; they're a symmetric block cipher and a hash function. The three relevant PK algorithms are ECMQV, ECDH, and ECDSA, and all of those are specifically noted as being "elliptic curve" variants, rather than the more common RSA-style prime-factorization-based algorithms.

    PK algorithms which use elliptic curves use an entirely different set of mathematical functions as the basis for their 'trapdoor' or 'puzzle' (the function that's easy to compute but difficult to run backwards) from RSA. They're based on a variation of the discrete logarithm problem []. (From what I understand, the purest form of the discrete logarithm problem isn't reversible at all -- you can run it in one direction, but from the output you can't figure out with certainty what all of the input parameters were -- so specific variations of the general problem are used, of which elliptic curves are one.)
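    The trapdoor shape is easiest to see in classic finite-field Diffie-Hellman; the elliptic-curve variants replace modular exponentiation with scalar multiplication on a curve, but the one-way structure is the same. A toy sketch with illustrative parameters (the Mersenne prime 2^127 - 1 as the modulus, far too small for real use):

    ```python
    import secrets

    # Demo parameters only -- not from the article, not a safe group for real use.
    p = 2 ** 127 - 1   # a Mersenne prime
    g = 3

    a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

    A = pow(g, a, p)   # easy direction: modular exponentiation
    B = pow(g, b, p)   # recovering b from B is the discrete log problem

    # Both sides derive the same shared secret without revealing a or b.
    assert pow(B, a, p) == pow(A, b, p)
    ```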

    Given the popularity of RSA, I think its absence from the list is notable at the very least, and it's furthermore interesting that the NSA seems to really like elliptic curve systems as a basis for PK crypto. At least according to Wikipedia, nobody has ever published a proof of the mathematical hardness of elliptic curve systems...maybe they're even better than is currently realized. (Although, the real tinfoil hat response is, 'maybe they're really flawed somehow, and that's why the NSA wants you to use them!' However, I think this is doubtful for any number of reasons.)
