Math Encryption Security Science

A Mighty Number Falls 348

space_in_your_face writes "An international team has broken a long-standing record in an impressive feat of calculation. On March 6, computer clusters from three institutions (the EPFL, the University of Bonn, and NTT in Japan) reached the end of eleven months of strenuous calculation, churning out the prime factors of a well-known, hard-to-factor number — 2^1039 - 1 — that is 307 digits long." The lead researcher believes "the writing is on the wall" for 1024-bit encryption. "Last time, it took nine years for us to generalize from a special to a non-special hard-to-factor number (155 digits). I won't make predictions, but let's just say it might be a good idea to stay tuned."
This discussion has been archived. No new comments can be posted.

  • by Hatta ( 162192 ) on Tuesday May 22, 2007 @01:32PM (#19224979) Journal
    I read TFA, it didn't say what the factors were. Does anyone know?
    • by jfengel ( 409917 ) on Tuesday May 22, 2007 @01:36PM (#19225041) Homepage Journal
      Hang on, I'm working on it. I'll get back to you.
    • by IthnkImParanoid ( 410494 ) on Tuesday May 22, 2007 @01:37PM (#19225071)
      They were about to write them down when the computer was destroyed to make way for a hyperspace bypass. I guess we'll find out in 11 months or so.

      On the plus side, the staff has quicker access to the nearest janitorial supply closet.
    • Re:What are they? (Score:5, Informative)

      by Anonymous Coward on Tuesday May 22, 2007 @01:41PM (#19225147)
      2^1039-1=
      1159420574 0725730643698071 48876894640753899791 70201772498686835353882248385
      9966756608 0006095408005179 47205399326123020487 44028604353028619141014409345
      3512334712 7396798885022630 75752809379166028555 10550042581077117617761009413
      7970787973 8061870084377771 86828680889844712822 00293520180607475545154137071
      1023817

      factors:

      5585366661 9936291260 7492046583 1594496864
      6527018488 6376480100 5234631985 3288374753
      ×
      2075818194 6442382764 5704813703 5946951629
      3970800739 5209881208 3870379272 9090324679
      3823431438 8414483488 2534053344 7691122230
      2815832769 6525376091 4101891052 4199389933
      4109711624 3589620659 7216748116 1749004803
      6597355734 0925320542 5523689

      (spaces added because of lameness filter)
      • by brunascle ( 994197 ) on Tuesday May 22, 2007 @01:51PM (#19225311)
        for the love of god, please tell me you got those numbers from the results of the project
      • Re: (Score:2, Informative)

        by Anonymous Coward

        2^1039-1=
        1159420574 0725730643698071 48876894640753899791 70201772498686835353882248385
        9966756608 0006095408005179 47205399326123020487 44028604353028619141014409345
        3512334712 7396798885022630 75752809379166028555 10550042581077117617761009413
        7970787973 8061870084377771 86828680889844712822 00293520180607475545154137071
        1023817

        Um, no it's not - that's somewhere between 2^1016 and 2^1017. Your factorisation is otherwise correct, but these aren't the numbers we're looking for.

        • I would have to say that python agrees with you...

          >>> 2**1039-1
          58906 80864 31683 67664 47387 24917 74762 47119
          38696 45981 50177 53575 68993 76584 32079 46555
          59932 59138 49006 50140 34006 38916 15625 81754
          37632 23144 51080 38858 45624 60719 42881 07610
          69833 17459 92221 53387 11318 93632 01210 62386
          22173 92146 90332 88521 55899 78237 00137 18480
          62018 26907 36866 95341 12523 82072 65913 54912
          10334 38768 44956 20912 65765 28293 887

          however this is 313 not 307 digits as stated in the article... (8 rows w
      • Re: (Score:3, Informative)

        by Hatta ( 162192 )
        Odd, the factors you give do multiply to give the product you say, but according to bc: 2^1039-1=

        58906808643168367664473872491774762471193869645981 501775357568993765\
        84320794655559932591384900650140340063891615625817 543763223144510803\
        88584562460719428810761069833174599222153387113189 363201210623862217\
        39214690332885215589978237001371848062018269073686 695341125238207265\
        91354912103343876844956209126576528293887
      • by DemonThing ( 745994 ) <demonthing@NOspaM.gmail.com> on Tuesday May 22, 2007 @02:09PM (#19225623)
        There are actually three prime factors; the two you listed, and the small factor 5080711. Thus:

        2^1039-1 = 5080711 * 55853666619936291260749204658315944968646527018488 637648010052346319853288374753 * 20758181946442382764570481370359469516293970800739 52098812083870379272909032467938234314388414483488 25340533447691122230281583276965253760914101891052 41993899334109711624358962065972167481161749004803 659735573409253205425523689

        is the correct factorization, as can be readily verified.

        Also:
        http://www.heise.de/english/newsticker/news/90031 [heise.de]
        • Re: (Score:3, Informative)

          I've put these into Mathematica and confirmed they are correct.
        • Re: (Score:3, Informative)

          Interestingly, the first factor is quite small, and trivially easy to find. The following Mathematica code finds it in less than 3.5 seconds on my 4-year-old computer:

          With[{x = 2^1039 - 1}, Prime[Select[Range[1, 360000], (Mod[x, Prime[#]] == 0) &]]]

          Finding the next factor like this (by trial division) should take a mere 10^70 or so times longer.
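
        A quick way to double-check the three-factor decomposition quoted above is to let Python's arbitrary-precision integers redo the multiplication. This is only a sketch, and it assumes the digits survived the lameness filter intact:

          # Multiply the three factors quoted above and compare against 2^1039 - 1.
          p7 = 5080711
          p80 = int(
              "5585366661993629126074920465831594496864"
              "6527018488637648010052346319853288374753")
          p227 = int(
              "2075818194644238276457048137035946951629"
              "3970800739520988120838703792729090324679"
              "3823431438841448348825340533447691122230"
              "2815832769652537609141018910524199389933"
              "4109711624358962065972167481161749004803"
              "659735573409253205425523689")
          assert p7 * p80 * p227 == 2**1039 - 1
          print(len(str(p80 * p227)))  # 307: the "hard" composite the sieving actually cracked

        (The 307-digit number posted above as "2^1039-1" is exactly this p80 * p227 cofactor.)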

      • by Excors ( 807434 )

        You missed one:

        5080711
        x
        5585366661 9936291260 7492046583 1594496864
        6527018488 6376480100 5234631985 3288374753
        ×
        2075818194 6442382764 5704813703 5946951629
        3970800739 5209881208 3870379272 9090324679
        3823431438 8414483488 2534053344 7691122230
        2815832769 6525376091 4101891052 4199389933
        4109711624 3589620659 7216748116 1749004803
        6597355734 0925320542 5523689
        = 2^1039-1

      • by mutende ( 13564 )

        2^1039-1=
        1159420574 0725730643698071 48876894640753899791 70201772498686835353882248385
        9966756608 0006095408005179 47205399326123020487 44028604353028619141014409345
        3512334712 7396798885022630 75752809379166028555 10550042581077117617761009413
        7970787973 8061870084377771 86828680889844712822 00293520180607475545154137071
        1023817
        Which has 313 digits, not 307.
      • Re: (Score:3, Informative)

        by Deanalator ( 806515 )
        Someone correct me if I'm wrong, but isn't one of those factors only 80 digits long? (80/3)*10~= about 267 bits. From what I understand, factoring a number is just as complex as pulling out the smallest factor. That would make this feat roughly equal to factoring the RSA 512, which was done a few years back. 1024 bit RSA uses two 512 bit primes. This is significantly harder than what these guys have done.
      • Re: (Score:3, Funny)

        by phasm42 ( 588479 )
        In binary: 2^1039-1=11111111111...11111111 (1039 '1' bits)
    • by VAXcat ( 674775 ) on Tuesday May 22, 2007 @01:43PM (#19225169)
      I know them, but I can't tell you, since they are also copyrighted AACS keys...
      • Re: (Score:3, Informative)

        by fbjon ( 692006 )
        If you mean 0x09F911029D74E35BD84156C5635688C0, it's not very difficult to factorise actually.
  • that was used for this? I don't think I have to worry about the usability of 1024-bit encryption for a while yet.
    • That's not even the point. The algorithm used to factor 2^k - 1, is generally the SNFS which is a highly optimized variant of the NFS, even faster than the GNFS. To factor RSA numbers you need the GNFS.

      That said, not all 1024-bit numbers are hard to factor; in fact, you have about a 1 in 300 chance of pulling a 1024-bit prime out of your ass. The trick here is that RSA numbers are random and have less algebraic structure than Mersenne numbers.

      Of course, with all that said, people should be using ECC anyways.

      Tom
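
      The "1 in 300" figure above is roughly the prime number theorem at work. A back-of-the-envelope sketch (it only estimates density, it doesn't test primality):

        import math

        # Density of primes near 2^1024 is about 1/ln(N) by the prime number theorem.
        ln_n = 1024 * math.log(2)   # ln(2^1024), about 709.8
        print(1 / ln_n)             # ~0.0014, i.e. roughly 1 in 710 integers
        print(2 / ln_n)             # roughly 1 in 355 if you only try odd candidates

      Which is also why generating the two random primes for an RSA key is cheap compared with factoring their product.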
    • by Sparr0 ( 451780 )
      Two years from now? Almost everyone. :)
    • by Anonymous Cowpat ( 788193 ) on Tuesday May 22, 2007 @01:41PM (#19225135) Journal
      governments. Who, incidentally, are the prime targets for using encryption against.
      • Re: (Score:3, Insightful)

        by nasor ( 690345 )
        Then I suppose I won't expect my 1024 bit encryption to keep my data safe from the NSA, in the same way that I won't expect my home alarm system to protect me from a strike team of navy SEALS.
      • Re: (Score:3, Insightful)

        by ArsonSmith ( 13997 )
        Wow, I thought the prime target to use encryption against were the people trying to break into my bank account. You learn something every day.
    • All you have to do is set up something like distributed.net and you can crank through pretty fast. If hackers can infect millions of systems for massive DDOS attacks I think they could probably create a massive distributed computing platform. Vista will only make things easier because it forces a powerful video card on every system. If the distributed network can harness those video cards for number crunching they'll be a lot faster than networks running on just the CPU.
      • by CastrTroy ( 595695 ) on Tuesday May 22, 2007 @02:00PM (#19225475)
        But with this kind of computation time, you just have to send lots of junk traffic to make them waste all their computing resources. If you send out 500 messages a day, only 1 of which has actual usable information in it, then they are going to be wasting a lot of computing resources just to find out which messages actually have usable information. With computation times this high, it would be easy to flood them with data so that they wouldn't have enough time to decrypt everything.
        • Re: (Score:3, Funny)

          So that's what the spammers are doing. Does that mean that 1/500 v1agra messages is really sekret US intelligence?
          • by CastrTroy ( 595695 ) on Tuesday May 22, 2007 @02:22PM (#19225845)
            Really it's not that bad of an idea. Create something that looks like image spam. Hide the encrypted information using steganography in the image, and send it out to millions of people, including the intended recipient. Everybody except the intended recipient deletes the message. It makes it harder to track down who you are communicating with, and harder to find out which messages actually contain useful information. (A minimal sketch of the idea appears after this sub-thread.) It's similar to the olden days, when people used to put a secret message in the classifieds of the newspaper. Only the people who knew it was supposed to be there could actually get the hidden message, but it was there for everyone to see.
            • This has already been done as early as 10 years ago.

              I was working in Eastern Europe on a now unclassified project, targeting a low-budget illegal foreign intelligence agency. They were selling and distributing porn CDs and DVDs with thousands of pictures, one or more of which would contain an encrypted steganographic message. Their contact would purchase the DVD at one of hundreds of little markets, and decrypt the proper image(s).

              It was really quite a good plan. Not only were there many possible valid messages to one or more agents, but there were also an unknown number of false messages; they may even have all been false messages that could only be put together by inference. However, since they were encrypted with PGP, we were never able to break that particular system before I left the project.

              The real genius of the plan was that it brought them in some much needed cash as well.
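
            As promised above, a toy illustration of the hide-it-in-an-image idea: least-significant-bit embedding over a bytearray standing in for raw pixel data. It is only a sketch -- a real scheme would encrypt the payload first and write out an actual lossless image format:

              # Toy LSB steganography. "pixels" stands in for decoded image bytes
              # (e.g. RGB values); the payload rides in the low bit of each byte.

              def embed(pixels: bytearray, payload: bytes) -> bytearray:
                  out = bytearray(pixels)
                  header = len(payload).to_bytes(4, "big") + payload
                  bits = [(byte >> i) & 1 for byte in header for i in range(7, -1, -1)]
                  if len(bits) > len(out):
                      raise ValueError("cover too small for payload")
                  for i, bit in enumerate(bits):
                      out[i] = (out[i] & 0xFE) | bit
                  return out

              def extract(pixels: bytes) -> bytes:
                  def read(n_bytes, offset):
                      val = 0
                      for i in range(n_bytes * 8):
                          val = (val << 1) | (pixels[offset + i] & 1)
                      return val.to_bytes(n_bytes, "big")
                  length = int.from_bytes(read(4, 0), "big")
                  return read(length, 32)

              cover = bytearray(range(256)) * 4        # fake 1024-byte "image"
              stego = embed(cover, b"meet at dawn")
              assert extract(stego) == b"meet at dawn"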

  • by Raul654 ( 453029 ) on Tuesday May 22, 2007 @01:34PM (#19225009) Homepage
    For an embarrassingly parallel, strictly integer application like this, I think the logical next step is to attack it with FPGAs. For such an application, it wouldn't surprise me if a large Altera FPGA could give you at least the same computation power as a large cluster, for a fraction of the price (both for the hardware and the electricity to power the thing).
  • Security (Score:3, Insightful)

    by morgan_greywolf ( 835522 ) * on Tuesday May 22, 2007 @01:35PM (#19225021) Homepage Journal
    "Security is about risk management. If you have something to protect that's valuable enough for someone to steal, and the only protection you have on it is 1,024-bit crypto, you deserve to have it stolen." -- Forgot who said it, but it was on /.
  • I can see the RIAA filing the lawsuit on a DMCA violation now... "That's our prime number/integer."
  • I understand that they'll be able to crack 1024, but still, 3 years to see my e-mails. It's not worth it for them. Now when they got it down to 3 hours I'll be worried, but by then we'll probably be using 4096.
    • I understand that they'll be able to crack 1024, but still, 3 years to see my e-mails. It's not worth it for them. Now when they got it down to 3 hours I'll be worried, but by then we'll probably be using 4096.

      True, but what you need to think about is forward secrecy.

      There are lots of things being transmitted today that are still going to be in use three years from now. For example, think of financial information: if you use an encryption standard that's acceptable right now, but can be broken in three years (or, is trivially breakable in three years due to increases in computer power or techniques), then you're in trouble, because some of that information is still going to be sensitive/valuable in three years. The fact that you'll be using 4096 bits then doesn't matter, if someone grabs it now and crunches on it for a while. Same with identification numbers (SSNs, etc); if I grab a batch of numbers today, most of them will probably still be good in ten or fifteen years, and some of them will still be good in 30 or 40. That's how far out you need to be thinking when choosing an encryption standard for that data.

      There are some things where only immediate security matters (transmitting big session keys that get thrown away a few hours or minutes later), but many other things -- and I think general file encryption falls into this category -- where it's hard to predict for how long the encrypted information might be sensitive or valuable.
      • Re: (Score:3, Informative)

        by kju ( 327 )
        Even worse: When your key can be cracked in 10 years, someone can create false signatures in your name dated 10 years back. Think about long-running contracts etc....

        We have in Germany some really brain-fucked law about the requirement of digital signatures (S/MIME based) on electronic invoices, but one idea they actually got right: you will get an invoice which is signed by the vendor. If you are required to keep incoming invoices (businesses) every once in a while you need to take the current file and sig
    • by rworne ( 538610 )
      Actually, one should be worried when the length of time it takes to crack it falls within a time span that is less than the statute of limitations.
  • Rather than just digesting using some key, it seems to me that you could set up two 'encryption' agents which talk to each other and form a random proprietary "language" that only each other can understand. This would be very much like a one time pad [wikipedia.org] - which is basically the only truly unbreakable encryption:

    Code Talkers.

    The Navajo language basically served as a one time pad in WWII - why not use programs which generate their own method of communication (their own "language") for use in transmitting inform
    • uh huh. Right. Let's see you write a paper and an example implementation of that. Good luck.
    • Rather than just digesting using some key, It seems to me that you could set up two 'encryption' agents which talk to each other and form a random proprietary "language" that only each other can understand.

      You mean, like generating an analogous OTP out of a pseudo-random number generator? Not only has that been done before [wikipedia.org], but you're still left with a key: the seed which produced the pseudo-random sequence.

      The Navajo code-talkers worked because the encoding was extremely obscure (security through obscurity at its finest!) and cryptography was still in its infancy. I sincerely doubt that the Navajo codes would stand up to a modern cryptographic analysis.

      http://en.wikipedia.org/wiki/Navajo_Code_Talkers [wikipedia.org]
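
      To make the "you're still left with a key" point concrete, here is a deliberately insecure sketch: a keystream generated from a seeded PRNG, XORed onto the message. random.Random is nothing like a real stream cipher; the point is only that the whole private "language" collapses to whoever knows the seed:

        import random

        def keystream_xor(seed: int, data: bytes) -> bytes:
            # The shared seed regenerates the same pad on both ends; XOR applies it.
            rng = random.Random(seed)
            pad = bytes(rng.randrange(256) for _ in data)
            return bytes(a ^ b for a, b in zip(data, pad))

        ct = keystream_xor(123456789, b"attack at dawn")
        assert keystream_xor(123456789, ct) == b"attack at dawn"

      Swap the PRNG for a keyed block cipher in counter mode and this is just a standard stream cipher -- i.e. exactly the "been done before" the parent mentions.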
    • Isn't this the way some cryptography systems work? Using Diffie-Hellman key exchange to decide a secret key. Assuming nobody knows the key is what makes it secure. Just like in WWII, they assumed the enemy didn't understand Navajo. I'm not sure what kind of computing would be necessary for the computers to agree on a decryption/encryption language. They'd probably have a set list of ciphers that they both supported. I don't think there's any way to create strong ciphers on the fly. Another problem is h
    • You simply could not crack it unless you already knew the information being sent.

      Perfectly secure methods (one time pad) are perfectly secure because, even if you have the cryptotext, the probability that it corresponds to any particular plaintext is the same for all plaintexts if you don't know the key (e.g. if you knew the cryptotext encoded one of two possible plaintexts, the probability that it was one or the other is 0.5, regardless of what you know about the algorithm).

      The Navajo language is an example of sec [wikipedia.org]

    • If I understand correctly this would be security via obscurity. All you'd have to do is learn the language and emulate it.
    • by wfberg ( 24378 ) on Tuesday May 22, 2007 @02:20PM (#19225791)
      The Navajo language basically served as a one time pad in WWII

      No, they served as code-talkers. A one-time pad is a system whereby every bit of the encryption key is independent of the others (never reused, unlike codewords) and entropy is maximal. Simply translating stuff from one word to another is simple substitution, a simple code.

      The reason Navajo Code Talkers were successful wasn't because the scheme was particularly advanced. In fact, it would have been computationally trivial to break. However, the messages relayed were only ever "tactical" in nature; i.e. communications in the field, of use during a fight, but old news in about 10 minutes. Had Navajo code talking been used to relay top-secret messages, it would have been broken fairly quickly. The reason for its success was that it was extremely cheap to implement for the US, and the secrets protected weren't valuable enough to spend huge effort on breaking. Economics, rather than mathematics.

      Navajo wasn't used in Europe, because Germany had sent anthropologists to the US to learn native languages, anticipating precisely this scheme.
  • by Anonymous Coward on Tuesday May 22, 2007 @01:37PM (#19225075)
    NSA research indicates that 1024-bit encryption is unbreakable and everyone should be using it.
  • this too (Score:5, Funny)

    by Himring ( 646324 ) on Tuesday May 22, 2007 @01:38PM (#19225077) Homepage Journal
    Knowing this, too, will not help you pick up chicks in a bar....

  • by JohnA ( 131062 ) <johnanderson.gmail@com> on Tuesday May 22, 2007 @01:40PM (#19225119) Homepage
    ...is that most Certificate Authorities who have trusted certs in the major browsers / e-mail programs will NOT sign a certificate for any keysize greater than 1024 bits.

    This artificial limitation is going to become more and more glaringly obvious as time goes on.
    • by Kadin2048 ( 468275 ) * <slashdot...kadin@@@xoxy...net> on Tuesday May 22, 2007 @02:08PM (#19225589) Homepage Journal
      I hate to be the guy who pulls out the tinfoil, but why not.

      A few weeks ago I was reading Steven Levy's Crypto (not a bad book, although a little out-of-date now, but it brings back the dot-com nostalgia), in which he spends a lot of time describing the NSA's objections to strong civilian crypto in the U.S. in the 80s and early 90s. They went from absolutely opposing civilian crypto (particularly public-key stuff) tooth and nail, to suddenly just throwing in the towel. I'm sure that much of that was just political pragmatism -- with the Cold War over, they were having a harder and harder time maintaining their objections in the face of 'progress' (in the form of a lot of pressure on Congress from business and the tech sector) -- but I can't help wondering if they figured something out that made them withdraw their objections to bigger key sizes.

      Particularly since it's now known that some people on the government side knew about public-key crypto before it became public (the early-70s GCHQ paper, and I find it hard to believe that at its peak during the Cold War, someone at the NSA didn't find the same thing), they've had a long time to work on the problem. It's possible that they just totally squandered whatever lead they had and are now at the same point as the unclassified world, but that seems unlikely to me.
  • -1 author stupidity (Score:5, Informative)

    by tomstdenis ( 446163 ) <tomstdenis.gmail@com> on Tuesday May 22, 2007 @01:41PM (#19225129) Homepage
    SNFS != GNFS. Factoring specific 1024-bit numbers of that form isn't always super hard.

    That they pulled off an SNFS on a 1024-bit number is cool, but not the same amount of work as a GNFS against a 1024-bit RSA key.

    Tom
  • on the wall, eh? (Score:4, Insightful)

    by Lord Ender ( 156273 ) on Tuesday May 22, 2007 @01:43PM (#19225167) Homepage
    Considering RSA Inc. sells X.509 token/smart card devices which support ONLY 1024-bit keys, I don't think it's going anywhere for a while.
  • One down, (Score:4, Funny)

    by seaturnip ( 1068078 ) on Tuesday May 22, 2007 @01:45PM (#19225203)
    infinity left to go!
  • But what about the other forms of public key encryption? Wikipedia also lists Diffie-Hellman, ElGamal, Elliptic Curve and others.
  • by blantonl ( 784786 ) on Tuesday May 22, 2007 @01:48PM (#19225269) Homepage
    What exactly do they mean by the "the writing is on the wall" for 1024-bit encryption? Does the 307 digit long set of prime factors have some bearing on the ability to break encryption, or is it just a reference to the amount of sheer computing power out in the industry today?

    I'm having a hard time making the correlation.

  • by iamacat ( 583406 ) on Tuesday May 22, 2007 @01:48PM (#19225273)
    I just factored 2^2048 in a few milliseconds on a single computer. Your bank account balance was just donated to support world peace. RSA is doomed? Oh, wait? Are you saying RSA is based on numbers which are products of two large primes, not just some numbers with lots of small factors? Bummer!
  • ummm.... (Score:2, Offtopic)

    by Ace905 ( 163071 )
    Ok, I know this is an overplayed argument - the 'humanity' card. Like when NASA announces they've found a way to get 3 men to the moon for just under 8 billion dollars - and people say, "Umm, couldn't we use 8 billion dollars in Florida for our worst-in-the-country school system?"

    Obviously, that's a long and involved argument. But in this case - factoring a very large number - just by using methods we *knew* would work - but had never dedicated the resources to - what kind of real progress is that?
    • by Meostro ( 788797 )
      It might be a better idea to use the CPU hours for something else, but in terms of either mathematics or CS this is pretty big.

      Lenstra was even quoted in the article to counter your argument of no progress/learning - "We have more powerful computers, we have come up with better ways to map the algorithm onto the architecture, and we take better advantage of cache behavior." This is an incremental improvement that gives us benefits outside of just factorization, so I would say that the greater good may have
  • by Palmyst ( 1065142 ) on Tuesday May 22, 2007 @02:03PM (#19225525)
    From http://www.ddj.com/blog/portal/archives/2007/05/world_record_fo.html [ddj.com] Using the sieve program developed at the University of Bonn, NTT, EPFL, and the University of Bonn respectively provided 84.1 percent, 8.3 percent, and 7.6 percent of the calculation resources, and the calculation amount was equivalent to 95 years of operation on a 3-GHz Pentium D. PC clusters at NTT and EPFL, consisting of 110 and 36 PCs, respectively, were run in parallel for more than two months for the calculations. The results were 47 non-trivial solutions of the simultaneous equations defined by an approximately 70,000,000 x 70,000,000 sparse linear matrix.
  • The largest RSA number so far factored is only 663 bits. 512 bits in 1999, 576 bits in 2003, 663 bits in 2005. Call it 100 bits improvement in 2 years. At this rate we should be due for a 700 and some bit number this year, with 1024 bits 5-10 years away.

    The RSA Factoring Challenge [wikipedia.org] has been suspended, i.e. they are no longer giving out prize money, but the numbers still stand as a good reference for where we are in comparison to 1024. There's a lot of mileage between here and there.
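
    For what it's worth, the back-of-the-envelope arithmetic behind that extrapolation looks like this (purely illustrative; factoring progress has never been a straight line):

      # Publicly factored RSA challenge sizes, in bits, by year.
      records = {1999: 512, 2003: 576, 2005: 663}
      slow = (663 - 512) / (2005 - 1999)      # ~25 bits/year over the whole span
      fast = (663 - 576) / (2005 - 2003)      # ~44 bits/year over the last step
      print(2005 + (1024 - 663) / fast)       # ~2013 at the recent rate
      print(2005 + (1024 - 663) / slow)       # ~2019 at the long-run rate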
  • Quadruple AES? (Score:3, Interesting)

    by W2k ( 540424 ) on Tuesday May 22, 2007 @02:06PM (#19225561) Journal
    I'm hoping there are some crypto geeks in the audience who can answer this. I know that back in the days when DES (with 56-bit keys) was the best there was, some genius invented TDES, which was simply three passes of DES, for a total key length of 168 bits. However, running DES thrice does not triple the "security" (resistance to brute-force cracking) of the cipher; rather, the 168-bit key provides security equal to that of a 112-bit key due to some mathematical technicality that I've forgotten.

    Now for my actual question. There isn't a symmetric crypto algorithm that I know of that can use 1024-bit keys (except for stream ciphers, maybe RC4?); the best block cipher is AES (Rijndael), which supports 256-bit keys. If one were to "invent" QAES, i.e. quadruple AES, for a total key length of 1024 bits, what would the "effective" key length be?
    • Re:Quadruple AES? (Score:4, Informative)

      by Meostro ( 788797 ) on Tuesday May 22, 2007 @02:34PM (#19226031) Homepage Journal
      It depends entirely on how you're doing your QAES.

      The standard 3DES process is 3DES-EDE which uses 2 keys, thus giving you 112 bits.
      ENCRYPT data with key1
      DECRYPT output with key2
      ENCRYPT output with key1

      Since DES is symmetric, any paired combination of encrypt and decrypt will give you the same result. You can do E(D(data)) to get your result, or you can use D(E(data)) for the same thing. If you used the same key for key1 and key2, this would be the same as doing regular DES, and would just take 3x as long.

      If you used three different keys for your 3DES instead, you would have the 168-bit key length. Thus, you can apply the same concept to 4AES, and depending on which way you do it you will end up with 256-, 512-, 768- or 1024-bit key strength.
    • by swillden ( 191260 ) * <shawn-ds@willden.org> on Tuesday May 22, 2007 @04:18PM (#19227871) Journal

      The reason 3DES provides an effective key length of 112 bits, not 168, isn't because only two keys are used instead of three. We only bother using two keys because the effective length of three-key 3DES is still only 112 bits, so there's little reason to bother storing and managing a third.

      The reason the effective length is only 112 bits is something called the "Meet in The Middle" attack. Suppose three keys were used and that the attacker has plaintext and ciphertext to mount a known-plaintext attack. An attacker can apply the first encryption step to the plaintext message using all possible 56-bit keys and then store the results in a big dictionary. Then, the attacker picks a 112-bit key and performs the first two decryption steps on the ciphertext. If the result is in the dictionary, then the attacker has probably found all three keys. If not, he picks another 112-bit key and tries again. So the attacker's work is (a) the effort required to create the dictionary plus (b) the effort required to brute force search a 112-bit keyspace. Since (b) completely dominates (a) we can ignore (a) and use (b) as our estimate of the attack complexity.

      In the case of any quadruple encryption, then, the Meet in the Middle attack would require building a dictionary of all possible encryptions using the first two keys, then brute forcing the space of the last two keys. So, the effective strength is equivalent to the size of two of the four keys. Quintuple encryption is equivalent to three keys. Double encryption is equivalent to one key, which is why 2DES was never used.

      What does all of this have to do with 1024-bit RSA keys? Not a thing. 1024-bit RSA keys consist of numbers that are the product of two 512-bit prime numbers. That means they're pretty sparse among the set of all possible 1024-bit numbers, and it means they have a particular mathematical structure that can be exploited.

      Symmetric ciphers, like AES, are different. Unless there's something wrong with them, their keyspaces are flat, meaning that if they use n-bit keys, every possible n-bit value is a legitimate key. They have no particular mathematical properties, and there is no way to identify the right one except by trying them all.

      So, assuming that AES doesn't have some weakness that allows attacks faster than brute force to succeed, how long until we need to use keys bigger than 256 bits?

      Forever, basically. Barring weaknesses in the algorithm, 256-bit symmetric keys are safe until, as Bruce Schneier put it "computers are built from something other than matter and occupy something other than space."

      In Applied Cryptography he outlines an interesting little computation to demonstrate why this is. Suppose you had a computer that contained a 256-bit register that was maximally efficient, meaning that toggling a bit required exactly one quantum of energy. Since smaller units of energy don't exist, you can't do better than that[*]. With that assumption, you can calculate how much energy it would take to cycle your 256-bit counter through all possible states. Schneier calculates that if you could capture all of the energy from a typical supernova and run your counter on that, you could count from 0 all the way up through about 2^219. So you'd need about 130 billion supernovas to run your counter through all of its 2^256 possible states.

      That completely ignores the energy you'd need to perform a trial encryption with each of those values, and it also completely ignores how long it would take to perform all of these operations.

      Quantum computers that can somehow handle the complex structures of symmetric ciphers, or some other radical change in computing technology would be required to make 256-bit keys accessible to brute force. A flaw in AES is far more likely, IMO.

      [*] Just because someone will call me on it, I should point out that reversible computing means that in theory you might be able to do better than the theorized maximally-efficient computer. In practice, that probably isn't going to make your energy budget small enough to be reachable, and it certainly isn't going to help with the time factor.
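
      A scaled-down demonstration of the meet-in-the-middle idea described above. The cipher here is a made-up 16-bit-key, 32-bit-block toy (nothing to do with real DES or AES); the point is only that breaking double encryption costs about two single-key searches plus a table, not a squared search:

        MASK = 0xFFFFFFFF

        def toy_enc(block, key):
            # Four rounds of xor-with-key, add-constant, rotate-left-5.
            x = block & MASK
            for r in range(4):
                x = ((x ^ (key + r)) + 0x9E3779B9) & MASK
                x = ((x << 5) | (x >> 27)) & MASK
            return x

        def toy_dec(block, key):
            # Exact inverse of toy_enc: undo the rounds in reverse order.
            x = block & MASK
            for r in reversed(range(4)):
                x = ((x >> 5) | (x << 27)) & MASK
                x = (x - 0x9E3779B9) & MASK
                x ^= (key + r)
            return x

        k1, k2 = 0x1234, 0xBEEF                 # the "unknown" double-encryption keys
        P1, P2 = 0xCAFEBABE, 0x00C0FFEE         # two known plaintext/ciphertext pairs
        C1 = toy_enc(toy_enc(P1, k1), k2)
        C2 = toy_enc(toy_enc(P2, k1), k2)

        # Meet in the middle: about 2 * 2^16 cipher operations instead of 2^32.
        middle = {toy_enc(P1, k): k for k in range(1 << 16)}   # (ignores rare collisions)
        for kb in range(1 << 16):
            ka = middle.get(toy_dec(C1, kb))
            if ka is not None and toy_enc(toy_enc(P2, ka), kb) == C2:
                print("recovered keys:", hex(ka), hex(kb))
                break

      Scale the keys up to 56 or 128 bits and the table becomes the impractical part, but the work-factor argument above stays the same.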

  • any security measure built by a man can also be broken by a man

    there is absolutely no such thing as 100% security

    and there never will be

    for most of us, 99.9999999999999999999999% security will do

    for the rest, sweaty heart palpitations and paranoid schizophrenia will do
  • by wamatt ( 782485 ) on Tuesday May 22, 2007 @02:30PM (#19225979)
    Not sure if this is a new idea, but this topic got me thinking. Decrypting something is really just a mathematical transform. We say it's "decrypted" if the end result "makes sense". But what if we didn't know what the final data should look like? How would we ever know it was decrypted?

    Decryption itself only makes sense once we know what method was used, i.e. RSA, DES, Blowfish etc. However, what if that algorithm itself was dynamic and formed part of the encryption? Sort of like a more generalised version of onion encryption, i.e. encrypting the same content a number of times using different algorithms, so that the algorithms used and the sequence in which they are used form a sort of "meta-key".

    • Re: (Score:3, Insightful)

      by Detritus ( 11846 )
      Unless your opponent is encrypting white noise for kicks, the result of a successful decryption is going to have statistical properties that are significantly different from those of an unsuccessful decryption. Of course, it helps if you have more information, such as knowing that the source material is ASCII text, or that every file starts with a known magic number.
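
      A crude version of that statistical test -- byte entropy alone separates ASCII-looking plaintext from the random-looking output of a failed decryption (real tools use far better models, e.g. n-gram statistics):

        import math
        import random
        from collections import Counter

        def bits_per_byte(data: bytes) -> float:
            # Shannon entropy of the byte histogram: ~8 for random data, much lower for text.
            counts = Counter(data)
            n = len(data)
            return -sum(c / n * math.log2(c / n) for c in counts.values())

        text = b"The quick brown fox jumps over the lazy dog. " * 20
        junk = bytes(random.randrange(256) for _ in range(len(text)))
        print(bits_per_byte(text))   # roughly 4 bits/byte
        print(bits_per_byte(junk))   # close to 8 bits/byte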
  • by Podcaster ( 1098781 ) on Tuesday May 22, 2007 @02:59PM (#19226429) Homepage Journal

    From TFA:

    Is the writing on the wall for 1024-bit encryption? "The answer to that question is an unqualified yes," says Lenstra. For the moment the standard is still secure, because it is much more difficult to factor a number made up of two huge prime numbers, such as an RSA number, than it is to factor a number like this one that has a special mathematical form. But the clock is definitely ticking. "Last time, it took nine years for us to generalize from a special to a non-special hard-to-factor number (155 digits). I won't make predictions, but let's just say it might be a good idea to stay tuned."

    Reading Lenstra's comments, I get the feeling that he has a fairly high degree of confidence that they will succeed in making the leap to a mathematical generalization within a modest time frame.

    Can any security researchers tell me what GPG key length I should be using in the real world to give me a good trade-off between computational simplicity and future security please? I'm only using crypto for email and secure file storage.

    -P

  • by trifish ( 826353 ) on Tuesday May 22, 2007 @04:05PM (#19227647)
    The lead researcher believes "the writing is on the wall" for 1024-bit encryption.

    It should be noted (before people start panicking) that the 1024-bit key size refers to asymmetric or public-key cryptography (software like PGP, GPG, algorithms like DH/DSS, RSA), not to symmetric cryptography (software like TrueCrypt, algorithms like AES, Blowfish, Twofish, etc.).

    A 256-bit symmetric key is completely infeasible to brute force, and even if quantum computers become available, the complexity of a brute-force attack (via Grover's algorithm) will still be 2^128, which is again infeasible.
