Encryption Math

Mathematician Warns US Spies May Be Weakening Next-Gen Encryption (newscientist.com) 78

Matthew Sparkes reports via New Scientist: A prominent cryptography expert has told New Scientist that a US spy agency could be weakening a new generation of algorithms designed to protect against hackers equipped with quantum computers. Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for "post-quantum cryptography" (PQC). He also believes that NIST has made errors -- either accidental or deliberate -- in calculations describing the security of the new standards. NIST denies the claims.

Bernstein alleges that NIST's calculations for one of the upcoming PQC standards, Kyber512, are "glaringly wrong," making it appear more secure than it really is. He says that NIST multiplied two numbers together when it would have been more correct to add them, resulting in an artificially high assessment of Kyber512's robustness to attack. "We disagree with his analysis," says Dustin Moody at NIST. "It's a question for which there isn't scientific certainty and intelligent people can have different views. We respect Dan's opinion, but don't agree with what he says." Moody says that Kyber512 meets NIST's "level one" security criteria, which makes it at least as hard to break as a commonly used existing algorithm, AES-128. That said, NIST recommends that, in practice, people should use a stronger version, Kyber768, which Moody says was a suggestion from the algorithm's developers.

NIST is currently in a period of public consultation and hopes to reveal the final standards for PQC algorithms next year so that organizations can begin to adopt them. The Kyber algorithm seems likely to make the cut as it has already progressed through several layers of selection. Given the NSA's secretive nature, it is difficult to say for sure whether or not the agency has influenced the PQC standards, but there have long been suggestions and rumors that the agency deliberately weakens encryption algorithms. In 2013, The New York Times reported that the agency had a budget of $250 million for the task, and intelligence agency documents leaked by Edward Snowden in the same year contained references to the NSA deliberately placing a backdoor in a cryptography algorithm, although that algorithm was later dropped from official standards.

  • Why are we still using 128 bit algorithms at all? Is there a good reason why we don't set the bar at 1024 bit minimum with 2048 or 4096 mandatory for banking or high security applications???
    • We use 4096 on newer equipment for ssh. When it comes to public facing ssl, I think higher encryption still requires a lot of processing. (and extra cost)
      • I use 16384 for public facing ssh.....
      • by Anonymous Coward

        If a government picked an intentionally broken algorithm the number of bits won't matter.

        Probably safest to cascade two different opposing government's standards.

        That way they'd both have to collude to use a backdoor'd algorithm.

        • If a government picked an intentionally broken algorithm the number of bits won't matter.

          Probably safest to cascade two different opposing government's standards.

          That way they'd both have to collude to use a backdoor'd algorithm.

          It looks like this is how it is panning out. They are stacking the deck so Kyber wins, because the NSA wants Kyber512, because they can break it.
          NTRU-prime should have won on all the metrics.

          So the PQC KEM competition is compromised and should not be trusted.

            • It looks like this is how it is panning out. They are stacking the deck so Kyber wins, because the NSA wants Kyber512, because they can break it.
              NTRU-prime should have won on all the metrics.

              So the PQC KEM competition is compromised and should not be trusted.

              Did they not think "If we can break this, who else can?" or don't they even care?

              • Did they not think "If we can break this, who else can?" or don't they even care?

                They certainly don't care. Look at the history with Dual-EC-DRBG, slow responses to SHA-1 breaks, the entropy-reducing CRNGT, and the SP 800-90A DFs.

                • You can add Kyber-512 to the list: NIST, under pressure from the NSA, is forcing Kyber512 to be the winner of the PQC-KEM competition over obviously better alternatives, including NTRU-prime, and is overstating the cryptographic strength of the algorithm to give the impression that it meets the required security properties (i.e. at least as strong as AES-128) when it clearly fails to meet that requirement.

                https://blog.cr.yp.to/20231003... [cr.yp.to]

      • by ls671 ( 1122017 ) on Saturday October 14, 2023 @03:24AM (#63924307) Homepage

        We use 4096 on newer equipment for ssh. When it comes to public facing ssl, I think higher encryption still requires a lot of processing. (and extra cost)

        Not so much more IMHO; keep in mind that most encrypted communications use something like an ephemeral shared key, which is much easier to crack than the 4096-bit key used by the cert. Renegotiation of the shared key may happen several times during a session, and the 4096-bit key is only used during such negotiations.

        The point to remember is that your 4096-bit key isn't used at all for actually encrypting the data being transmitted, only to negotiate a key to encrypt that data, in most protocols anyway, especially TLS (SSL).
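        To make the parent's point concrete, here is a minimal Python sketch of that hybrid pattern, using the third-party cryptography package; the key sizes, payload, and variable names are illustrative assumptions, not how any particular TLS stack actually does it:

        ```python
        # Minimal sketch of the hybrid pattern described above: a random 128-bit
        # session key encrypts the bulk data, and the 4096-bit RSA key is only
        # used to wrap that session key. Names and payload are illustrative.
        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
        public_key = private_key.public_key()

        session_key = AESGCM.generate_key(bit_length=128)  # key that actually encrypts data
        nonce = os.urandom(12)
        ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual payload", None)

        # The RSA key is used once, to wrap the session key, never on the payload itself.
        wrapped_key = public_key.encrypt(
            session_key,
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )
        ```

        In this sketch the RSA key is touched exactly once, to wrap the session key; all of the actual payload encryption is done by AES.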

      • by sinkskinkshrieks ( 6952954 ) on Saturday October 14, 2023 @04:11AM (#63924345)
        If you're talking about RSA 4096, which can be used in both SSH and TLS, the public modulus doesn't have maximum bit entropy because it is a product of primes, so its bit length can't be compared directly with a symmetric key size such as AES-256.
    • AES 128 is still very secure
        Safe for an "average home keyboard warrior", sure, but throw in a government-backed attacker or just a bit of quantum and that goes right out the window...
        • no (Score:2, Informative)

          by Anonymous Coward
          AES is not vulnerable to quantum computers. It's possible (though not terribly likely) that the government has cooked up a way to crack AES as well, but normally the way it would be attacked is by cracking the initial key exchange, which gives up the AES key.
        • by sjames ( 1099 )

          In fact, according to Schneier, there is reason to believe AES-256 might be MORE vulnerable than AES-128. Quantum is useless against AES.

      • by Rosco P. Coltrane ( 209368 ) on Friday October 13, 2023 @10:41PM (#63924035)

        AES 128 is still very secure

        That is not the issue.

        The thing people fear is that nefarious corporate surveillance outfits and government agencies are quietly storing all encrypted data traffic for later analysis in the future when computers (traditional or quantum) become powerful enough to decrypt it.

        That's what quantum-proof cryptography is about: there is no quantum computer today that's capable of decrypting today's state-of-the-art encryption schemes but there will be. People are busy devising new encryption methods to future-proof the encrypted data generated today.

        So yeah, AES might be okay today. But then don't encrypt something that you don't want in the clear tomorrow. If you need to protect something critical that won't matter tomorrow, then it's fine I guess.

        • That's what quantum-proof cryptography is about: there is no quantum computer today that's capable of decrypting today's state-of-the-art encryption schemes but there will be. People are busy devising new encryption methods to future-proof the encrypted data generated today.

          So yeah, AES might be okay today. But then don't encrypt something that you don't want in the clear tomorrow. If you need to protect something critical that won't matter tomorrow, then it's fine I guess.

          Symmetric algorithms like AES are at no known serious risk from quantum computers in the first place.

          The summary is confused. AES and Kyber are different things: AES is a symmetric encryption algorithm, while Kyber is a key-encapsulation (key agreement) scheme.

          What a code-breaking class of quantum computer would put at risk in the store-today/decrypt-tomorrow scenario is the key exchange that present-day schemes rely on for forward secrecy, not brute-forcing AES.

          That's what quantum-proof cryptography is about: there is no quantum computer today that's capable of decrypting today's state-of-the-art encryption schemes but there will be.

          And at present nobody has any clue how to achieve it. The assumption that it will ever happen has no objective basis.

      • AES128 implementations as embedded in cheap Chinese tat are all broken and don't actually do anything at all.
    • by quenda ( 644621 ) on Saturday October 14, 2023 @12:54AM (#63924187)

      Why are we still using 128 bit algorithms ... 2048 or 4096 mandatory for banking or high security applications???

      You are confusing public keys with conventional symmetric keys. Those numbers are apples and oranges. Two very different sorts of algorithms. Nobody is using 128-bit public key encryption, ever.
      But you might use a 2048 bit public/private key to encrypt a 128-bit session key. That 128-bit key is used both to encrypt and decrypt your document.

      https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by sinkskinkshrieks ( 6952954 ) on Saturday October 14, 2023 @04:06AM (#63924337)

      You're confusing a number of things.

      The number of bits in different types of encryption algorithms (I assume you mean private and symmetric key lengths) aren't apples-to-apples comparable. Public-key cryptography uses longer keys (built from primes) that don't have the bit entropy of symmetric (CSPRNG- or PBKDF-derived) keys. EC keys have higher bit entropy than RSA-style public keys, but still less than symmetric keys.

      Increasing the # of rounds can create as much work as you want or have computing resources to accomplish.

      For a given use, it's best to put together an attack cost-based risk mitigation plan where the design parameters (rounds, work factors) increase over time. Also, building for interchangeable hashing, HMAC, encryption, key exchange, etc. is helpful.

      I suggest taking Dan Boneh's intro cryptography course because it's very good.
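      To illustrate the tunable-work-factor point above, here is a minimal sketch using only the Python standard library; the passphrase and iteration count are arbitrary illustrative values, not recommendations:

      ```python
      # Minimal sketch of a tunable work factor: PBKDF2 from the Python standard
      # library, where raising the iteration count raises the attacker's cost.
      # The passphrase and iteration count below are illustrative only.
      import hashlib
      import os

      salt = os.urandom(16)
      iterations = 600_000  # design parameter that can be increased over time

      key = hashlib.pbkdf2_hmac("sha256", b"example passphrase", salt, iterations, dklen=32)
      ```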

    • by Entrope ( 68843 )

      The 1024-, 2048- and 4096-bit levels you mention are on a different scale than the 128-bit level of AES-128. AES-128 is comparable to 3072-bit RSA or (non-elliptic-curve) Diffie-Hellman. https://en.wikipedia.org/wiki/... [wikipedia.org]
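      For reference, the comparable-strength figures commonly cited from NIST SP 800-57 (approximate equivalences):

      Symmetric    RSA/DH modulus    Elliptic curve
      112 bits     2048 bits         224 bits
      128 bits     3072 bits         256 bits
      192 bits     7680 bits         384 bits
      256 bits     15360 bits        512 bits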

    • 128 is fine for symmetric cryptography. Well, fine as long as you are confident that quantum computers that can break crypto aren't going to happen.
      We have been moving most symmetric crypto to 256 so that it is resistant to hypothetical quantum computers.

      1024, 2048, 4096 are key sizes for asymmetric cryptography like RSA. You don't need keys that big for equivalent security with elliptic curve crypto. But with ECC you should stick with safe curves like Ed25519 (technically an Edwards curve, not an elliptic

    • Because symmetric and asymmetric cryptography have different key length requirements: for the former, 128-bit keys provide an adequate safety margin.
    • by eric76 ( 679787 )
      If you are thinking about RSA 4096-bit or higher encryption, it is an error to compare the 4096 bits of RSA with 128-bit encryption. I think that 4096-bit RSA is essentially 128-bit encryption. As I understand it, what 4096-bit RSA does do for you is make determining the private key from the public key far more difficult than with a 2048-bit RSA key. So it is probably worthwhile to use 4096-bit RSA for that purpose. For my own use, I generally use the ED25519 (Elliptic Curve 25519) keys. The keys
    • When you see 128/256 think symmetric, when you see 1024/2048/4096 think asymmetric.

      Keeping it simple, when we're talking about something like https, you use both types. The asymmetric encryption protects a symmetric key that has to be exchanged, then your data is encrypted with the symmetric key algorithm.

      So there's some fuzzy math to decide that a 2048-bit whatever algorithm is equivalent to a so-many-bit symmetric algorithm, and you want the asymmetric one to be as strong as or stronger than the symmetric o

  • He says that NIST multiplied two numbers together when it would have been more correct to add them

    It's been decades since I learned additions and multiplications and I'm still on the fence myself...

    says Dustin Moody at NIST. "It's a question for which there isn't scientific certainty and intelligent people can have different views."

    That sure sounds like hogwash: cryptography is math. How isn't there scientific certainty and how do people have different opinions in math? It's either correct or it isn't.

    • "It's a question for which there isn't scientific certainty and intelligent people can have different views. We respect Dan's opinion, but don't agree with what he says."

      I believe that's governmentspeak for 'you're correct, but we don't care because we have competing motivations.'

    • Well, there's reasonable doubt because you don't have an existing real-world quantum computer of any serious capability to test it against, and can't judge how fast quantum capability will advance once it becomes generally available.

      • We have some ideas of what the theoretical speedups would be - independent of hardware, Grover's algorithm only gives a quadratic speedup....
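        Spelled out, the usual back-of-the-envelope figures for that quadratic speedup (idealized, ignoring the very large constant and memory overheads of actually running Grover at scale):

        ```latex
        % Grover's search reduces an n-bit exhaustive key search from 2^n to about 2^{n/2} iterations:
        2^{128} \rightarrow \approx 2^{64} \ \text{(AES-128)}, \qquad
        2^{256} \rightarrow \approx 2^{128} \ \text{(AES-256)}
        ```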

    • by Wrath0fb0b ( 302444 ) on Saturday October 14, 2023 @12:29AM (#63924171)

      The question is about calculating the strength, which is (log of) the number of operations required to break it (modulo some constants). For symmetric ciphers this is just key size since it requires trying every possible key. For asymmetric algorithms, however, it's usually about the number of operations to compute the private key for a given public key. So RSA using a 2048b key has a strength of 112 because you have to do approx 2^112 operations (using GNFS) to factor the public key.

      This implies that to compute the strength of an asymmetric algorithm, you need to know the best possible factorization method. If you didn't know about GNFS [umd.edu], you would think RSA2048 was stronger than it really is. And in the (unlikely)

      In this case, the disagreement boils down to whether Grover's algorithm [wikipedia.org] is the best possible quantum method, which implies quadratic complexity (multiply two numbers), or whether a linear search will be possible (add two numbers). As no such quantum computers even exist, let alone has there been enough time to believe we've found the best possible method to run on them, this is hardly a settled "scientific question".
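      For the curious, the GNFS running time mentioned above is usually quoted in L-notation; plugging a 2048-bit modulus into it is roughly where the 2^112 figure comes from:

      ```latex
      % Heuristic complexity of the general number field sieve for factoring n:
      L_n\!\left[\tfrac{1}{3},\, c\right] = \exp\!\Big( \big(c + o(1)\big)\, (\ln n)^{1/3} (\ln \ln n)^{2/3} \Big),
      \qquad c = \left(\tfrac{64}{9}\right)^{1/3} \approx 1.923
      ```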

    • That sure sounds like hogwash: cryptography is math. How isn't there scientific certainty and how do people have different opinions in math? It's either correct or it isn't.

      If you have the power to always tell whether math is correct or not, then maybe you can enlighten us all on a longstanding question:

      Does P == NP?

      • by Entrope ( 68843 )

        Only when N=1, P=0, or N is finite and P is infinite! Man, this is easy. Or I'm so smart.

        (Yes, I know [wikipedia.org].)

  • Dammit! (Score:4, Funny)

    by Tablizer ( 95088 ) on Friday October 13, 2023 @10:37PM (#63924031) Journal

    ...the CIA should look into this! ... oh, wait

  • history repeating? (Score:5, Insightful)

    by zeiche ( 81782 ) on Friday October 13, 2023 @10:57PM (#63924057)

    trying to remember if the US government ever purposefully weakened an encryption key in the past. hmm.

    • by steveb3210 ( 962811 ) on Friday October 13, 2023 @11:46PM (#63924125)

      They've also strengthened schemes in the past: they made changes to the S-boxes in DES that people thought for years might have been funny business, but it turns out they fixed issues with techniques that were not yet widely known publicly.

    • by AmiMoJo ( 196126 )

      It's very likely that they recommended the Dual EC DRBG random number generator because they knew it was weak. NIST and RSA adopted it as a standard, but have since deprecated it.

      These days the assumption is that any schemes produced by government controlled agencies are suspect.

  • by sxpert ( 139117 ) on Friday October 13, 2023 @11:36PM (#63924105)

    and they're lying again...

  • by MacMann ( 7518492 ) on Friday October 13, 2023 @11:46PM (#63924127)

    Even with infinite computing power the one time pad will remain secure. I've had people try to dispute this with me before so I'll make an analogy to point out how it cannot be broken. Imagine a recording of a cocktail party, one such that every voice of every conversation isn't any louder or more prominent than any other. Now, with that recording consider overlaying the voice that carries the intended message. Without any means to discern which voice carries the intended message there's no telling what is trying to be conveyed, it's all just noise. But to the two parties that have an identical recording, one that is sending the intended message and the other receiving, the intended voice can be extracted easily by subtracting out all the other voices.

    The key to the one time pad is that it is used only one time. Keep using the same recording of a cocktail party and eventually the party listening in will have enough of the same noise on the transmission to build a pattern, and once they have that key then every transmission that used the same background noise to cover up what was conveyed can have the desired information extracted. Keep the background noise sufficiently unique and there's no means by which to create a useful pattern.

    With every encryption system we use today the pattern will eventually repeat. I recall something about a cellular or WiFi system where it took something like two weeks to repeat, meaning that by listening in continuously for that long, all past and future communications were no longer secure. In real-world conditions the time needed might be longer due to noise or something in the system, or shorter because enough of a pattern was built to make educated guesses without a complete key. Of course, the longer people listen, the more time they have to not only gather data but also verify whether their early educated guesses were correct. With a long enough key and a strong enough algorithm, the time needed to build a pattern becomes such that anything extracted is no longer useful, or so long that it is mathematically improbable for the conversation to continue long enough for the pattern to repeat. If the pattern doesn't repeat, then that could be mathematically no different than a one-time pad: a random enough sequence that it becomes unbreakable.

    I find it improbable that there will be some computer or algorithm powerful enough to break a well-conceived encryption system. At a minimum the one-time pad is not breakable, and as much as people might argue otherwise, there's plenty to prove it cannot be broken. The problem with the one-time pad is creating truly random noise to hide the intended message, and finding the means to convey these one-time pads in a secure manner. The pad for the message would be as large as the message, meaning if someone wants to secure one terabyte of data then the pad is also one terabyte. That kind of bandwidth requirement creates all kinds of problems, which explains why we don't see the one-time pad used everywhere in spite of it being unbreakable.
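    The mechanics really are that simple; here is a minimal Python sketch of a one-time pad (the message and pad names are illustrative), where the pad is exactly as long as the message, truly random, and never reused:

    ```python
    # Minimal one-time pad sketch: the pad is truly random, exactly as long as
    # the message, shared in advance by both parties, and never reused.
    import os

    message = b"attack at dawn"
    pad = os.urandom(len(message))  # must come from a true randomness source

    ciphertext = bytes(m ^ p for m, p in zip(message, pad))     # encrypt
    recovered = bytes(c ^ p for c, p in zip(ciphertext, pad))   # decrypt
    assert recovered == message
    ```

    Reusing the pad is exactly the failure mode described above: an eavesdropper can XOR two ciphertexts together and start recovering structure.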

    • by AmiMoJo ( 196126 )

      The problem with a one time pad is the need for key distribution and storage. If you want to encrypt 1GB of data, you need 1GB of one time pad as well. As you say, if you use the pad more than once, it is weakened to the point of being largely useless.

      That's why most systems use a very computationally expensive encryption scheme, often public key, to protect a much more efficient symmetric key. The problem with schemes like that is that there are multiple ways in which the symmetric key can be recovered. The public key crypto used for key exchange could be flawed, the symmetric key might be recoverable from either end of the system, or the random number generator used to produce any of the keys might be flawed.

      • The problem with a one time pad is the need for key distribution and storage. If you want to encrypt 1GB of data, you need 1GB of one time pad as well. As you say, if you use the pad more than once, it is weakened to the point of being largely useless.

        There are probably a number of applications where this would be just fine. A pair of identical SD cards filled once could easily offer a lifetime of text and voice communications between any two parties.

        That's why most systems use a very computationally expensive encryption scheme, often public key, to protect a much more efficient symmetric key. The problem with schemes like that is that there are multiple ways in which the symmetric key can be recovered. The public key crypto used for key exchange could be flawed, the symmetric key might be recoverable from either end of the system, or the random number generator used to produce any of the keys might be flawed.

        But there is no better system, unless you can somehow pre-share your one time pad securely.

        If you are going to go through the trouble the only solution is to initially establish pads in person or distribute them via a mutually trusted party.

        Even that has downsides, because if the pad is compromised so are all past and future communications that use it, whereas with per-session keys and perfect forward secrecy the damage is much more limited.

        Past communications are protected by simply deleting portions of the pad as it is used. As far as future communications if one or both parties systems are co

  • We should expect to eventually say "bye" to the days of public/private keys.
    People requiring the utmost secrecy will need to meet ahead of time and set up a lengthy one-time pad for future communications.
    This could be extended somewhat - though I have never seen it done - for trusted friends.

    • by jd ( 1658 )

      Why would they need to share a one time pad? There are plenty of random interstellar phenomena that could be used as a PRNG. In that case, you only need to share the target identifier and sample time. The encryption and decryption would involve simultaneous measurement and immediate application of the random number.

      This is better than sharing a pad, because a pad can be stolen after the event and used to decrypt the message. You can't do this with a pad from an interstellar source because it's never recorde

  • They intend to strong-arm industry into switching to PQC despite the total lack of affirmative evidence to justify any such change.

  • OTP works for small messages but doesn't scale. It's a utopian mirage. Just develop reliable PQC.
    • ^ The way to develop QC-hard (but also mem-, CPU-, GPU-, ASIC- and FPGA-hard) algorithms is to consider the construction costs of optimal brute-force implementations. QC is no more powerful in capabilities than a Turing machine; it's just faster at certain calculations. There are published recommendations for QC-hard algorithm instance choices.
  • After Dual_EC_DRBG, the taint of NSA makes anything that comes out of NIST suspect. NIST should therefore be marginalized and not given legitimacy.
    • If DJB doesn't like it and they're intentionally dismissing professional requests for transparency, then they have something to hide.

      No one should use any cryptography products of NIST even if there are no NSA backdoors because the process is flawed. Therefore, anyone who deploys algorithms based on closed, arbitrary, and unvetted magic would be foolish.

  • by chx496 ( 6973044 ) on Saturday October 14, 2023 @05:12AM (#63924401)

    What I have been missing from all the articles about this topic is an actual explanation of what the primary technical criticism is all about. So I've skimmed djb's blog post about the issue [cr.yp.to], and what he's arguing about the complexity is the following:

    According to djb, the analysis of Kyber512 done by NIST argues the following:

    - It will likely take 2^95 iterations for an attack on the algorithm to succeed.

    - There are 2^25 bit operations (calculations) required per iteration.

    - There are 2^35 memory accesses required per iteration.

    - Hence there are 2^(95 + 25 + 35) = 2^155 operations required to attack the algorithm. (According to NIST.)

    djb points out that this is very misleading, since the total amount of time a single iteration takes is not 2^25 * 2^35 (= 2^60), but instead something like 2^25 + 2^35 (which is just a little more than 2^35), so you get a total complexity of 2^95 * 2^35 = 2^(95 + 35) = 2^130. (And NIST itself is targeting 2^140 operations for its new standard.)
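    Written out with the numbers from the bullet points above, the two readings differ like this:

    ```latex
    % Per-iteration costs multiplied (what djb says NIST's figure amounts to):
    2^{95} \cdot 2^{25} \cdot 2^{35} = 2^{155}
    % Per-iteration costs added (djb's reading), dominated by the memory-access term:
    2^{95} \cdot \left(2^{25} + 2^{35}\right) \approx 2^{95} \cdot 2^{35} = 2^{130}
    ```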

    I haven't looked at the original NIST document analyzing Kyber512 to see if djb's claim about what they're arguing is indeed an accurate representation, but if djb isn't misunderstanding and/or misrepresenting the analysis in the original NIST document (i.e. the bullet points I provided here are indeed what NIST is using to calculate the attack complexity), then this is a huge blunder (and one has to wonder whether it is intentional), because djb is 100% correct that this is a mistake nobody with even just an undergrad degree in CS should make, let alone somebody whose job it is to analyze crypto algorithms.

    And while I have not read the original analysis by NIST, I tend to believe djb here, because if djb had simply misunderstood the NIST analysis and the bullet points above are not what the analysis is using to estimate attack complexity, then the person at NIST responding to this could easily refute it, instead of offering some BS such as "It's a question for which there isn't scientific certainty and intelligent people can have different views."

    Sure, there are certainly areas where intelligent people can reasonably disagree (for example, how high the threshold for security should be set for the future), but in this case? NIST's analysis is either correct when it comes to the possible attack complexity, or it isn't, and that shouldn't be a matter of debate.

    • There are two conflicting ideas here - that you can trust any encryption the NSA is involved in when it's their job to leave themselves a backdoor, and that those same people would choose such an easily-discovered way to try and hide whatever weakness they'd inevitably introduced.

      I'm inclined to believe the weakened encryption was deliberate, and the dumb cover story was a mistake.

      Overall it's pretty stupid anyway, though... because there is no encryption that is 'easy for us to crack, difficult for them to

  • The NIST argument, from one of their posts on the mailing list: "the supposed error only concerns our interpretation of one of several possible methods we cited from published literature for estimating memory access costs in lattice attacks. ... We emphasize that we are not claiming that 2^191 bit operation equivalents (or any similar number derived from adjusting NTRUprime's analysis to include subexponential factors) is an accurate estimate of the security of Kyber512. As noted earlier in this email, we only

    • by jd ( 1658 )

      Bernstein has asked two short questions on the list asking for clarification about NIST's claims. And he does NOT sound like a happy camper.

      As with the SHA3 mailing list, I'm seeing a lot of heated debate, flaring tempers, and ambiguous claims. At this point, I'm of the opinion that it doesn't matter who is right, it matters far more that researchers get counseling and blood pressure meds.

      I'm glad I turned down cryptography as a final year module if that is what maths courses do to you.

  • Cisco persists in hard-coding in the NSA's backdoors.
  • Kyber specifications: At a complexity of 161 bits, the secret keys are 2400 bytes, the public keys 1184 bytes, and the ciphertexts 1088 bytes in size. Kyber1024 has a secret key size of 3168 bytes, a public key size of 1568 bytes, and a ciphertext size of 1568 bytes. Shared secret size is 32 bytes for all forms.

    This is not small. If I'm reading this correctly, the key sizes are 4x larger than those used in typical RSA deployments. Yes, Kyber and RSA are only used for key exchange, almost nobody uses public

  • A CIA and BND front company provided encryption gear to many countries after WWII that was completely backdoored. Not much changes in the spy game.
  • Maybe never. But the backdoors placed in supposedly "quantum safe" encryption are real. Suddenly it makes a lot of sense why people who should know better keep predicting Quantum Computers for the nearer future.

    • by jd ( 1658 )

      China recently announced a breakthrough in quantum computing. Sure, it wasn't a massive amount, but it supposedly was performing computations that would take billions of years on the fastest regular supercomputer.

      I'm going to say serious quantum computing won't be around for a while, maybe 80 years before cryptographic algorithms are at any real risk, but that's a timeframe that's short enough for pre-pqc intercepted messages to still have a lot of value.

      (I'm basing the timeline on my observation tha

      • China recently announced a breakthrough in quantum computing.

        What else is new?

        I'm going to say serious quantum computing won't be around for a while, maybe 80 years before cryptographic algorithms are at any real risk, but that's a timeframe that's short enough for pre-pqc intercepted messages to still have a lot of value.

        I'm basing the timeline on my observation that all technology develops along an exponential curve

        Technology tends to develop along a logistic curve which most certainly does NOT exponentially increase forever.

        Quantum computing would seem, on the face of it and provided we're given accurate data, to be evolving on a curve that's taking twice as long to reach comparable milestones as classical computing. It's then just a matter of figuring out how long it took classical computers to pose a serious threat to regular cryptography. Double that time, minus the time quantum computing has already taken, should be when quantum computers become a serious threat.

        Why are the things you are comparing at all comparable?

      • by gweihir ( 88907 )

        Sorry, but I have been following this stuff for 35 years now. It is going nowhere. After _this_ long there are really only two possibilities left: 1. It cannot work and 2. it requires some major fundamental breakthrough that has not happened yet.

    • I would be OK with a scheme where to break it, you have to defeat both RSA and Kyber. I'm not a crypto expert, but wouldn't that be a good strategy both to appease those suspicious of rushing a post-quantum algorithm (your view) and those who think that there's at least a chance that the secrets my lawyer communicates that currently rely on RSA might still be sensitive at a time far in the future, when a reliable quantum computer might be feasible (my view)?
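      The combining step of such a belt-and-braces scheme is easy to sketch: derive the session key from both shared secrets, so an attacker must break both exchanges. In the Python sketch below, ss_classical and ss_pqc stand in for secrets from, say, an ECDH or RSA exchange and a Kyber-style KEM; the function name and info label are hypothetical, and this only illustrates the idea, not a vetted construction:

      ```python
      # Sketch of a hybrid key combiner: the session key is derived from BOTH a
      # classical shared secret and a post-quantum KEM shared secret, so breaking
      # either one alone is not enough. Function name and labels are hypothetical.
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      def combine_shared_secrets(ss_classical: bytes, ss_pqc: bytes) -> bytes:
          """Derive a 256-bit session key from two independent shared secrets."""
          return HKDF(
              algorithm=hashes.SHA256(),
              length=32,
              salt=None,
              info=b"hybrid-kem-example",
          ).derive(ss_classical + ss_pqc)
      ```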

  • https://blog.cr.yp.to/20231003... [cr.yp.to]

    I read that, then took a look at the spec and NIST documents and it looks like DJB is correct.
    I have some sway on what algorithms go into the products made by my employer. Kyber is now off the list. It cannot be trusted.

  • Bernstein seems to be correct that NIST did something dumb in calculating the time needed to break this algorithm. Basically, they said each iteration requires this expensive giant array access which takes about the time needed for 2^35 bit operations and that each iteration requires 2^25 bit operations. However, rather than adding the cost of the memory access to the cost of the bit operations in each iteration they multiplied them. That's bad [1].

    But then Bernstein has to imply that this isn't just you

    • by jd ( 1658 )

      I'm on the mailing list and have been following the discussion.

      NIST's argument seems to hinge on the fact that there are certain costs (such as memory) that are best calculated by multiplying even if there are other costs that are best calculated by adding, because no resource is free.

      Bernstein's counter arguments, at least at the moment, seem to revolve around the fact that NIST is claiming to be quoting the developers on some of this but haven't given any citations despite being repeatedly asked.

      My impres

  • For my e-mail, I have both RSA4096 and ED25519 PGP keys that are published on keys.openpgp.org. One thing that bothers me, though, is that for every '+' style alias, we need a different key. For example, 123456@example.com and 123456+ 3.14159265358979323846264338@example.com need different keys. This wouldn't be much of an issue if it were not for the fact that I use the '+' aliases extensively in order to make it easier to filter my e-mail into the proper folder on arrival. While few people send me encr
  • They already did this crap with current standards. They got caught with their hand in the cookie jar when they pushed a known weak random number generator to be used with elliptic curve crypto. Still unaddressed, they chose an elliptic curve algo that depends on strong random number generation to prevent deriving the secret key over one that doesn't depend on the strength of the randomness at all.
