
Researchers Build True Random Number Generator From Carbon Nanotubes (ieee.org) 144

Wave723 writes: IEEE Spectrum reports on a true random number generator that was created with single-walled semiconducting carbon nanotubes. Researchers at Northwestern University printed a SRAM cell with special nanotube ink, and used it to generate random bits based on thermal noise. This method could be used to improve the security of flexible or printed electronics. From the report: "Once Mark Hersam, an expert in nanomaterials at Northwestern University, and his team had printed their SRAM cell, they needed to actually generate a string of random bits with it. To do this, they exploited a pair of inverters found in every SRAM cell. During normal functioning, the job of an inverter is to flip any input it is given to be the opposite, so from 0 to 1, or from 1 to 0. Typically, two inverters are lined up so the results of the first inverter are fed into the second. So, if the first inverter flips a 0 into a 1, the second inverter would take that result and flip it back into a 0. To manipulate this process, Hersam's group shut off power to the inverters and applied external voltages to force the inverters to both record 1s. Then, as soon as the SRAM cell was powered again and the external voltages were turned off, one inverter randomly switched its digit to be opposite its twin again. 'In other words, we put [the inverter] in a state where it's going to want to flip to either a 1 or 0,' Hersam says. Under these conditions, Hersam's group had no control over the actual nature of this switch, such as which inverter would flip, and whether that inverter would represent a 1 or a 0 when it did. Those factors hinged on a phenomenon thought to be truly random -- fluctuations in thermal noise, which is a type of atomic jitter intrinsic to circuits." Hersam and his team recently described their work in the journal Nano Letters.
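As a rough illustration of the mechanism described above, here is a toy Python sketch of a cross-coupled inverter pair released from a forced state, with simulated thermal noise deciding which way the cell falls. The transfer curve, gain, and noise level are arbitrary assumptions for illustration, not the authors' model.

    import math, random

    def inverter(v, gain=10.0):
        # idealized inverter transfer curve: input near 1 -> output near 0
        return 1.0 / (1.0 + math.exp(gain * (v - 0.5)))

    def sram_random_bit(steps=500, rate=0.2, noise=0.01):
        a = b = 1.0                      # both inverters forced to logic 1
        for _ in range(steps):           # power restored: noise breaks the tie
            a, b = (a + rate * (inverter(b) - a) + random.gauss(0, noise),
                    b + rate * (inverter(a) - b) + random.gauss(0, noise))
        return 1 if a > b else 0

    print("".join(str(sram_random_bit()) for _ in range(64)))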
This discussion has been archived. No new comments can be posted.


Comments Filter:
    • 42

      // chosen by fair dice roll. guaranteed to be random.

    • I am not sure of the benefit or purpose of pure random generation. Our concept of data encryption relies on the fact that we can get the same random numbers, given the correct key. Now we can improve the randomness of these numbers so the next value will be less predictable, but it will still need to be reproducible so the system can decrypt the data.

      Other uses of random data, for the most part, seem random enough, and most of the problems seem to just be from poor implementations of existing random number generators.

      • Re: (Score:3, Informative)

        by pem ( 1013437 )
        A random number is for generating the key, not using it.
      • Re:Random Number (Score:5, Interesting)

        by rgbatduke ( 1231380 ) <.ude.ekud.yhp. .ta. .bgr.> on Thursday August 10, 2017 @08:25AM (#54982587) Homepage

        There are three meanings of the word "random" referring to a generator in this context:

        a) Unpredictable.
        b) Empirically satisfying all of the decorrelation properties of a random number sequence -- on average uniform in all bit patterns, on average lacking correlations at all lags (and hence non-periodic) and on all N-dimensional hyperplanes for all N, etc.
        c) Both.

        All that is asserted here is that they have a thermal noise generator that satisfies a). Big whoop -- thermal noise generators (and hardware generators in general) are commonplace: https://en.wikipedia.org/wiki/... [wikipedia.org]. However, thermal noise and so on are often "colored" or "biased" -- they produce fluctuations that are unpredictable but it is almost impossible to get the noise to produce a string of e.g. 0's and 1's that satisfy b). One then is stuck using the unpredictable noise to randomize a pseudorandom number generator (for example, by xor'ing the two together) that produces a bit string that has the right uniformity and decorrelation properties but does so from an internal state that, if known, makes the string produced predictable.
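
        A minimal sketch of that combining step, with a hypothetical biased-but-unpredictable hardware bit standing in for the noise source (biased_hw_bit is an assumption, not a real driver):

          import random
          import secrets

          _prng = random.Random(secrets.randbits(64))   # well-distributed but deterministic PRNG

          def biased_hw_bit(p_one=0.6):
              # stand-in for an unpredictable but biased hardware noise bit (assumption)
              return 1 if random.random() < p_one else 0

          def combined_bit():
              # XOR keeps the unpredictability of the noise bit while the PRNG
              # supplies the uniformity and decorrelation the raw noise lacks
              return biased_hw_bit() ^ _prng.getrandbits(1)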

        AGAIN this sort of thing is pretty commonplace. Sources of "entropy" as in unpredictable activity are common enough and so are high quality pseudorandom number generators. The major problem then is rate. Few of the hardware generators can produce entropy FAST ENOUGH to keep up with a PRNG, so getting a source of "true random numbers" that is fast enough to use in e.g. Monte Carlo is not easy, and most people don't bother to try. Having a handful of unpredictable numbers suffices for e.g. encryption and that's really where this is headed.

        I would wax poetic on the fact that EVEN thermal noise is probably not truly random; it is random the way a coin flip is random or, for that matter, the way a PRNG is random. The outcome of a coin flip is unpredictable only because we don't flip the coin with a precise knowledge of its state and the state of the flip environment, and because we perhaps cannot integrate its equations of motion precisely enough from what knowledge we do have, but it is deterministic, hence not really unpredictable. Classical thermal noise is no different than a bunch of flipping coins bouncing around -- again deterministic but with lots of unknown state information.

        "True" random is a term that should probably be provisionally reserved to AT BEST quantum "coin flips", although in the master equation approach to resolving the state of a quantum "coin", the true origin of randomness is seen STILL to be the process of taking the trace over the surrounding environment, which is if you like the filter resolving the flip. That trace introduces "entropy" in the form of lost phase information and averaging over energy distributions that appears as unpredictability in the outcome, but if one views the "coin" AND the surroundings as a single quantum system, its quantum trajectory is again deterministic.

        Randomness in quantum filtering experiments comes from the fact that the measuring apparatus that does the filtering must resolve it in a classical way, with its quantum entanglement and phases in general unknown and averaged over.

        If one buys the holographic model in string theory (or plain old quantum theory as it is currently structured) the Universe is in a zero entropy state and there are no sources of "real" entropy. In this case there can BE no "true" random number generators. Whether or not nature is capable of generating true random numbers from some source other than our ignorance of state is an open empirical question.

        • by Myrdos ( 5031049 )

          In this case there can BE no "true" random number generators. Whether or not nature is capable of generating true random numbers from some source other than our ignorance of state is an open empirical question.

          If we can prove there's no way to know the entire state, say via Heisenberg uncertainty principle, then there is no functional difference between 'true' randomness and randomness-through-ignorance. I'm not sure if the question has any meaning.

          • There's a difference, but you can't tell.

            The difference becomes relevant in 'quantum secure entropy extractors', which are designed to be secure against 'maximally entangled adversaries', i.e. something that does know the state. With nondeterminism in the universe, quantum secure extractors can be built; in a deterministic universe, they cannot.

        • Random has one true definition: Without cause.

          As far as we know, it's impossible for true randomness to exist. Our Universe runs on causality. There are things we don't yet fully understand, and things that appear to operate in an unpredictable way, but there's no actual evidence that they do so in violation of causality, no evidence of the Universe being a simulation, no evidence of there being infinitely many Universes, etc.

          If something has a cause, then it can be predicted and patterns can be identified. It is not random.

          • If something has a cause, then it can be predicted and patterns can be identified. It is not random.

            Except that there are phenomena that cannot be predicted. Even ones we know are not really random. Then there are quantum phenomena that MAY be really random. You are making a religious statement when you assert Universal causality as a definite truth instead of a conditional probability. And lack of evidence is not evidence of lack -- the best you can say is that it might make something less probable wit

        • Heck, we used to use common Zener diodes to generate noise. Hardly new.

        • You... I like you.

          Throughout history, we have thought many things were random. In some cases, we decided that those acts had been done by gods. I am not sure random even exists. Traditionally, it has just meant poorly understood, even though we worded it differently. Bell's Theorem can get fucked.

          • Well, not "get fucked", surely...

            Ultimately it is an empirical question but it is an unusual one. The problem with looking only at locality in a single direction of time flow is that the underlying microdynamic propagators are (without exception as far as I know) reversible in time. My favorite example of this is in classical electrodynamics, where we CHOOSE to use retarded propagators, but where one can equally well formulate things in terms of advanced propagators and where Dirac did an amazing job of d

  • by TeknoHog ( 164938 ) on Thursday August 10, 2017 @05:11AM (#54981853) Homepage Journal
    Making an RNG from inverters is an old trick (shameless plug [github.com]). So if there's any news here, it's making an inverter from nanotubes?
    • Yeah. Back in the old days (the 80s) we used an oscillator that had a REALLY cheap capacitor at its heart. Its free-running output fed a bitstream into a shift register.
    • by Anonymous Coward

      Netflix using nanotubes...

      Texting using nanotubes...

      Commuting to work using nanotubes...

      Vaping using nanotubes...

      A guitar amplifier using nanotubes...

      Encryption using nanotubes...

      Posting anonymously on Slashdot using nanotubes...

    • by skids ( 119237 )

      Well, they might exhibit less/different interaction with environmental factors so there's less opportunity for a side-channel attack to make them spit out predictably, or be able to generate more bits faster, or just be compatible with the rest of a chip made mostly out of nanotubes. But yeah, until I RTFA, I suspect it's just "BECAUSE... NANOTUBES!"

  • Summary fail (Score:5, Informative)

    by Wdi ( 142463 ) on Thursday August 10, 2017 @05:19AM (#54981867)

    The random generator passed only 9 of 15 standard NIST randomness tests. Not surprising - it is unlikely that the two inverter branches are identical down to the atomic level, and that is a prerequisite for the thermal noise to have an exactly equal chance of flipping either branch.

    • Re:Summary fail (Score:5, Informative)

      by TeknoHog ( 164938 ) on Thursday August 10, 2017 @06:00AM (#54981959) Homepage Journal

      The random generator passed only 9 of 15 standard NIST randomness tests. Not surprising - it is unlikely that the two inverter branches are identical down to the atomic level, and that is a prerequisite for the thermal noise to have an exactly equal chance of flipping either branch.

      The NIST tests aren't necessarily that great for judging randomness. For example, overly long streaks of ones or zeros will fail the test, even though they are possible in genuinely random sources. I imagine one could devise an algorithmic, repeating stream of numbers that passes the NIST tests.

      The issue of unequal chance for 0 and 1 is common in HWRNGs, and there are simple solutions for debiasing the output. https://en.wikipedia.org/wiki/... [wikipedia.org]
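
      For reference, the simplest of those debiasing tricks is von Neumann's; a minimal Python sketch follows, at the cost of discarding a large fraction of the input bits:

        def von_neumann_debias(bits):
            # classic debiaser: take non-overlapping pairs, emit 0 for (0,1),
            # 1 for (1,0), and discard (0,0) and (1,1)
            it = iter(bits)
            return [a for a, b in zip(it, it) if a != b]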

      • by Togden ( 4914473 ) on Thursday August 10, 2017 @06:10AM (#54981993)
        If it were truly random, surely it would pass the tests at random, and so really we should be checking that it passes different tests each time, except on occasion when it doesn't.
      • by hord ( 5016115 )

        In a system where you have individual components that contribute to an overall distribution, the way you analyze it is through statistics. The reason NIST rejects long strings of 0s and 1s is that in any system where these two values actually do flip on a regular basis, the probability of getting long strings of a single digit rapidly drops toward 0%. A "real" random number generator has to have cycles in it even if you can't predict them, because the way in which we analyze and use them presumes the cyc

      • by Anonymous Coward

        There are several techniques for reducing bias and correlation, often called "whitening" algorithms

        I feel offended but I'm not sure why. Hang on, let me go ask tumblr.

      • Unequal number of 0's and 1's is easily solved by dividing by 2.

        I believe the use of thermal noise and dividing by 2 was well known during WW2. (Where "well known" means by people whose job required them to know, and who were unlikely to be killed for knowing too much).

        I learned about it in the early 60's - it was claimed to be a good use for Zener diodes when they were invented - as the thermionic diodes previously used for this application made a lot of heat and kept dying. You still needed thermionic v

    • Was the output processed for bias? That might explain it. Creating a true RNG is trivial; simply reverse-biasing a PN junction in a transistor will create a good source of avalanche noise which can be converted into a bitstream. It has to be processed to account for bias though, like with the Von Neumann algorithm: https://mathoverflow.net/quest... [mathoverflow.net]

    • You can fix this by generating enough random numbers, applying an appropriate hash function, and using the result as the seed for a good cryptographically secure pseudorandom generator.
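
      A minimal sketch of that approach in Python; the counter-mode generator here is a toy stand-in for a vetted DRBG such as HMAC-DRBG from NIST SP 800-90A:

        import hashlib

        def extract_seed(raw_noise: bytes) -> bytes:
            # condense many partially-random bytes into a 256-bit seed
            return hashlib.sha256(raw_noise).digest()

        def toy_drbg(seed: bytes, nblocks: int):
            # hash-in-counter-mode output blocks derived from the extracted seed
            for i in range(nblocks):
                yield hashlib.sha256(seed + i.to_bytes(8, "big")).digest()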

    • by AmiMoJo ( 196126 )

      The only thing interesting seems to be that they printed the circuit, thus making it suitable for use on flexible electronics. Hard to imagine an application where you would need a good RNG on a flexible circuit, but maybe one exists.

      As it happens generating random numbers that pass the NIST tests isn't particularly difficult. Here's some code I wrote that passes all their tests, as well as Diehard and a few others I found: https://github.com/kuro68k/xrn... [github.com]

      • Will that code pass the Dieharder tests [duke.edu]? That was not meant to be snarky, just something to consider.
        • by AmiMoJo ( 196126 )

          I'll give Dieharder a try when I get time. I added the results from the NIST tests to the repo in the mean time.

          If it passes NIST and Diehard I'd expect it to pass Dieharder, but it's worth checking.

          • I'll give Dieharder a try when I get time. I added the results from the NIST tests to the repo in the mean time.

            If it passes NIST and Diehard I'd expect it to pass Dieharder, but it's worth checking.

            Try it sometime. It will fail Dieharder. Not because it's bad, but because perfect data will fail. The output from dieharder -l tells you that the OPSO, OQSO, DNA and SUMS tests are suspect or bad and indeed they fail often over good data. Also the default confidence limits mean you are likely to hit a suspect or fail regardless of the quality of the data.

            The updated SP800-22rev1a tests are ok, but not if you use the NIST STS-2.1.2 software. The coefficients for the overlapping template matching test are si

            • by AmiMoJo ( 196126 )

              You were right: https://pastebin.com/PnzK3Pvx [pastebin.com]

              Looks like I'm going to need to generate a lot more numbers to really get much out of that test. I'll grab the hardware and give it a try, as well as the other software tools you linked.

            • by AmiMoJo ( 196126 )
              Okay.

              Symbol Size(bits) = 1
              Shannon IID Entropy = 1.000000 bits per symbol
              Optimal compression would compress by 0.000000 percent
              Chi square: symbol count=1677721593, distribution=0.92, randomly exceeds 33.86 percent of the time
              Mean = 0.499988
              Monte Carlo value for Pi is 3.141438 (error 0.00 percent).
              Serial Correlation = 0.000056

              Your python code allocated 6GB of RAM, ran for a bit and was killed. I tried it with a smaller
              • User testing! Yay!

                "ImportError: No module named scipy.special" is python's way of saying please install scipy. You can do that. I'll throw in my own implementation of the incomplete gamma function to break that library dependence when I work out the equation.

                I can't help with your lack of RAM, but how large were the files? Some of the NIST tests increase their memory usage with data size, but NIST specify data sizes to be used with the tests in SP800-22Rev1a.

                My problems with the NIST STS-2.1.2 code are:
                A) O

              • I've just pushed changes to remove the dependency on scipy. Feel free to pull them.

                Implementing the gamma functions is how I prefer to spend my weekends.

                • by AmiMoJo ( 196126 )

                  I'll give it a try, and post any issues on Github. Thanks.

                    • Note that the NIST spec says to test with 1 mebibit (128 KiB) of data. The tests run in reasonable time with data that size.

                    I'm working on some better tests that are more reliable and computationally more bounded. We shall see if I succeed.

                    • by AmiMoJo ( 196126 )

                      I need to port some code to libusb so I can pipe the data into Dieharder anyway. I tried creating a 2GB file and it still rewound it hundreds of times, and it actually failed tests that the smaller 100MB files passed. Something needs further investigation here.

                      The NIST code seems to cope with 100MB okay, takes several minutes to run. Ent seems to have some issues with files that size but ultimately does produce some useful output. I'm tempted to do some work on Ent to support larger files and some more test

                    • Dieharder needs on the order of a TB of data to avoid rewinding. BigCrush is worse.

                      >it would be great if users could just buy one off-the-shelf and get a reliable TRNG with minimal effort and cost.
                      Someone should ask ARM why they don't provide one in their CPUs.

                    • by AmiMoJo ( 196126 )

                      Some ARM variants do have hardware RNG, although I think it's a manufacturer extension rather than a part of the ABI. I was actually thinking of going that route with an Atmel ARM that has USB 2.0 high speed and a hardware RNG, which they claim "passes" NIST tests and Diehard.

                      The built-in RNG is a little slow though, so I'd be using the same techniques I am on XMEGA to generate more entropy, which is basically to use two ADCs to measure thermal noise in the on-board temperature sensor and discard all but th

        • Probably not. Dieharder implements distinguishability tests for PRNG algorithms. For an entropy source you want min-entropy estimation algorithms.

          The primary source of these algorithms is the SP800-90B draft spec. Unfortunately they don't work well. I submitted public comment to NIST detailing the failures of the tests against simple cases of biased and correlated data. https://github.com/dj-on-githu... [github.com].

    • 7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7,7

      For all you know, that's completely random.

      Your comment violated the "postercomment" compression filter. Try less whitespace and/or less repetition.

  • by Opportunist ( 166417 ) on Thursday August 10, 2017 @05:54AM (#54981939)

    The first test of a good random number generator is obviously whether it can generate a true random number under normal operating conditions. This they claim to have accomplished.

    The second test is just as critical and I'd be very interested in the result: Can any kind of manipulation be easily detected? Or is it possible to tamper with the device in such a way that it does generate a number predetermined by the manipulator without anyone else being able to determine that such manipulation took place?

  • by little1973 ( 467075 ) on Thursday August 10, 2017 @05:58AM (#54981953)

    but it may not be from the Universe's point of view.

    According to Gerard 't Hooft, the superdeterminism loophole cannot be dismissed.

    The Free-Will Postulate in Quantum Mechanics
    https://arxiv.org/abs/quant-ph... [arxiv.org]

    Entangled quantum states in a local deterministic theory
    https://arxiv.org/abs/0908.340... [arxiv.org]

    • by hord ( 5016115 )

      Random just means that you have inputs for which you can't measure their history. If we truly believe in a deterministic universe (as physics does), then there is no random. There is only entropy and your local evaluation of it.

  • OK, so, it's generating a series of truly random 0s and 1s. I don't have access to the article, but my question is whether this truly random number generator has been characterized as part of some stochastic process, like a binomial or Poisson process. Would appreciate some more insight on this.

    • I expect (given my job makes me quite well prepared to answer) that the output would be statistically non-stationary and so would not fit nicely into a binomial or Poisson distribution.

      The question that needs answering is "What is the min-entropy of the data from this source".

      Another question is "What is the distribution of quality of entropy across a large population of these devices". It seems likely that a large proportion would not work when built due to intrinsic device variation.

      The answers might be in the paper.

  • Creating hardware RNGs is pretty trivial with off-the-shelf electronics; people have been using reverse-biased PN junctions on transistors for this application since forever.

    • You can also use a $10 webcam, cover the lens, and turn the gain up until it starts producing noise. Compensate for outside interference by calculating the delta between two disjoint sets of random pixels.
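
      A rough sketch of that trick, assuming OpenCV (cv2), NumPy, and a camera on index 0; the gain setting and bit selection are illustrative only:

        import cv2
        import numpy as np

        cap = cv2.VideoCapture(0)                 # lens covered, gain turned up
        ok, frame = cap.read()
        cap.release()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16).ravel()
            idx = np.random.permutation(gray.size)
            half = gray.size // 2
            # delta of two disjoint random pixel sets cancels common-mode interference
            delta = gray[idx[:half]] - gray[idx[half:2 * half]]
            bits = (delta & 1).astype(np.uint8)   # keep only the least significant bit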

  • At the voltage-level you get, very roughly half of the noise is quantum and "true" random (which is just physics-speak for "we have no idea how it works"). Amplify, digitize, pipe into a randomness pool and you are done. Can be accomplished for $20 or so in parts.

    Or you can use a Zener Diode, and some RF amplifiers: https://www.maximintegrated.co... [maximintegrated.com]
    Spectrum here goes well over 100MHz.

  • Nice save (Score:5, Funny)

    by physicsphairy ( 720718 ) on Thursday August 10, 2017 @06:29AM (#54982063)

    Researcher 1: "Our nanotube project is outputting completely garbage data. I guess this means we can't publish."
    Researcher 2: "Or... can we?"

  • although some things are sufficiently unpredictable as to be "close enough". Thermal noise, which this method uses, usually falls into this category.

    I personally prefer algorithmic methods of generating random numbers. Sufficiently well-designed functions can perform well under randomness analysis while still offering you the option of fixed seeding for cases where you need a reproducible stream (mainly used for testing and cryptography).

    • by JBMcB ( 73720 )

      Are you talking about the difference between white noise (true randomness) and pink noise (evenly distributed randomness)?

      • v1 is talking about preferring a deterministic RNG vs. a partial-entropy source.

        A properly engineered RNG for crypto needs nondeterminism from an entropy source, and gets uniform, full-entropy data by running it through an entropy extractor. The 'algorithmic method' v1 mentions is a PRNG. It's optional and is used to increase performance by generating many outputs for each seed input.

      • by v1 ( 525388 )

        The only "perfect, unbreakable crypto" is the "one-time-pad", which requires both the sender and the receiver to have a truly (or sufficiently) random stream of numbers to use as a pad/xor. The limitations of this method are that (A) each pad can only be used once, (B) both parties need a sufficiently large amount of pad for their messages, (C) when they run out of pad, they have to get together somehow securely to exchange more large padding, and (D) pads are totally impractical to memorize.

        Seedable rando
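
        A minimal Python illustration of the pad/XOR step; the pad below comes from the OS purely for the demo, whereas a real one-time pad needs truly random, never-reused material:

          import secrets

          def otp_xor(message: bytes, pad: bytes) -> bytes:
              # encryption and decryption are the same XOR with the same (never reused) pad
              assert len(pad) >= len(message), "pad must be at least as long as the message"
              return bytes(m ^ k for m, k in zip(message, pad))

          pad = secrets.token_bytes(32)
          ciphertext = otp_xor(b"attack at dawn", pad)
          plaintext = otp_xor(ciphertext, pad)      # round-trips back to the message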

  • int rand(){ // I rolled a 6 sided die to get a random number.
          return 3;
    }

  • Just post something snarky and wait for it to get modded a mix of: funny, troll and informative ...

  • by TechyImmigrant ( 175943 ) on Thursday August 10, 2017 @11:05AM (#54983623) Homepage Journal

    When your full-time job is designing RNGs, reading articles on RNGs can be a little painful.

    The term "TRNG" (True Random Number Generator) is a poorly defined thing. I think people think it means 'ideal non deterministic' but it's never used in that context and in this case we certainly don't have such a thing.

    The thing they designed is an "entropy source". It produces partially entropic nondeterministic data.

    The chain of events in an RNG is:

    Entropy source --> Online Test --> Entropy Extractor --> (if needed for performance) a CS-PRNG (cryptographically secure pseudorandom number generator).

    Entropy source : Makes partially entropic data. It doesn't matter what kind of source it is, whether quantum, lava lamp, circuit or whatever else, you never get perfect entropy from a physical process. The entropy extractor distills this kind of data into a smaller amount of data that is close to full entropy. 'Close' is mathematically described in terms that matter in cryptography.

    Online Test: Continuously checks that the ES is working while it's running. Top tip: this is the hard bit in RNG design.

    So unless they can build (and online-test) an entropy extractor in carbon nanotubes, they don't have a complete solution, but they do have an entropy source. I don't know if they have done this or not, because the link in TFA doesn't work, despite my corporate IEEE account. If they have, then well done. If not, it's interesting anyway, but not ready for application.
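
    A toy Python sketch of that chain, with os.urandom standing in for the physical entropy source and a crude health check loosely modeled on SP 800-90B's repetition count test; none of this is the paper's design:

      import hashlib
      import os

      def entropy_source(n: int) -> bytes:
          # stand-in for the physical source; the OS pool is used purely for illustration
          return os.urandom(n)

      def repetition_count_ok(samples: bytes, cutoff: int = 32) -> bool:
          # crude online health test: flag the source if any byte value
          # repeats too many times in a row
          run, prev = 0, None
          for s in samples:
              run = run + 1 if s == prev else 1
              prev = s
              if run >= cutoff:
                  return False
          return True

      def extract(samples: bytes) -> bytes:
          # conditioning step: distill partially entropic data toward full entropy
          return hashlib.sha256(samples).digest()

      raw = entropy_source(4096)
      if repetition_count_ok(raw):
          seed = extract(raw)   # feed a CS-PRNG with this if higher output rates are needed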

  • If this can be effectively commercialized, it would be a game-changer (no pun intended).

  • Over 10 years ago someone invented a PCI card that splits photons left or right over some kind of quantum thing and it's provably flawlessly random. Why is someone bothering to try and outdo that?
    • Well, for one thing, does your phone have a PCI slot? Is the technology you reference usable, in a practical way, in small technology; that is to say, small enough, inexpensive enough, easy enough to manufacture in bulk, no excessive power requirements, and so on?
  • Is a reverse biased pn junction a random noise source? Is this better? In what way?
