Science Technology

New Quantum Computing Record Set By Recycled Photons

CelestialScience writes "A recycling technique has enabled a quantum computer to carry out a quantum calculation known as Shor's algorithm on a larger number than ever before. The benchmark algorithm exploits quantum mechanics to simplify the factorization of numbers into their prime components — a hard task for classical computers when the numbers get large. Until now, the largest number factorized using Shor's algorithm was 15. Now Anthony Laing at the University of Bristol, UK, and colleagues report in Nature Photonics that they used a recycled photon to factorize 21 — still far too small and trivial to spook cryptographers, who rely on the difficulty of factorizing large numbers for their widely used techniques. But a record nonetheless."
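
For context on what "running Shor's algorithm on 21" involves: the quantum hardware handles the order-finding step, and the rest is ordinary number theory done classically. Below is a minimal sketch in Python of that classical post-processing, with the order found by brute force instead of by a quantum routine, so it illustrates the structure of the algorithm rather than the photonic experiment reported in the paper.

    from math import gcd

    def shor_classical_part(N, a):
        # Shor's algorithm, minus the quantum part.  The quantum computer's
        # only job is to find the order r of a modulo N (the smallest r > 0
        # with a**r % N == 1); here r is found by brute force, which is
        # exactly the step that scales badly on classical hardware.
        g = gcd(a, N)
        if g != 1:
            return g, N // g          # lucky guess: a already shares a factor with N
        r = 1
        while pow(a, r, N) != 1:      # "quantum" order finding, simulated naively
            r += 1
        if r % 2 == 1:
            return None               # odd order: retry with a different a
        x = pow(a, r // 2, N)
        if x == N - 1:
            return None               # trivial square root: retry with a different a
        return gcd(x - 1, N), gcd(x + 1, N)

    # For N = 21 with base a = 2: the order of 2 mod 21 is 6, 2**3 = 8,
    # and gcd(7, 21) = 7, gcd(9, 21) = 3.
    print(shor_classical_part(21, 2))   # -> (7, 3)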

Comments Filter:
  • 7*3 (Score:3, Funny)

    by Anonymous Coward on Tuesday October 23, 2012 @06:04PM (#41745849)

    7*3. Nailed it!

  • by Anonymous Coward on Tuesday October 23, 2012 @06:05PM (#41745855)

    Seen this before.

  • This stuff scales incredibly badly with time. Not even a hint of a "Moore's law" here. At this rate I doubt they will be able to factor 1000 before the end of the decade.

    • by geekoid ( 135745 )

      Moore's law wasn't applicable at the beginning of transistor-on-wafer fabrication. Once the tools were in place, it started to run its course.

      • by gweihir ( 88907 )

        The effort invested in the first transistors and the effort invested in this don't compare at all. Transistors were initially thought to be useless. That turned out to be fantastically wrong. The converse is true for quantum computing.

        • First: bullshit. The people who actively developed working transistor devices -- specifically Bell Labs -- knew damn well that they were useful as a replacement for vacuum tubes. Maybe you're thinking of lasers?

          Second, regardless of the initial R&D effort, Moore's observation didn't apply until after the transistor had left the lab and was in full production, and the same will be true for quantum computers, if it ever applies to them at all.

          Complaining about the lack of exponential growth now is just ridiculous.

          • by gweihir ( 88907 )

            You are thinking of bipolar transistors. FETs were known a lot longer and are what make modern electronics tick.

        • by khallow ( 566160 )
          Transistors were a big step in the transition from unreliable and bulky vacuum tubes to integrated circuits. I would say that Moore's law was already in gear by the time of their development.

          As I see it, the dynamic of Moore's Law was threefold. First, it provided a simple model of how fast one should be developing integrated circuit technology. Second, there were plenty of zeros to run out Moore's Law for decades because as Feynman noted, "there's a lot of room at the bottom." And third, Moore's law sup
        • by mcgrew ( 92797 ) *

          Transistors were initially thought to be useless.

          Interesting, I hadn't heard that before, and I've been tinkering with electronics for over 45 years and have read hundreds of books about it. I looked it up on Wikipedia (quoted below) and found no such indication. Do you have a link? As I said, I found that interesting and would like to know more.

          The thermionic triode, a vacuum tube invented in 1907, propelled the electronics age forward, enabling amplified radio technology and long-distance telephony. The triode,

          • As I said, I found that interesting and would like to know more.

            Here's more: they made that up. It's a shoddy explanation for why exponential growth didn't hit transistor-based ICs immediately, used to fallaciously justify their irrational expectation that quantum computers should immediately experience exponential growth. Every step of the logical chain is wrong even if you assume the previous step was correct, and the starting "fact" was a lie.

            And now you know the whole story.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Even if it were more advanced now, it still wouldn't be much of a danger to cryptography.
      There are plenty of encryption algorithms for which quantum computing is not a threat.
      Sure, algorithms based on the discrete-log or factorisation problems would fall instantly, but something like McEliece would finally thrive.

    • You sound like the techies on the internet in the '90s who laughed at the idea that we'd EVER have gigabytes of RAM in a single computer, let alone within a few years.
    • by jovius ( 974690 )

      Quantum computing is a new and unexplored field of cats, so I'd say that the cat images count too.

    • by hweimer ( 709734 )

      This stuff scales incredibly badly with time. Not even a hint of a "Moore's law" here.

      There is. [quantenblog.net]

    • The thing with Moore's law is that it's a doubling time. If the number you're doubling is very, very small, doubling every 6 years (which is the actual doubling time for quantum computing so far) is not going to sound very impressive. But if we have 5 qubits today, by the end of the decade we should have around 15 and be able to factor numbers in the low tens of thousands (see the qubit-projection sketch after the comments). Twelve years after that and you can factor numbers in the quadrillions. Twelve years after that and you can factor numbers

  • by BitterOak ( 537666 ) on Tuesday October 23, 2012 @06:10PM (#41745903)
    Are photons so expensive that they need to be recycled? I can understand aluminum cans, but photons are taking it a bit far, I think.
    • For each photon recycled you get a one electron credit on your electric bill.
    • by sconeu ( 64226 )

      It's good for the photon, it's good for the quantum computer.

    • by Anonymous Coward

      As best I can guess from reading the article, the issue is that it is difficult to isolate the photon so that it can be used in the experiment. If they did not recycle it, they'd have to capture more. By running the experiments in series with one photon rather than in parallel with multiple photons, they make the experiment take longer to complete but easier to perform. This allows them to investigate qubit-intensive algorithms now rather than wait until they figure out a better method to build

    • by hicksw ( 716194 )

      One with the energy equivalent to a Higgs Boson would be very exciting indeed. Maybe.
      --
      Strong ethanol force tonight the is..

  • Now we can have lectures about the sustainability of quantum computing, and quantum computing can go green in an attempt to save the planet?

  • by Anonymous Coward

    Every odd number below 25 is either prime or divisible by 3 (and since 25 is a square, the same effectively holds for everything below 35).
    As such, a quantum computer these days may as well always set the bottom bit of the answer to 1 and randomly alternate the second bit.

  • This won't impress the Babylonians until we get to 60 + 17.

  • by Anonymous Coward

    15 is between 2^3 and 2^4. 21 is between 2^4 and 2^5. Next stop, 51!

  • by JoshuaZ ( 1134087 ) on Tuesday October 23, 2012 @07:34PM (#41746507) Homepage
    Note that quantum computers have already been used to factor larger numbers. As TFA discusses and this preprint http://arxiv.org/abs/1111.3726 [arxiv.org] from about a year ago reports, there has been success factoring 143. But they didn't use Shor's algorithm; rather, they used an adiabatic algorithm http://en.wikipedia.org/wiki/Adiabatic_quantum_computation [wikipedia.org]. TFA makes a slightly incorrect claim that the adiabatic quantum algorithm "unlike Shor's algorithm, is not mathematically guaranteed to provide faster performance for larger numbers." This is misleading: Shor's is known to provide a polynomial-time solution to factoring, but that is only known to be faster than the best known classical algorithms (a rough numerical illustration of that gap follows below). In this context, we still can't prove that factoring is hard in the sense of taking more than polynomial time on a classical computer. Such a result would be strictly stronger than P != NP http://en.wikipedia.org/wiki/P_versus_NP_problem [wikipedia.org], which is one of the biggest unsolved problems of mathematics today.
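
On the scaling point in the last comment, here is a rough back-of-the-envelope comparison in Python. The formulas are standard textbook approximations (the general number field sieve's heuristic sub-exponential work factor with constants and o(1) terms dropped, and a rough cubic gate-count estimate for Shor's algorithm); they are not figures from TFA and are only meant to show sub-exponential versus polynomial growth.

    from math import exp, log

    def gnfs_work(bits):
        # Heuristic GNFS work factor for a 'bits'-bit number N:
        # exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)), o(1) terms ignored.
        ln_n = bits * log(2)
        return exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3))

    def shor_gates(bits):
        # Rough O((log2 N)^3) gate-count estimate for Shor's algorithm.
        return float(bits) ** 3

    for bits in (64, 512, 1024, 2048):
        print("%4d bits:  GNFS ~ %.1e   Shor ~ %.1e" % (bits, gnfs_work(bits), shor_gates(bits)))

The point is just that the classical work factor blows up sub-exponentially while the quantum circuit size grows polynomially; whether factoring is truly outside P is, as the comment says, open.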
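
And for the qubit-projection comment further up, here is that arithmetic spelled out, taking the commenter's assumptions at face value. The six-year doubling time, the 5-qubit starting point in 2012, and the simplification that n qubits are enough to run Shor's algorithm on an n-bit number are all assumptions from that comment, not from the paper; real resource estimates call for more than n qubits, plus error correction.

    # Assumptions from the comment, not from TFA:
    #   - 5 usable qubits in 2012, doubling every 6 years
    #   - n qubits ~ enough to factor an n-bit number with Shor's algorithm
    qubits, year = 5, 2012
    for _ in range(5):
        year += 6
        qubits *= 2
        print("%d: ~%d qubits, numbers up to ~%.0e" % (year, qubits, 2.0 ** qubits))

On those generous assumptions, numbers "in the quadrillions" arrive somewhere in the early 2030s, which is roughly what the comment says.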
