A Quantum Linear Equation Solver 171
joe writes "Aram Harrow and colleagues have just published on the arXiv a quantum algorithm for solving systems of linear equations (paper, PDF). Until now, the only quantum algorithms of practical consequence have been Shor's algorithm for prime factoring, and Feynman-inspired quantum simulation algorithms. All other algorithms either solve problems with no known practical applications, or produce only a polynomial speedup versus classical algorithms. Harrow et al.'s algorithm provides an exponential speedup over the best-known classical algorithms. Since solving linear equations is such a common task in computational science and engineering, this algorithm makes many more important problems that currently use thousands of hours of CPU time on supercomputers amenable to significant quantum speedup. Now we just need a large-scale quantum computer. Hurry up, guys!"
Hm (Score:3, Funny)
I'll wait until I can program in VB. Will it take long?
Re:Hm (Score:5, Funny)
Is this algorithm in Haskell or somethin'?
I'll wait until I can program in VB. Will it take long?
It may or may not.
Re: (Score:2, Funny)
Also, it may AND may not.
Re: (Score:3, Funny)
Following the protocol of quantum physicists to be un-understandable by anyone but the top people in their field and 13-year-olds with too much time who watch NOVA and read Popular Science, they wrote it in Perl
Re: (Score:2)
Following the protocol of quantum physicists to be un-understandable...
"Derstandable"!?
That word is incomprehensible to me. I am unable to understand it, is what I'm trying to get at...y'know: baffling, beyond comprehension, clear as mud, cryptic, Delphic, enigmatic, fathomless, Greek, impenetrable, incognizable, inconceivable, inscrutable, mysterious, mystifying, obscure, opaque, perplexing, puzzling, sibylline, unclear, unfathomable, ungraspable, unimaginable, unintelligible, unknowable.
Re: (Score:2)
And used a Brownian Motion generator (say, a fresh cup of really hot tea)...
Re: (Score:2)
Sure it might "help" VB but just think of the possibilities with PHP...
$numberClueless = "0";
if(empty($numberClueless)) print("Yay for PHP");
Re: (Score:3, Informative)
No, really!!
Quantum::Superpositions [cpan.org]
n to log(n) (Score:3, Informative)
I'm sure it's still a significant result and there's a good chance they did something new that can be used in other applications.
Re: (Score:3, Insightful)
Often, one does not need to know the solution x itself, but rather an approximation of the expectation value of some operator associated with x, e.g., x†Mx for some matrix M. In this case, when A is sparse and well-conditioned, with largest dimension n, the best classical algorithms can find x and estimate x†Mx in O(n) time. Here, we exhibit a quantum algorithm for solving linear sets of equations that runs in O(log n) time, an exponential improvement over the best classical algorithm.
This is a long quote, which is at the top of the actual paper cited (I'd trim it down, but I'd need to brush up on my linear algebra to be sure not to ruin it).
According to them, the best algorithms are O(n) and their algorithm improves that to O(log n). It has nothing to do with P vs NP but it is an exponential improvement anyway (going from n to log n), as promised in the summary.
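For a rough feel of what going from n to log n buys, here's a sketch that just plugs numbers into the two bounds (constants, the condition number, and the algorithm's other parameters are all ignored, so this is only the shape of the claim, not a real cost model):

```python
import math

# Toy comparison of the two asymptotic bounds from the paper's abstract:
# ~n operations classically versus ~log2(n) for the quantum algorithm.
for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>13,}   classical ~ {n:>13,}   quantum ~ {math.log2(n):5.1f}")
```

Even at a billion unknowns, log2(n) is still under 30, which is why "n to log n" really is an exponential separation.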
Re: (Score:2)
Re: (Score:2)
As for expansion, well, let's just hope there's a similar speedup to be found in linear programming :)
Simplex method forever, Baby!
Re: (Score:2)
A Slashdot summary that is still technically correct, despite the usual omission of key facts?
I think your discovery is perhaps even more notable than the algorithm!
Re: (Score:2)
> I think you'd need a ginormous matrix for this to be useful
There are ginormous matrices. For example finite element analysis often uses matrices with millions of rows.
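To make that concrete: a dense million-row matrix is hopeless to even store, but the matrices FEM produces are sparse. A minimal sketch, using a toy dictionary-of-keys format rather than a real FEM assembly or a real sparse library:

```python
# Toy sparse matrix: store only the nonzero entries as {(row, col): value}.
# A tridiagonal n x n matrix (typical of 1-D finite element / finite
# difference problems) has about 3n nonzeros instead of n^2 entries.
n = 100_000
A = {}
for i in range(n):
    A[(i, i)] = 2.0
    if i > 0:
        A[(i, i - 1)] = -1.0
    if i < n - 1:
        A[(i, i + 1)] = -1.0

print(len(A))   # nonzeros actually stored: 3n - 2
print(n * n)    # entries a dense matrix would need: 10^10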
Re: (Score:2)
Analysis of pollution monitoring stations, using historical wind flow analysis to determine where in the world pollution has been released from.
ABSOLUTELY ginormous matrices are used for this.
Practical Application?! (Score:2)
Yes, because all quantum algorithms are hugely practical these days...
Re: (Score:2)
You have your semantics wrong. Shor's algorithm has enormous practical application. However, as there are no computers to run it on, it's impractical to actually put it to use.
arXiv articles - question (Score:3, Informative)
Are arXiv articles peer-reviewed?
If not (as I suspect), that puts a serious question mark on them. On the other hand, there are excellent non-peer-reviewed scientific articles, and almost all the scientific articles produced in Europe before the Second World War were not peer-reviewed (back then peer review was an American practice, and a very good one, I would add). However, nowadays peer review is a valuable and available resource that should be utilized.
THAT SAID again... some of the best, most innovative scientific articles are not being, nowadays, accepted for publication because the reviewers are one degree too dumb for the article. Eventually they do get published, but with an unnecessary 5-6 year delay.
Also, I have read in my life thousands of published peer-reviewed articles, and many of them contain incredible imbecilities where I have to question whether all the reviewers were high on hallucinogens. Very very high on hallucinogens.
Re: (Score:2)
I thought it had been nearly proven that peer review for highly specialized complex subjects was pretty much worthless, since most reviewers will not understand the subject matter and also won't be willing to admit that they don't understand it. Didn't some students at MIT create a gibberish-generating program that was able to produce papers that pass peer review?
The basic problem is, for truly ground breaking research, there often isn't a ready supply of peers to do the review. There are plenty of subject
Re: (Score:2)
Didn't some students at MIT create a gibberish-generating program that was able to produce papers that pass peer review?
I doubt it, unless they were sociologists :)
Peer review may not be able to reliably tell for certain if something is correct, but it's often a good mechanism for spotting things that are wrong.
You don't need ten years post-doc work in QCD to spot an error in addition, though if you're trying to extend the number of colours in a gauge theory you probably do :)
Re: (Score:2)
Didn't some students at MIT create a gibberish-generating program that was able to produce papers that pass peer review?
I don't know. But I sure as heck would like to - any more info on this so I can look it up?
Re: (Score:3, Interesting)
Yes and no. It was a program called SCIgen. The purpose was to weed out conferences that are crap - the ones that exist to take money rather than really promote scholarly work.
---
About:
SCIgen is a program that generates random Computer Science research papers, including graphs, figures, and citations. It uses a hand-written context-free grammar to form all elements of the papers. Our aim here is to maximize amusement, rather than coherence.
One useful purpose for such a program is to auto-generate submissions to conferences that you suspect might have very low submission standards.
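The "hand-written context-free grammar" idea is simple enough to sketch in a few lines. Here's a toy version (the real SCIgen grammar is of course vastly larger; the vocabulary below is made up, merely SCIgen-flavoured):

```python
import random

# Miniature SCIgen-style generator: UPPERCASE symbols are nonterminals,
# everything else is emitted verbatim. Expansion picks productions at random.
GRAMMAR = {
    "TITLE": [["ADJ", "NOUN", "for", "ADJ", "NOUN"]],
    "ADJ": [["scalable"], ["decentralized"], ["probabilistic"], ["lossless"]],
    "NOUN": [["archetypes"], ["methodologies"], ["epistemologies"]],
}

def expand(symbol, rng):
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal: emit as-is
    production = rng.choice(GRAMMAR[symbol])
    return [word for s in production for word in expand(s, rng)]

title = " ".join(expand("TITLE", random.Random(42)))
print(title)   # a random but perfectly grammatical-looking paper title
```

Coherence is nobody's goal here, which is exactly the point of the WMSCI sting.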
Re: (Score:2)
I read about the WMSCI 2005 fiasco. That's not really a merit of the paper generated by SCIgen, I think, but the fact that WMSCI 2005 did not actually review the submitted papers. Correct me if I'm wrong. In any case, WMSCI 2005 comes out as total losertown.
I briefly read through the WMSCI 2009 call-for-papers, and they now seem to have a double-blind review, and something strange: "a non-blind, open reviewing by means of 1-3 reviewers suggested by the submitting authors." I can see potential for
Re:arXiv articles - question (Score:4, Funny)
Are arXiv articles peer-reviewed?
No, they aren't [arxiv.org].
Re: (Score:2)
arXiv holds preprints, so no -- they've not been peer-reviewed.
However it has moderation and endorsement systems which (in theory) spot any Archimedes Plutonium wannabes.
Re: (Score:2)
Peer-review isn't always what it's cracked up to be. Read about the Bogdanov controversy that erupted in my uni some years back that exposes some serious flaws in the peer-review process.
http://en.wikipedia.org/wiki/Bogdanov_Affair [wikipedia.org]
Re: (Score:2)
Peer-review isn't always what it's cracked up to be. Read about the Bogdanov controversy that erupted in my uni some years back that exposes some serious flaws in the peer-review process.
http://en.wikipedia.org/wiki/Bogdanov_Affair [wikipedia.org]
Or read about the ongoing El Naschie affair:
http://golem.ph.utexas.edu/category/2008/11/the_case_of_m_s_el_naschie.html [utexas.edu]
Where are mod points when you need them? (Score:2)
Also, I have read in my life thousands of published peer-reviewed articles, and many of them contain incredible imbecilities where I have to question whether all the reviewers were high on hallucinogens. Very very high on hallucinogens.
Could someone mod the parent up +1 whatever [insightful, informative, funny - really, a "tragicomic" mod point would be most appropriate here].
Back in the day, Goro Shimura used to say that something like 50% of all published mathematics is simply rank nonsense, but the
Re: (Score:2)
I don't read maths articles. My field of research is in nanosciences, materials science, physical chemistry etc. etc. (chemistry is really quite important for my research on multiple levels). In the kinds of articles I read, the nonsense is easy to spot - for me. I'm quite pedantic (or anal, if you prefer) when it comes to scientific rigour.
I take it you're a maths researcher? I just had a nice mini-thread/discussion with a maths grad student; he tells me it takes quite a long time to read a maths article.
Quis custodiet ipsos custodes? (Score:2)
I think reviewers should be held accountable in some way.
Quis custodiet ipsos custodes? [wikipedia.org]
The truth of the matter is that the problem is inherently intractable.
If you have good people writing papers, then you will get good papers.
If you have lousy people writing papers, then you will get lousy papers.
What people need to realize [and what almost all people in the grant-writing racket lack the strength of character to admit] is that most published "research" is simply false.
And if to "false" you wer
Re: (Score:2)
By grant-writing racket, do you mean grant-application-writing racket, or grant-granting (uh, what's the verb, help me out..) racket? Two separate sets of people. That said, I agree with the general idea of your post. Though I strongly disagree that almost all published research is either false or "trivially tautological or essentially meaningless". There's plenty of good stuff out there. At least in natural sciences.
C'mon now - "almost all"? You didn't really mean it, did you?
Re: (Score:2)
Wow, you've read thousands of published peer-review articles, but you don't know whether or not arxiv articles are peer-reviewed?
Colour me skeptical.
I have never read a single arXiv article. I've seen a few titles and abstracts, though.
Re: (Score:2)
I am calling bullshit on this. I am currently in graduate school at a major university. There are several world-famous professors here, and one in particular is known for his ability to sift through papers extremely quickly. He doesn't read them, just skims to get the basic idea. He only gets through a paper a week, maybe two. Assuming you are more consistent than him and read two per week, and being more generous still, you blaze through 100 a year. You would be working at this pace for 20 years. While not impossible, highly unlikely. Especially since you mentioned that you "read" them, not skim, and furthermore that you are able to check them for trivialities, which takes considerably more time, considering that you would have to evaluate the status of the paper.
And if I was thoughtless and narrow-minded like you, I'd call bullshit on what you just wrote there, because my experience is completely different. I read (not just skim through) one to two papers a day on a normal day, and more when I'm writing. I read more than a hundred papers when I wrote my review. I read (and work in) the field of nanotechnology, materials science, (integrated and semiconductor) chemical and other sensors, and chemistry in general is extremely important for my research. The reason I d
Re: (Score:2)
Apology very much accepted. This behaviour shows honesty and a lack of ego (both important in science... and life).
Now.. are you telling me your PhD work will only have 6 references? That's, again, almost incredible to me. A colleague of mine and I have come up with a rough number of references per PhD: it's about the number of pages, including appendices. In your field, it seems it's about a tenth of that or less. Which explains the one-paper-per-week or less. I love maths, but I am glad I'm where I am
Looks like a big deal to me. (Score:5, Informative)
Finally a cool article on /. This is extremely cool! There are a lot of problems in the real world that have extremely large sparse matrices that need to be inverted. Fluid dynamics and solutions to Maxwell's equations come to mind. But I am sure there are other applications in relativity and plasma physics. Estimating a solution to a linear dynamic system of, say, 2^128 degrees of freedom in only 128 cycles would change a lot of things.
And... Yes, we [uibk.ac.at] are working very hard on building the computers.
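For reference, the classical workhorse for exactly these large sparse systems is an iterative method such as conjugate gradient, whose per-iteration cost scales with the number of nonzeros. A bare-bones sketch for small symmetric positive-definite systems (dense lists of lists here, just to show the idea; a real code would use a sparse matrix-vector product):

```python
def conjugate_gradient(A, b, tol=1e-12, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A (lists of lists)."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                    # residual b - A*x, with x = 0
    p = list(r)
    rs_old = sum(v * v for v in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
# exact solution of this 2x2 system is (1/11, 7/11)
```

In exact arithmetic CG converges in at most n iterations; in practice the condition number governs it, which is also the kappa that shows up in the quantum algorithm's running time.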
Avinatan Hassidim and Seth Lloyd (Score:5, Informative)
So please cite this as "Harrow, Hassidim and Lloyd" and not "Harrow and coauthors."
That said, we're pleased about that attention. :)
In response to one question: the matrix inversion problem for the parameters we consider is (almost certainly) not NP-complete, but instead is BQP-complete, meaning that it is as difficult as any other problem that quantum computers can solve. We plan to update our arXiv paper shortly with an explanation of this point.
Re: (Score:2, Insightful)
I work in CFD, so this is all thrilling to me. I suspected it was only a matter of time before methods were discovered for applying quantum computation to large systems of linear equations, and I certainly hope your work stands up to peer review. Cheers!
Re: (Score:3)
It's Mike Wolfe. Congratulations on making Slashdot, if this is your first time. What a happy surprise seeing your name there!
-Mike
Re: (Score:2)
It's awesome that you posted here :)
A quick question if I may..
Could raytracing be done efficiently on quantum computers?
Re: (Score:2)
Great work, Aram! I can't believe you managed to get your name on slashdot. I seethe with jealousy.
Heck, I'd settle for having a decent paper to put on arXiv!
-Dan Lenski
Re: (Score:2)
I see a new sci-fi channel B-movie in your post...
Progress Bar? (Score:2)
What kind of side effects will a progress bar have, since you can't know what it's doing if you know where it is?
Other algorithms (Score:2)
A cool algorithm that should be possible on a quantum computer is "perfect" data compression. IOW, "what is the smallest Turing machine + input string that outputs the following string in less than 1 billion steps?"
Such an algorithm would need a quantum computer to run, but the decompression could happen on a classical computer.
Anyone aware if such an algorithm exists? The summary would seem to indicate not.
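As a classical toy version of "smallest program that outputs the string": restrict the "machines" to a single fixed decompressor (repeat a block k times) and brute-force the shortest description. Real time-bounded Kolmogorov search would have to enumerate arbitrary programs, which is exactly the exponentially hard part this sketch dodges:

```python
def shortest_repeat_description(target):
    """Brute-force the shortest (block, count) 'program' whose output,
    block * count, equals target. A toy stand-in for searching over all
    Turing machines, which is what makes real 'perfect' compression hard."""
    n = len(target)
    best = None
    for plen in range(1, n + 1):
        if n % plen:
            continue
        block, count = target[:plen], n // plen
        if block * count == target:
            cost = plen + len(str(count))      # crude description length
            if best is None or cost < best[0]:
                best = (cost, block, count)
    return best

cost, block, count = shortest_repeat_description("abcabcabcabc")
# shortest description under this decompressor: block "abc" repeated 4 times
```

Decompression (just `block * count`) is trivial on a classical machine, matching the parent's point that only the search side would need the quantum computer.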
Comparing Apples to Apples (Score:4, Interesting)
This looks like a really interesting result, but having looked a little at the paper, it's not 100% clear to me what the quantum algorithm is being compared to. First, a caveat: I study quantum physics but not CS, so my knowledge of algorithms and complexity theory is quite limited. Anyway, in solving problems on a good old classical computer, you can seek to solve a linear system exactly (up to the bounds of finite arithmetic) or you can seek to solve it approximately to within some error.
So the thing I'm wondering is what classical algorithm are they comparing their result to, and is this comparing apples to apples? They say
So it looks like the authors' algorithm gives an approximate solution to the linear system and they're comparing it to a classical algorithm that gives an exact solution to the problem. This seems like comparing apples to oranges. Perhaps someone who knows more about the features of the various classical algorithms can comment on whether it looks like the correct comparison to make and why.
I bring this up because I recall that given a matrix representing the density matrix of a bipartite quantum system, determining whether it represents an entangled or separable quantum state is in general NP-Hard, but IIRC there exist semi-definite programming techniques to get the answer probabilistically in polynomial time. The point is that in that case there's a big gain for accepting an answer that will be wrong every once in a while. I was just curious if settling for an approximate answer in solving linear systems changes the complexity of that problem drastically as well.
Re: (Score:2)
Of course any quantum computing algorithm (including Shor's) has the same issue.
In practice, once the probability of the answer being incorrect is smaller than some threshold (e.g. the probability of the classical computer giving a wrong answer due to cosmic rays) it really doesn't matter that you're using a probabilistic algorithm.
Re: (Score:2)
It's true that something like Shor's algorithm is probabilistic, and so, yes, you're always sort of comparing apples to oranges. So in that situation you can define a probability distribution on the amount of time it will take to get the correct answer. To compare apples to apples you simply compare some number like the mean time for solution, or the 95th percentile longest time for solution (there's also the question of best- and worst-case scenarios) for the two algorithms. If the classical algorithm
Re: (Score:2)
Let's say the probability is 1/2. The solution to the system can be verified in polynomial time. If the solution is incorrect, run the algorithm again.
After you run it N times, the probability of error is 1/(2^N), and you only wasted polynomial time on verification. Clearly, when your probability of error is smaller than the cardinality of possible outcomes, the algorithm will perform as designed in a sane amount (polynomial) of time.
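The repeat-and-verify arithmetic in the parent is easy to make concrete (assuming each run fails independently with probability p and that verification always catches a wrong answer):

```python
import math

def runs_needed(p_fail, target):
    """Independent repetitions needed so that the probability that *every*
    run returns a wrong (and hence rejected) answer drops below target."""
    return math.ceil(math.log(target) / math.log(p_fail))

n = runs_needed(0.5, 1e-9)
# with p_fail = 1/2, 30 runs suffice: (1/2)**30 is about 9.3e-10 < 1e-9
```

Thirty polynomial-time verifications is nothing, which is why probabilistic algorithms with checkable outputs are fine in practice.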
Re: (Score:2)
I'm not sure the number of possible outcomes has anything to do with it. These quantum algorithms can in principle give the same wrong answer over and over again (though this is extremely improbable), so it's not like you exhaust the set of wrong answers or something. In any case, it is certainly true that you can put a probability distribution on the time it will take the algorithm to get the answer, and you ask how long it will take to get a solution on average or that it will take less than T operation
Re: (Score:2)
It's not completely addressed, or at least all the implications are not spelled out. They talk about dealing with sparse matrices, but they would have to be O(log n) sparse to be useful for a single application of the algorithm. For example, a matrix with n=1000000 would have to have around 15 non-zero values and a trillion zero values.
Re: (Score:3, Informative)
They talk about dealing with sparse matrices but they would have to be O(log n) sparse to be useful for a single application of the algorithm. For example, a matrix with n=1000000 would have to have around 15 non-zero values and a trillion zero values.
Wrong. An upper bound in big O notation [wikipedia.org] does not give any indication as to constant factors of the "real" bound (nor to other summands which are in o(log n)). In fact, your statement doesn't even make sense, because it is an asymptotic bound and hence can't be applied to a fixed input size.
Re: (Score:2)
It's an example, to give a feel for the order of magnitude. If the particular constant was very large, one would have to increase n to show the same effect, but the idea would hold. In addition, it has n
It IS bogus (probably) (Score:4, Interesting)
Re: (Score:2, Interesting)
Basically we need to assume random-access memory. There has been some preliminary work on models for quantum RAM, some of which we cite in our paper. But as a sign that the model isn't too powerful, we k
Re:not able to be used == not useful (Score:5, Insightful)
When the Prime Minister asked him about a new discovery, "What good is it?", Faraday replied, "What good is a newborn baby?"
Re: (Score:2, Troll)
Maybe it's just a story, but I've read something about Faraday that made me think about what you've written.
When the Prime Minister asked him about a new discovery, "What good is it?", Faraday replied,
"What good is a newborn baby?"
Wrong; actually it was in reference to his new electric motor, and the proper quote is: when asked by the king what use it was, Faraday replied, 'One day, sir, you may tax it.'
Note also that at that point the electric motor existed; quantum computers do not, not in any useful sense.
Re: (Score:3, Insightful)
He's my number 1 for being a genius AND a manwhore.
Re: (Score:2)
"All cats are gray in the dark."
(Franklin on the advantages of older women. He wrote a whole essay on the subject. His basic thesis was that the "juices" -- I think that was his word -- move down in the body as a woman ages...)
Re:not able to be used == not useful (Score:5, Insightful)
You think that quantum computers are not able to justify grants or PhDs? What world are you living in?
Yeah, because classical computers were never useful to anyone (or anyone important) until datacenters existed.
And until then, developments that bring us closer are irrelevant? Applications that could give us more reason to develop the technology are pointless?
What exactly is your point here?
Re: (Score:2)
You think that quantum computers are not able to justify grants or PhDs? What world are you living in?
One where I read people's comments properly before replying to them. I did not say that at all.
Yeah, because classical computers were never useful to anyone (or anyone important) until datacenters existed.
We use datacentres *now*; we didn't use them before. These days people in the real, working world need access to computing power in ways that we didn't before. I am talking practicality - the kinds of applications that get companies investors.
And until then, developments that bring us closer are irrelevant? Applications that could give us more reason to develop the technology are pointless?
What exactly is your point here?
Irrelevant is not the same as useless. It is, 'a lack of relevance', look it up. Academia will continue to explore the issue, but until we see deployments of quantum computers
Re: (Score:3, Insightful)
Eh, in that case your post is irrelevant. It amounts to saying "Until we have x, people won't be able to use x, so it will only be of interest to people studying creating x". It goes without saying, and people on Slashdot are clearly interested in this.
Re: (Score:3, Insightful)
I know quantum is the 'new toy' in computer science, but seriously, until these quantum computers exist and are cheap enough to fill datacentres with, no-one outside of academia is going to get any useful work from them.
Er, yes. If there aren't any quantum computers then quantum computers aren't useful. That's not the point of this work. The point of this work is primarily to present a new algorithm for which quantum computing shows exponential speedup, which is an interesting theoretical issue in computer science. If quantum computers ever become practical, then this algorithm will be too, but no one has claimed that this algorithm is useful today. (The submitter noted explicitly that it is not.)
Re: (Score:3, Insightful)
I could just as well argue that there is nothing useful about developing quantum computers until we have programs we can run on them.
This is not tangential science. The is the real groundwork for the development of the new technology. Without this sort of work quantum computers will simply never exist.
So pointing out its lack of present utility is like pointing out that after laying the foundation for a house that you still have nothing to live in. That may be true, but in so far as the initial step
Quantum Computing needs a language (Score:2)
Do you realize that until they begin solving these problems in academia there will be no industry-ready solution? This is a whole new advent of computing, really; with three-dimensional processors coming online at around the same time, we may see the same sort of increase in computational power that they saw with the first electric computers.
These are some workable challenges that lie ahead in the next few years.
1. Not only will a lot of algorithm design need to be hacked out but I think Quantum
Re: (Score:2)
Did you know that the foundations of computer science were laid down before the existence of the first electrical computer?
Did you know that, when electrical computers came into existence, having this science all laid out already was actually pretty useful? Kept people from standing around going "Okay now what do we do?" for several years.
In short, your view of utility is short-sighted and lacks vision.
Re:not able to be used == not useful (Score:5, Interesting)
It's not a 'new toy' for computer science. Computer science has pretty much nothing to do with it. Mostly, it's a toy, or rather, an academic pursuit for theoretical physicists.
No, it's of real interest to theoretical computer science. Quantum computing defines a new class of algorithmic complexity: there are, for instance, sub-exponential quantum algorithms for problems which have only exponential-time classical solutions. There is a whole subindustry of algorithmic complexity theory devoted to exploring the differences between algorithms that can be executed on quantum computers and those which are executed on classical computers. Scott Aaronson's lecture notes [scottaaronson.com] are a good introduction to this subject.
Re: (Score:2)
So you're basically saying that probabilistic computation isn't math? That doesn't make any sense. Just because we have computers that are extremely reliable doesn't mean that probabilistic methods are unimportant. Quantum computers are showing us that they are.
Richard Feynman would disagree. Please read his work on the Manhattan project.
Re:not able to be used == not useful (Score:4, Informative)
If you think quantum computing is about putting a physical system in a certain state and manipulating it, then you should also think classical computing is about setting a tape in a certain way and manipulating it.
When Turing first described a computer (a Turing machine), it was essentially a mechanism where you put a tape in a certain state, manipulate it according to certain rules, and in the end you get the tape in another state containing the "result" of your computation. What was so interesting about it is that it was theoretically possible to build a machine that would be able to do the manipulations automatically.
Quantum computation is analogous: take a certain vector in an n-dimensional Hilbert space, multiply it by these certain matrices in this specific order, and in the end you get a vector that corresponds to the "calculation" you did. This is absolutely only math; it has nothing to do with physics. What's exciting about this is that we expect to be able to build (real) machines that perform these operations way faster than any computer today can, because the specific operations with which we built our algorithm are exactly the things nature is capable of doing fast.
Of course, we only discovered this when we discovered quantum mechanics -- that's the relation to physics.
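The "vectors and matrices" description above can be made concrete in a dozen lines. A sketch of a one-qubit computation, pure math and no physics (real amplitudes suffice for this gate, so no complex numbers needed):

```python
import math

# A qubit state is a 2-vector of amplitudes; a gate is a 2x2 unitary matrix.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # the Hadamard gate

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]                 # |0>
state = apply(H, state)            # equal superposition of |0> and |1>
probs = [a * a for a in state]     # Born rule: measurement probabilities
# probs is [0.5, 0.5] (up to rounding); applying H again returns |0>
```

Nothing here is physical; the physics only enters when you ask what hardware can perform these matrix multiplications natively.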
Re: (Score:2)
Ah, so your problem is not with physics, but with probability.
The thing is, there's no "abandoning of what is mathematically provable" in studying probabilistic algorithms. Computer science has studied for decades complexity classes like BPP [wikipedia.org] (bounded-error, probabilistic, polynomial time) which contains many useful algorithms, like for instance the Miller-Rabin primality test [wikipedia.org], which is used by OpenSSL to check if a number is prime.
So, whether you like it or not, it's already part of computer science.
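Miller-Rabin itself fits in a few lines and shows the BPP flavour directly: each random witness either proves compositeness outright or leaves "probably prime", with the error probability shrinking at least fourfold per round. A sketch (not the exact OpenSSL implementation, just the textbook test):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin: randomized primality test with one-sided error."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:            # write n - 1 = d * 2**s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False         # a is a witness: n is definitely composite
    return True                  # no witness found: n is probably prime
```

A "composite" answer is always correct; only "prime" carries the (tiny, bounded) error, which is exactly the one-sided structure that puts this in co-RP and hence BPP.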
Re: (Score:2)
First, go read the paper and then say that it doesn't contain "any kind of math."
Second, explain how shooting electrons through doped silicon crystal gates, and then measuring the voltage across the end terminals isn't a physical measurement being put on equal footing with "pure math." I think that it is, so deterministic algorithms aren't algorithms either by your definition.
Third, you're wrong. Let's take a 65536-bit-long encryption key, the product of two very large primes. Let's factor them! The
Re: (Score:2)
Third, you're wrong. Let's take a 65536-bit-long encryption key, the product of two very large primes. Let's factor them! The best known deterministic algorithm is going to take longer than the universe will last. But I bet it takes well under a second to check an answer for correctness! Let's say you have a quantum "algorithm" Q. You are correct, you can't prove Q(key) will produce the correct factorization. But you can CHECK it in only a second. So, run Q(key) until your check passes. There, that's an algorithm by your definition. You just used math to prove that if you run your quantum + classical hybrid computer on that input, it will always output the correct result.
Well, technically all we know is that it will never output an incorrect result. It could just continue to run until the sun goes nova, though of course this result is vanishingly improbable. Not that I disagree with your main point at all, I just like nitpicking!
That said, I think what the GP is concerned about is that with traditional algorithms you can, if you have enough time and paper, sit down and follow the same steps the computer does to do the calculation by hand. What he doesn't realize is that
Re: (Score:2)
If you do that, I think you're turning Computer Science from having been a branch of Mathematics into some completely different animal.
Algorithmic complexity theory has always depended on what you define a "computer" to be. An algorithm is an algorithm, but its space and time requirements require assumptions about what is executing the algorithm. That doesn't mean that the algorithm is non-mathematical. It means it uses different assumptions about what a computer is. That doesn't make quantum computing any less mathematical than classical computing. It just makes the mathematics different.
It is not mathematically proven or provable (beyond the proof that the most probable state is the desired result).
Non-deterministic algorithms have probabilisti
Re: (Score:2)
I don't see a big difference in mathematical content between "if you start with this state, and perform these transformations you will have this other state" and "if you start with this state, and perform some transformations, then you will have this other state".
Those are descriptions of classical and quantum algorithms, basically.
Now the only question is how to translate the final state into "the answer". In the case of classical algorithms the translation is usually very simple. In the quantum case, th
Re: (Score:2)
RSA security is dependent on the difficulty of factoring large primes, and this seems like it would reduce the time required to solve the problem considerably.
Factoring primes is the easiest freaking thing in the world.
Give me a prime, any prime. Ok, its factors are it and 1.
Perhaps someone more versed in mathematics can shed some light on this.
HTH HAND.
Re:Implications (Score:5, Funny)
Does this mean they can solve P=NP?
Yes: N=1.
Re: (Score:2)
Or if P=0, in which case N can be any number.
Re: (Score:3, Informative)
You're both wrong. They're equivalent.
From wikipedia:
"A Turing machine can simulate these quantum computers, so such a quantum computer could never solve an undecidable problem like the halting problem. "
Re: (Score:2)
It's not a question of computability, it's a question of efficiency (time complexity).
Nondeterministic and deterministic Turing machines are also equivalent in the power of expression, but most likely not equal in terms of time complexity (otherwise P=NP).
Re: (Score:2)
A nitpick to avoid confusion: the above comment is talking about the efficiency of computers. So, you should read
Quantum computers solve problems in BQP.
as "Quantum computers efficiently solve problems in BQP" (that is, by the way, the definition of BQP [wikipedia.org]). Ditto for the following paragraph: "we do not know if a quantum computer can efficiently solve any problems a classical computer (with a random source) can't (solve efficiently)."
It also seems a little nonsensical to ask if a quantum computer can solve P=NP. "P=?NP" is a mathematical question, so this is like asking if a quantum computer can prove Fermat's Last Theorem.
Re: (Score:2)
"so this is like asking if a quantum computer can prove Fermat's Last Theorem. "
Any hopes for a speed-up on Automated-Theorem-Proving?
Re: (Score:2)
Any hopes for a speed-up on Automated-Theorem-Proving?
Automated theorem proving (using propositional calculus) is NP-complete, so it's the same "no, to the best of our knowledge". (Using first order predicate calculus is even worse -- the algorithm is not even guaranteed to terminate).
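The blow-up behind that "no" is visible even in the naive decision procedure: checking a propositional formula over n variables means trying all 2^n truth assignments. A minimal sketch (formulas represented as plain Python predicates, purely for illustration):

```python
from itertools import product

def is_tautology(formula, n_vars):
    """Brute-force validity check: tries all 2**n_vars assignments,
    exponentially many in the number of variables."""
    return all(formula(*vals)
               for vals in product((False, True), repeat=n_vars))

# De Morgan's law is valid; a plain disjunction is not (fails at False, False).
demorgan = lambda p, q: (not (p and q)) == ((not p) or (not q))
print(is_tautology(demorgan, 2))             # True
print(is_tautology(lambda p, q: p or q, 2))  # False
```

Fine at 2 variables, hopeless at 200, which is the whole point.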
Re:Polynomial Speedup (Score:5, Informative)
To rephrase the other response, log vs. polynomial is the same as polynomial vs. exponential. Moving from polynomial time to logarithmic time is an exponential speedup (just as moving from exponential time to polynomial time is).
O(n^3) vs. O(log n) -> O(exp(n^3)) vs. O(n)
Re: (Score:2)
To clarify:
O(n) vs. O(log n) -> O(2^n) vs. O(n)
Re:Bzzzt, wrong! (Score:4, Interesting)
if we are solving a linear system, we need A to be square
No. You can "solve" an under-determined system (i.e., reduce it). You can also "solve" an over-determined system via a least-squares fit. Both are common problems in scientific computing.
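A tiny worked example of the least-squares case: fit y = c0 + c1*t to four points, i.e. "solve" an over-determined 4x2 system via the normal equations (the data here is made up; a real code would use a library routine rather than hand-rolled sums):

```python
# Over-determined system: four equations c0 + c1*t = y, two unknowns.
# The least-squares "solution" minimizes the sum of squared residuals,
# obtained here by solving the 2x2 normal equations directly.
pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.1)]
n = len(pts)
St  = sum(t for t, _ in pts)
Sy  = sum(y for _, y in pts)
Stt = sum(t * t for t, _ in pts)
Sty = sum(t * y for t, y in pts)
det = n * Stt - St * St
c1 = (n * Sty - St * Sy) / det      # slope
c0 = (Sy * Stt - St * Sty) / det    # intercept
# best fit for this data: y ~ 0.98 + 2.03 * t
```

No exact solution exists (the last point is slightly off the line), yet the least-squares answer is perfectly well defined, which is why "A must be square" is too strong.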