Optical Solution For an NP-Complete Problem?

6 writes to let us know that two optical researchers have proposed, as a thought experiment, a novel idea for solving the traveling salesman problem. From the abstract: "We introduce an optical method based on white light interferometry in order to solve the well-known NP-complete traveling salesman problem. To our knowledge it is the first time that a method for the reduction of non-polynomial time to quadratic time has been proposed. We will show that this achievement is limited by the number of available photons for solving the problem. It will turn out that this number of photons is proportional to N^N for a traveling salesman problem with N cities and that for large numbers of cities the method in practice therefore is limited by the signal-to-noise ratio. The proposed method is meant purely as a gedankenexperiment."
  • by Anonymous Coward on Thursday August 09, 2007 @12:18PM (#20171931)
    I think a couple of guard dogs and a shotgun are a good enough method to solve the travelling salesman problem.
    • Re: (Score:3, Funny)

      by spun ( 1352 )
      You don't even need to go that far. In fact, if you don't have an attractive teenage daughter you should be all right. If you do, you simply do not let the traveling salesman sleep anywhere on your property. Or so I've heard.
      • Re: (Score:3, Funny)

        by AndersOSU ( 873247 )
        If you have a hot daughter odds are you already had a traveling salesman problem, you just didn't know about it.
  • by LiquidCoooled ( 634315 ) on Thursday August 09, 2007 @12:19PM (#20171947) Homepage Journal
    In order to solve the article and produce readable text in quadratic time, you have to use a novel technique: installing a PDF reader.
  • Obligatory (Score:5, Funny)

    by Tackhead ( 54550 ) on Thursday August 09, 2007 @12:21PM (#20171973)
    "Nothing to see here. Please move along, but no, no, no, not over that bridge, you idiot! That's another one of those fucking pathological edge cases that invalidates what would have been an otherwise great TSP equivalence proof, and now I have to start all over again! Curse you, Konigsberg, why didn't the Brits and the Russians and the Germans finish you off when you were fair game!"

    (Did I mention how much I hated my Computability and Complexity courses when I was in college?)

  • Parallel computing (Score:5, Insightful)

    by iamacat ( 583406 ) on Thursday August 09, 2007 @12:22PM (#20171991)
    So effectively each photon is a CPU core, and the running time is reduced by massively parallel computing rather than by an inherent reduction in complexity: the total work is still (N^N)*(N^2).
    • by bateleur ( 814657 ) on Thursday August 09, 2007 @12:37PM (#20172195)
      The ironic thing being that this is precisely why we care about whether NP=P or not. Because without a polynomial time algorithm, large problems remain intractable even after you massively parallelize them!
    • by CaptainPatent ( 1087643 ) on Thursday August 09, 2007 @12:38PM (#20172209) Journal

      So effectively each photon is a CPU core, and the running time is reduced by massively parallel computing rather than by an inherent reduction in complexity: the total work is still (N^N)*(N^2).
      No. While the language of the paper is indeed rather thick, it seems they are using interference to get individual photons of light to traverse every pathway simultaneously. Even if I am only partially correct there, the photons in the experiment are only detected and are never being used as an instrument for computation.
      • Re: (Score:3, Insightful)

        it seems they are using interference to get individual photons of light to traverse every pathway simultaneously. Even if I am only partially correct there, the photons in the experiment are only detected and are never being used as an instrument for computation.

        I guess it depends what you mean by 'computation'. As the article itself says,

        So in total we will need [about] (5/64)*N^N photons.

        Unsurprisingly, they need an exponential number of photons (and even N^N, whereas we know the problem is solvable u…

    • by 2short ( 466733 )
      Exactly. Algorithmically, this is nothing. They're just saying, "What if we had hardware that could perform an unlimited number of calculations simultaneously, and thus render time-complexity of certain algorithms irrelevant?" To which I say: "That would be spiffy, but you don't have such hardware, and Heisenberg says you never will."
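
To make the thread's point concrete, this is the brute-force search the photons are standing in for (a minimal sketch; the city coordinates are made up). The loop body is trivial, but there are (N-1)! orderings to try, and handing each one to its own core, or photon, divides the constant factor, not the factorial:

    import itertools, math

    cities = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 6)]  # toy instance

    def tour_length(order):
        """Total length of a closed tour visiting the cities in this order."""
        return sum(math.dist(cities[a], cities[b])
                   for a, b in zip(order, order[1:] + order[:1]))

    # Fix city 0 as the start and try all (N-1)! orderings of the rest.
    best = min(((0,) + rest for rest in
                itertools.permutations(range(1, len(cities)))),
               key=tour_length)
    print(best, tour_length(best))

With P processors (or photons) the wall-clock time is (N-1)!/P times the per-tour cost; the exponential search space is untouched, which is bateleur's point above.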
  • Some Reference info (Score:5, Informative)

    by Fox_1 ( 128616 ) on Thursday August 09, 2007 @12:23PM (#20172005)
    Heh, to give you a better idea of what the abstract is talking about:
    The Travelling Salesman Problem [wikipedia.org]
    and this doozy of a word: gedankenexperiment [wikipedia.org]
    • by ZOMFF ( 1011277 ) * on Thursday August 09, 2007 @12:39PM (#20172223) Homepage
      Badonkadonkexperiment [wikipedia.org] ???
    • Gedankenexperiment: When you want to use German, and can't come up with a reason, Gedankenexperiment is for YOU! (tm)
      • Yes, use a German translation of a Danish expression in English when you already have a similar and perfectly useful expression in English. I think it would be hard to find a more useless foreign word.
  • by LiquidCoooled ( 634315 ) on Thursday August 09, 2007 @12:25PM (#20172035) Homepage Journal
    So, to find out the shortest path for a travelling salesman, you have to have a travelling fibre fitter installing cables between all the cities?

    What is the optimum path the fibre fitter must take to lay all the cables and reduce his mileage?
    • Re: (Score:3, Informative)

      by evanbd ( 210358 )
      So, on a serious note, the fiber installer (who has to traverse each edge) is different from the salesman (who has to visit each node). The problem of the installer is the Chinese Postman Problem [wikipedia.org], and is actually *not* NP-complete. In fact, the current best approximate solutions to the TSP involve transforming it to the CPP, solving that, and then transforming it back. (Sorry, I'm not clear on the details.)
  • by jfengel ( 409917 ) on Thursday August 09, 2007 @12:26PM (#20172049) Homepage Journal
    This solves a nondeterministic-polynomial problem by using a very large number of parallel computations to simulate nondeterminism.

    This was proposed some years ago for DNA computers as well, until somebody figured out that it would take a mass of DNA the size of the earth to figure out a non-trivial problem. So this is NOT the first time somebody has proposed a method for reducing NP problems to polynomial time, though this mechanism is novel as far as I know.

    Photons are a lot smaller than DNA. N^N photons seems much more feasible. But even so... once N=100, 100^100 photons is way more than we can handle.
    • by PhysicsPhil ( 880677 ) on Thursday August 09, 2007 @12:42PM (#20172269)

      This solves a nondeterministic-polynomial problem by using a very large number of parallel computations to simulate nondeterminism.

      This was proposed some years ago for DNA computers as well, until somebody figured out that it would take a mass of DNA the size of the earth to figure out a non-trivial problem. So this is NOT the first time somebody has proposed a method for reducing NP problems to polynomial time, though this mechanism is novel as far as I know.

      I'm not sure this comparison is correct. The use of DNA just increased the computational power available to the problem, but didn't change the fundamental methods of calculation (i.e. one step at a time). The DNA computer didn't make the NP problem go away, it just threw more power at it.

      This uses the interference of the light within an optical network to perform the calculation. The "operation", such as it is, relates to physically constructing the network rather than the number of photons required. In a very tenuous way, this is similar to a quantum computer, where multiple calculations are performed simultaneously. Of course it's not a quantum computer, but it does appear to be a polynomial algorithm.

      • but it does appear to be a polynomial algorithm
        You could say that, but since it requires an exponential number of photons (e.g., N^N), it's not clear that this is really an improvement.

        It'd be a little like having a polynomial-time algorithm that required exponential space. Interesting oddity, but unfortunately not useful in itself.

    • by mdmkolbe ( 944892 ) on Thursday August 09, 2007 @12:47PM (#20172337)
      From the last line of the abstract, "The proposed method is meant purely as a gedankenexperiment."

      Translation, "We know this wont ever work; we just think it's cool."

      Even better is in section five where they cite Wikipedia for the definition of a quantum computer.
    • by SeanMon ( 929653 ) on Thursday August 09, 2007 @01:17PM (#20172747) Homepage Journal
      A CO2 laser at 500 kW generating a beam of 10 micrometer wavelength produces about 2.52 x 10^25 photons per second. It will only take 1.25 x 10^167 years to generate 100^100 photons (a quick check of this arithmetic appears after this thread).
      Just a bit more than we can handle.
      • Re: (Score:2, Funny)

        Or to put it a little more excitingly, solving a 26-city problem with 12 µm photons will take somewhere in the region of 25 megatons.

        Which means you would probably have to be pretty desperate for sales.

      • Good thing they were talking about a single photon being split and traveling all possible paths simultaneously, and measuring the interference, then.

        Still, they say that for larger problem sets the SNR makes detection and filtering too difficult. Larger problem sets are precisely where one worries the most about computational complexity, of course. So it's still, at least for now, just a neat trick.
        • Actually, I misread the article. It's not one photon. It isn't necessarily n**n, either. It's proportional to n**n. To be precise, it looks like 5/64 * n**n.

          That's still a lot, and only a constant factor (5/64) smaller than n**n.

          Still, it does seem that it limits it to the "neat trick" category.
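
SeanMon's figures above are easy to check from first principles (constants rounded; the 500 kW and 10 micrometer values are the ones assumed in that comment):

    # Back-of-the-envelope check of the photon budget discussed above.
    h = 6.626e-34    # Planck constant, J*s
    c = 2.998e8      # speed of light, m/s
    lam = 10e-6      # assumed CO2 laser wavelength, m
    power = 500e3    # assumed laser power, W

    photon_energy = h * c / lam        # ~2.0e-20 J per photon
    rate = power / photon_energy       # ~2.5e25 photons per second

    needed = 100 ** 100                # photons for N = 100 cities
    years = needed / rate / (3600 * 24 * 365.25)
    print(f"{rate:.2e} photons/s, {years:.2e} years for 100^100 photons")
    # -> about 2.52e25 photons/s and about 1.3e167 years, as stated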
  • by TheRequiem13 ( 978749 ) <therequiem@gBOYSENmail.com minus berry> on Thursday August 09, 2007 @12:26PM (#20172057)

    ...gedankenexperiment...

    Gesundheit!
  • by a.d.venturer ( 107354 ) on Thursday August 09, 2007 @12:28PM (#20172085) Homepage Journal
    As pointed out here [antipope.org] "Apparently the method is polynomial in time, but exponential in energy ..."

    to which Charles Stross replies "Ah, so that's what the short duration GRBs are!"

    Fnord.
    • by spun ( 1352 ) <[moc.oohay] [ta] [yranoituloverevol]> on Thursday August 09, 2007 @12:50PM (#20172379) Journal
      That joke has too high of a Dennis Miller ratio [snpp.com] even for Slashdot.
      • by Boronx ( 228853 )
        That's only because he abbreviated GRB.
        • by spun ( 1352 )
          Sad to say, although I know what a GRB is I had to look up who the heck Charles Stross is. He sounds like my kind of author, but it makes me wonder: are all good sci-fi authors from the UK these days? Iain Banks, Ken MacLeod, Stephen Baxter, Peter Hamilton, Ian McDonald: the list goes on and on.
          • by Fweeky ( 41046 )

            I had to look up who the heck Charles Stross is. He sounds like my kind of author

            Here's [accelerando.org] one of his books if you'd like to check.

            are all good sci-fi authors from the UK these days?

            No. Peter Watts [rifters.com] is Canadian. You can check [rifters.com] to see if he's decent too (people are going nuts over Blindsight atm).

            Greg Egan [netspace.net.au] is Australian, and there's plenty of information on his existing work, and a few of his short stories, on there if you're not familiar with him.

  • Is that something like watching at the blinkenlights?

    • No, it's watching at die Blinkenlichten, silly!
      • Gedankenexperiment ? Is that something like watching at the blinkenlights?
        No, it's watching at die Blinkenlichten, silly! ...
        Uhm. I wonder how many people outside Slashdot would understand this exchange :o)
  • We do not a priori know that the laws of physics cannot be (ab)used to cause a computation to happen in a way which is strictly better than the way a Turing machine (read 'pretty much any computer you can think of') works. Though this apparatus requires a large number of photons, it is an exciting result towards what could be a real paradigm shift in computing. For similar reasons quantum computing is interesting to us, but it too has its drawbacks. Alternatively one could hope for an (IMHO unlikely) proof of …
    • You are confused about the definition of a Turing machine. A Turing machine says nothing about computational efficiency. Being able to solve NP-complete problems in polynomial time will not give you an oracle. That is, it will still not be able to solve the halting problem, for instance.

      • Re: (Score:3, Informative)

        by asuffield ( 111848 )

        You are confused about the definition of a Turing machine. A Turing machine says nothing about computational efficiency.

        And yet, P and NP are defined in terms of a Turing machine. Herein lies the GP's point: it is taken as a given that the Turing machine is capable of computing any effectively computable function, but it is an open question as to whether we can build a different kind of machine which would be able to solve NP problems in polynomial time. By definition, the non-deterministic Turing machine so…

  • by PhysicsPhil ( 880677 ) on Thursday August 09, 2007 @12:35PM (#20172165)
    I browsed through the article, and here is my understanding of what they are doing.

    The experimenters are constructing the map of the various cities using optical fibres. Each city represents a junction in the optical fibre network, and each fibre has a length proportional to the weight of the edge joining two cities in the abstract problem.

    Once the fibre network is constructed, they shine a white light source into the network. As the light propagates through the system, it splits at each junction (i.e. city). As a consequence, the optical signal is able to sample all possible paths through the network simultaneously. The entire optical network is put on one arm of an interferometer, and the length of the other arm (the reference arm) is adjusted. Starting from a known lower bound on the tour length, the length of the reference arm is increased until the reference signal interferes with the output signal from the optical network. At that point, they have the length of the shortest path, and apparently can do some kind of reconstruction to get the actual path from there (didn't quite follow how that happened; a toy simulation after this thread shows the brute-force equivalent).

    The claimed reduction of an NP problem to quadratic comes from the setup of the experimental apparatus. An "operation" consists of connecting one of the N cities to another of the N cities. For an average collection of cities, there will be a number of roads/connections proportional to N^2. Of course the operation is awfully slow, but it's a thought experiment more than anything.
    • by gr8_phk ( 621180 )
      They claim n^2 time complexity. Then they point out the number of photons needed is n^n. There are physical limits to photon production rates. I would say they're still looking at an n^n problem unless they can produce an infinite number of photons instantly, and that would damage the equipment. It's an interesting method, but it doesn't actually improve the time complexity of the problem as they claim.
      • by SamP2 ( 1097897 ) on Thursday August 09, 2007 @01:47PM (#20173107)
        >I would say they're still looking at an n^n problem unless they can produce an infinite number of photons instantly, and that would damage the equipment

        If you can produce an infinite number of photons instantly then I don't think you'd be worried about any kind of equipment.

        For starters, try producing an infinite number of photons non-instantly (in a finite period of time), OR try to produce a finite number of photons instantly. Equipment will be the least of your problems.
    • It's an analog computer solution to the problem; note that analog computers are not subject to limits based on theorems relating to Turing machines (and related algorithmic computational devices). However, the resources required still scale exponentially; the computation (if you want to call it that) is done by photons, and the number of photons required scales as N^N. Essentially, they are trading time for computational resources, where in this case the computational resource is "photons".
    • by Bender0x7D1 ( 536254 ) on Thursday August 09, 2007 @01:03PM (#20172557)

      One important part of any solution is the amount of time/cycles it takes to encode the problem for use in your algorithm.

      For example, I can't claim that my algorithm can factor a number in O(1) if I require that the input for the algorithm is a vector of the prime factors for the desired number. Yes, my part of the algorithm is O(1), but to take a number and convert it to the format for my algorithm is not O(1), meaning the overall performance can't be considered O(1).

      In summary, the time/cycles to set up the problem counts.

      • by Kupek ( 75469 )
        Well, you can, it's just not useful. If I state a problem is O(something), I get to determine what operation I counted. But if we're talking about sorting, and I choose any operation other than comparing two elements, then it's not a useful analysis.
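
A toy, purely classical simulation of the scheme PhysicsPhil describes (the distance matrix is made up, and enumerating permutations stands in for the light splitting at every junction; nothing here is from the paper itself):

    import itertools

    # Symmetric distances for a 4-city toy instance (illustrative numbers).
    D = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    N = len(D)

    # Light splitting at every junction == every tour gets sampled.
    # Each tour's arrival time is proportional to its optical path length.
    arrivals = {}
    for perm in itertools.permutations(range(1, N)):
        tour = (0,) + perm + (0,)
        arrivals[tour] = sum(D[a][b] for a, b in zip(tour, tour[1:]))

    # Sweeping the reference arm up from a lower bound is equivalent to
    # looking for the earliest arrival.
    best_tour, best_t = min(arrivals.items(), key=lambda kv: kv[1])
    print(best_tour, best_t)

The O(N^2) setup cost the thread mentions is the wiring of the distance matrix; the exponential cost hides in how many split beams (here, loop iterations) the detector has to cope with.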
  • by imasu ( 1008081 ) on Thursday August 09, 2007 @12:36PM (#20172179)
    First off, NP does not mean "non-polynomial", it means "nondeterministic polynomial". Which means: the set of problems that can be solved in polynomial time on a nondeterministic Turing machine. They are not reducing an NP problem to P here, which would require that their algorithm be executable on a deterministic Turing machine in polynomial time. Rather, they are saying that if they effectively simulate a limited nondeterministic Turing machine by increasing the number of compute units (in this case, photons) to effectively infinite numbers, then there is a polynomial solution. Which, since the travelling salesman problem is known to be in NP, is not surprising. Or am I misreading this? What IS cool is that they have found a way to actually effectively simulate a subset of a nondeterministic Turing machine.
    • by The Night Watchman ( 170430 ) <[moc.liamg] [ta] [attorams]> on Thursday August 09, 2007 @01:00PM (#20172521)

      Rather, they are saying that if they effectively simulate a limited nondeterministic Turing machine by increasing the number of compute units (in this case, photons) to effectively infinite numbers, then there is a polynomial solution. Which, since the travelling salesman problem is known to be in NP, is not surprising. Or am I misreading this?
      That sounds right to me. I don't like how they're claiming that "the complexity of the traveling salesman problem can be dramatically reduced from N! to N^2 by optical means." They're not reducing the complexity of the problem at all. What they're doing is designing a parallel processing system that can approximate a nondeterministic Turing machine, thereby allowing the problem to be solved in polynomial time. This does nothing to indicate that P=NP. While they do make that point clear, I still take issue with their claim that they're doing anything at all to the complexity of the original problem.
      • by teknopurge ( 199509 ) on Thursday August 09, 2007 @01:06PM (#20172607) Homepage
        Please mod parent and GP up. My thesis was on NP-complete problems and combinatorial optimization, and as soon as I saw "photons" I knew this was bunk. It does not matter what instrument you use: CPU core, network node, DNA, molecule, qubit, electron spin, etc. They are all constructs to illustrate problems. The entire point of NP-complete problems is that they cannot be solved in reasonable time using anything that has a physical limitation: a clock speed, a limited number-of-sides, a finite number of nodes in a graph, finite degrees of spin, etc.

        IMO, the only way to reduce NP-Complete problems is using something like quantum entanglement or another similar characteristic that is not bounded by classical physics.
        • The Wikipedia article says that physicist David Deutsch [wikipedia.org] says that Shor's Algorithm [wikipedia.org], which solves an NP-whatever problem in (log n)^3 time, works because it draws computational power from another universe.

          Crank or no?
          • Re: (Score:3, Informative)

            by cpeikert ( 9457 )
            Shor's Algorithm, which solves an NP-whatever problem in (log n)^3 time, works because it draws computational power from another universe.

            Whoah, whoah. Shor's Algorithm solves the factoring problem, which is almost certainly NOT NP-complete. (If it were, then NP would equal coNP, which would be almost as surprising as if NP equalled P.)
        • by flonker ( 526111 )
          Some time back, I read about the spaghetti sort [wikipedia.org], where you sort spaghetti by length in constant time. I set my mind to trying to discover a similar "solution" to TSP, mostly as something to do while waiting at the DMV, grocery store, etc.

          I came up with the string solution.

          You cut pieces of string equal to the distance between the cities, and tie each piece of string to two rings representing the cities, and label each ring with a city's name. To solve the problem for any pair of cities, you pick up the ri…
          • Whichever string has the most tension (is at the top, whatever) is the best path to take.

            That doesn't solve the TSP -- it solves the shortest path problem. The TSP requires the salesman to visit every city; your method will only visit a subset of the cities. As a simple thought experiment, consider that the string method cannot give a path that starts and ends in the same city. The shortest path problem is already solvable in O(N^2), as a classic tree/graph searching problem.

        • Re: (Score:3, Interesting)

          by asuffield ( 111848 )

          IMO, the only way to reduce NP-Complete problems is using something like quantum entanglement or another similar characteristic that is not bounded by classical physics.

          Photons are not bounded by classical physics, and their behaviour can only be explained by quantum physics. Whether or not this behaviour can be exploited to perform computation more efficiently than a Turing machine is unknown (and will likely remain that way until we untangle the problem of how quantum physics really works). We still do no…

  • by n01 ( 693310 ) on Thursday August 09, 2007 @12:39PM (#20172219)
    The paper says that the path the photons have to travel for a TSP with N cities is
    N*d + a*(2^N+1)
    Since the speed of light is finite, the algorithm still takes O(2^N) i.e. exponential time to complete.
    • Since the speed of light is finite, the algorithm still takes O(2^N) i.e. exponential time to complete.

      Yes, but at least in theory the paths can be made almost infinitely short. At some point the energy density of the photons will overwhelm spacetime and form a black hole, however :-)

  • by frankie ( 91710 ) on Thursday August 09, 2007 @12:43PM (#20172275) Journal
    To solve a 50-point traveling salesman using their algorithm would require on the order of 50^50 photons (about 10^85). For comparison, the Sun emits roughly 10^45 photons per second. Somehow I don't think their system is going to scale very well.
  • P= NP (Score:3, Funny)

    by naoursla ( 99850 ) on Thursday August 09, 2007 @12:52PM (#20172417) Homepage Journal
    P is equal to NP because processing speed is increasing exponentially. Each year, the amount of processing you can do doubles.

    The researchers are just using an exponential number of photons to aid in the processing.
    • The parent post is woefully incorrect (just read a Wikipedia article on NP-completeness). But it is not a troll.

      Please, mods, use some sense in moderating.
      • by naoursla ( 99850 )
        It was intended more as a thought provoking joke, really.

        And it is true that if you have exponentially increasing computational resources you can solve NP problems in polynomial time.
        • And it is true that if you have exponentially increasing computational resources you can solve NP problems in polynomial time.

          That would be true, but we don't have exponentially increasing computational resources. Moore's Law, for instance, describes geometric increase.

  • The submitter should have gedanken about their server before submitting this story.
  • This reminds me of a clever optical sorting algorithm I ran across a paper on in recent years (see http://www.cs.auckland.ac.nz/CDMTCS//researchreports/244dominik.html [auckland.ac.nz]). Again, a clever thought experiment - not sure how feasible it will be anytime soon to actually use though.
  • by p3d0 ( 42270 ) on Thursday August 09, 2007 @01:07PM (#20172619)

    To our knowledge it is the first time that a method for the reduction of non-polynomial time to quadratic time has been proposed.
    This is far from the first time that someone has claimed to solve an NP-complete problem in P time by limiting the size of the problem. It's not that hard to design a circuit that solves TSP in polynomial time if you get to put a limit on the number of edges.

    Also, "NP" doesn't stand for "non-polynomial". There is no such thing as "non-polynomial time". It's Nondeterministic Polynomial time.

    These guys may know their optics, but they're amateurs in complexity theory. This is most painfully obvious in their concluding sentence:

    Since for practical (non-pathological) problems by purely electronic means very good solutions to even large size problems can be found, our proposed method is not meant to solve real-world traveling salesman problems but rather as a gedankenexperiment to show how photons and the laws of physics can considerably reduce the computational complexity of difficult mathematical problems.
    It does no such thing. All it does is parallelize the computation.
    • by wurp ( 51446 )
      I agree with everything you have to say, with one nitpicking exception: non-polynomial time seems a reasonable term to use. An algorithm that is O(N^N) takes time that is not polynomial in N, hence it is non-polynomial time.

      Non-polynomial wouldn't mean the same thing as NP... You could put together an algorithm that is non-polynomial on a non-deterministic computer, too, which would be non-polynomial and not NP. It would be harder than NP.
      • Re: (Score:2, Informative)

        by p3d0 ( 42270 )

        I agree with everything you have to say, with one nitpicking exception: non-polynomial time seems a reasonable term to use. An algorithm that is O(N^N) takes time that is not polynomial in N, hence it is non-polynomial time.

        I disagree. They're not talking about an algorithm here; they're talking about the Traveling Salesman Problem. They called the TSP "non-polynomial", and that is by no means certain. If you could prove that the TSP has no polynomial-time solution, you'd get the Turing Award.

  • by karlandtanya ( 601084 ) on Thursday August 09, 2007 @01:12PM (#20172685)
    Some time ago.


    Solution involved a Farmer's daughter, which she apparently was.

  • Increasing Orders (Score:2, Interesting)

    by Doc Ruby ( 173196 )
    "N+X" is called "addition": additive increase. "N+N" is called multiplication (2N): geometric increase, as is "N*X". "N*N" is called exponential (NX). What is "NN" called? And is there a higher order of increase?

    And what are all those kinds of operations called?
    • N*N is called quadratic, and it's polynomial. N^N (or, more commonly, c^N where c is a constant) is called exponential when c > 1.
      • Note that realistically, pretty much any problem has an exponential solution, so going beyond this isn't very interesting. There are a few problems related to the halting problem for which we know we simply can't provide a running time, and a couple of other classes of problems that we don't know how to analyze, but they're not what I'd call real-world problems. These facts have, naturally, not stopped mathematicians from spending time talking about how to represent very large numbers; the Ackermann function is …
    • by dargaud ( 518470 )

      Is there a higher order of increase? And what are all those kinds of operations called?

      Yes, there are plenty of functions which grow faster than an exponential. Some of the most well known (and easier to understand) include the Knuth up-arrow [wikipedia.org], the hyper operator [wikipedia.org], the Conway chained arrow [wikipedia.org]...

      What's interesting to note is that some of those functions, like the busy beaver [wikipedia.org] (!), although well defined and somewhat simple, cannot even be computed. We only know that they are BIG!

    • Something faster than e^(e^x) is super- or hyperexponential. The Ackermann function and tetration have already been mentioned. Wikipedia is a good resource for more info on large numbers [wikipedia.org]. A() and tetration can be represented by Knuth arrows, which provide a compact way to describe recursive exponentiation-like operations. They can describe functions that grow at enormous rates:

      2^4 = 16; 2^^4 = 2^(2^(2^2)) = 2^(2^4) = 2^16 = 65536 (ackermann(4, n) is 2^^(n+3) - 3)
      2^^^4 = 2^^(2^^(2^^2)) = 2^^(2^^4) = 2^^65536
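
A minimal recursive definition of Knuth's up-arrow, enough to check the numbers above:

    def up_arrow(a, n, b):
        """Knuth's a ^...^ b with n arrows: n=1 is exponentiation,
        n=2 is tetration, n=3 is pentation, and so on."""
        if n == 1:
            return a ** b
        if b == 0:
            return 1
        return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

    assert up_arrow(2, 1, 4) == 16       # 2^4
    assert up_arrow(2, 2, 4) == 65536    # 2^^4 = 2^(2^(2^2))
    # up_arrow(2, 3, 4) = 2^^^4 = 2^^65536, a power tower of 65536 twos:
    # utterly hopeless to evaluate, which is rather the point.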
  • by Anonymous Coward
    opticsexpress.com
    I guess they were going for "optics express"
    I of course read it as "optic sex press"
    and there's no way you're getting me to click that link at work!
  • In other news, Computer Science researchers discover that O(n^n) problems reduce to O(1) given the availability of n^n computers working in parallel.

    I would note, however, that a more useful result does exist: many O(n log n) problems reduce to O(n) given the availability of log n processors. As log n is generally small this requires only a trivial application of parallelism. Merge sort, one of the staples of database engines, is a good example.
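
A sketch of that level-by-level parallelism (worker count and names are illustrative): merge sort has about log2(n) merge levels, the merges within one level are independent, and each level does O(n) total work.

    import heapq
    from concurrent.futures import ProcessPoolExecutor

    def merge_pair(pair):
        """Merge two sorted runs into one sorted run."""
        left, right = pair
        return list(heapq.merge(left, right))

    def parallel_merge_sort(xs, workers=4):
        runs = [[x] for x in xs]          # n sorted runs of length 1
        with ProcessPoolExecutor(max_workers=workers) as pool:
            while len(runs) > 1:          # about log2(n) levels in total
                pairs = list(zip(runs[0::2], runs[1::2]))
                carry = [runs[-1]] if len(runs) % 2 else []
                # merges within a level are independent: farm them out
                runs = list(pool.map(merge_pair, pairs)) + carry
        return runs[0] if runs else []

    if __name__ == "__main__":
        import random
        data = [random.randint(0, 999) for _ in range(1000)]
        assert parallel_merge_sort(data) == sorted(data)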
  • My favorite analog computing paper comes from Steiglitz et al; see www.cs.princeton.edu/~ken [google.com]. I like the way they use car differentials to solve 3SAT. Pretty cool. It only requires O(n) differential equations, where n is the number of clauses in the 3SAT formula. But I've heard that the result still requires an exponential amount of precision. At least according to some. Maybe an engineer could hack through what baffles the theory heads.
  • by natoochtoniket ( 763630 ) on Thursday August 09, 2007 @01:47PM (#20173113)

    Actually, the running time is not reduced by the algorithm disclosed in the article. The disclosed algorithm has running time at least $O(2^N)$. The algorithm uses photons as parallel processors, but the shortest running time for any of those photons is $O(2^N)$. This is because the algorithm uses a time delay in the apparatus representing city $i$ equal to $\alpha 2^i$, where $\alpha$ is strictly longer than the longest city-to-city delay in the problem. In city $N$, the time delay is $\alpha 2^N$. The algorithm uses these time delays to differentiate between valid solutions and erroneous solutions to the TSP problem. For every valid solution, the photon representing that solution must pass through each city $i$, and must incur the corresponding delay. Hence, every valid solution is found only after time at least $\sum_{i=1}^N \alpha 2^i$, or $O(2^N)$.

    The article approaches a problem that Optics Express readers might not normally consider. And, it may represent a new application of optics technology (that is out of my field). But, the use of physical models to approach $NP$ problems is not new. And, the algorithm is not faster than other known algorithms for the same problem.
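
The power-of-two delay trick in the comment above can be checked in a few lines (alpha and the toy distances are mine, purely illustrative): because each city contributes a distinct delay alpha * 2^i, the total delay of a pulse encodes, in binary, exactly which cities it passed through, so only pulses that visited all N cities land in the window the detector watches.

    # Hypothetical 3-city instance; alpha exceeds any possible travel total.
    alpha = 1000.0
    travel = {frozenset({0, 1}): 2, frozenset({1, 2}): 6,
              frozenset({0, 2}): 9}

    def total_delay(path):
        hops = sum(travel[frozenset(p)] for p in zip(path, path[1:]))
        return hops + sum(alpha * 2 ** i for i in set(path))

    full = total_delay([0, 1, 2])   # visits every city once
    bad = total_delay([0, 1, 0])    # revisits a city instead
    # A pulse that covered all cities has delay in [alpha*(2**3 - 1),
    # alpha*(2**3 - 1) + max_travel); the bad path misses that window.
    print(full, bad)                # 7008.0 vs 3004.0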

  • Someone else already did it, much more simply, in 2002.

    http://www.rsc.org/publishing/journals/LC/article.asp?doi=b200589a [rsc.org]
  • This is where you read a summary like that one, and you want to gedanken yourself in the head with a hammer afterwards for trying to understand it.
  • Um, isn't this a sucky solution, as at each city the reflections are going to jack up the background noise level? Even with a good match, you're unlikely to get less than 10% reflection and crosstalk at each junction. After just a FEW hops the reflection noise is going to mask any desired signal.

    Also I don't see (from the abstract) how they're going to extract the desired shortest answer from all the wrong answers and reflections.
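
Taking the comment's own figure of roughly 10% loss and crosstalk per junction (an assumption, not a measured number), the surviving fraction falls off geometrically with hop count:

    # Hypothetical 10% loss per junction, as the comment posits.
    for k in (5, 10, 20, 50):
        print(f"{k:2d} hops -> {0.9 ** k:.4f} of the signal remains")

After 50 hops less than 1% of the light is left, before counting the coherent crosstalk that lands on top of it.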

  • To our knowledge it is the first time that a method for the reduction of non-polynomial[sic] time to quadratic time has been proposed.

    Let n = number of cities.

    1. Cut strings to length of path between cities: O(n^2)
    2. Tie together ends of links that meet at same city: O(n^2)
    3. Grab start and destination endpoints in each hand, and pull taut: O(1)
    4. Mark route along links that have tension in them: O(n)

    Overall complexity: O(n^2)
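
As a reply higher up notes, the taut strings compute the shortest path between two endpoints, not a tour through every city, and shortest path is already polynomial. Pulling the strings taut is essentially Dijkstra's algorithm; a minimal sketch (the graph and its weights are illustrative):

    import heapq

    def dijkstra(graph, start, goal):
        """Shortest path on a weighted graph: what the taut strings find."""
        dist = {start: 0}
        prev = {}
        heap = [(0, start)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue                  # stale heap entry
            for v, w in graph[u]:
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return dist[goal], path[::-1]

    # Each tuple is (neighbour, string length).
    g = {"A": [("B", 2), ("C", 9)], "B": [("A", 2), ("C", 6)],
         "C": [("A", 9), ("B", 6)]}
    print(dijkstra(g, "A", "C"))          # (8, ['A', 'B', 'C'])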

  • by Michael Woodhams ( 112247 ) on Thursday August 09, 2007 @04:28PM (#20175193) Journal
    In their setup, each city has a delay line (i.e. an optical fibre). Each new city you add has to have a delay line twice as long as the previous one you added. The required amount of fibre grows exponentially with the number of cities.
