Math

Optical Solution For an NP-Complete Problem?

6 writes to let us know that two optical researchers have proposed, as a thought experiment, a novel idea for solving the traveling salesman problem. From the abstract: "We introduce an optical method based on white light interferometry in order to solve the well-known NP-complete traveling salesman problem. To our knowledge it is the first time that a method for the reduction of non-polynomial time to quadratic time has been proposed. We will show that this achievement is limited by the number of available photons for solving the problem. It will turn out that this number of photons is proportional to N^N for a traveling salesman problem with N cities and that for large numbers of cities the method in practice therefore is limited by the signal-to-noise ratio. The proposed method is meant purely as a gedankenexperiment."


  • Parallel computing (Score:5, Insightful)

    by iamacat ( 583406 ) on Thursday August 09, 2007 @01:22PM (#20171991)
    So effectively each photon is a CPU core, and the running time is reduced by massively parallel computing rather than by an inherent reduction in complexity, which remains (N^N)*(N^2).
  • by Anonymous Coward on Thursday August 09, 2007 @01:32PM (#20172125)
    This is /., not CNN; I think it's reasonable to expect readers here to know what the TSP is.
  • by TheEmptySet ( 1060334 ) on Thursday August 09, 2007 @01:32PM (#20172137)
    We do not a priori know that the laws of physics cannot be (ab)used to make a computation happen in a way that is strictly better than how a Turing machine (read: 'pretty much any computer you can think of') works. Though this apparatus requires a large number of photons, it is an exciting result toward what could be a real paradigm shift in computing. Quantum computing is interesting to us for similar reasons, but it too has its drawbacks. Alternatively, one could hope for an (IMHO unlikely) proof of P=NP, which would say that a Turing machine can after all achieve similar efficiency.
  • by bateleur ( 814657 ) on Thursday August 09, 2007 @01:37PM (#20172195)
    The ironic thing being that this is precisely why we care about whether NP=P or not. Because without a polynomial time algorithm, large problems remain intractable even after you massively parallelize them!
  • by n01 ( 693310 ) on Thursday August 09, 2007 @01:39PM (#20172219)
    The paper says that the path the photons have to travel for a TSP with N cities is
    N*d + a*(2^(N+1))
    Since the speed of light is finite, the algorithm still takes O(2^N) i.e. exponential time to complete.
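
    The point above can be sketched numerically; the values of d and a below are made up purely for illustration, not taken from the paper:

    ```python
    # Sketch: total optical path for the proposed TSP setup, assuming
    # N*d of inter-city path plus a binary delay line of length a*2^i
    # at each city i (per the comment above). d and a are illustrative.
    def total_path_length(n_cities, d=1.0, a=0.001):
        inter_city = n_cities * d
        # Sum of delay lines a*2^i for i = 1..N is a*(2^(N+1) - 2).
        delays = a * (2 ** (n_cities + 1) - 2)
        return inter_city + delays

    for n in (10, 20, 30):
        print(n, total_path_length(n))
    ```

    The delay-line term dominates almost immediately, which is why the traversal time, and hence the algorithm's running time, stays exponential regardless of how many photons run in parallel.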
  • by frankie ( 91710 ) on Thursday August 09, 2007 @01:43PM (#20172275) Journal
    To solve a 50-point traveling salesman using their algorithm would require on the order of 50^50 photons (about 10^85). For comparison, the Sun emits roughly 10^45 photons per second. Somehow I don't think their system is going to scale very well.
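
    A quick sanity check of those numbers (the solar-output figure is the one quoted in the comment above, not independently verified):

    ```python
    # 50-city instance: ~50^50 photons required, vs. the Sun's
    # quoted output of ~10^45 photons per second.
    photons_needed = 50 ** 50
    sun_photons_per_second = 10 ** 45
    seconds = photons_needed // sun_photons_per_second
    years = seconds / (3600 * 24 * 365)
    print(f"50^50 has {len(str(photons_needed))} digits (~10^84)")
    print(f"~{years:.1e} years of the Sun's entire photon output")
    ```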
  • by morgan_greywolf ( 835522 ) on Thursday August 09, 2007 @01:43PM (#20172277) Homepage Journal
    Well, even if you knew what the TSP is, it might be, depending on your chosen profession, that you don't recall the specifics of the TSP: the formulae used, what sorts of algorithms can be used to solve it, and so on. Not all of us have jobs where we use our math/computational-science/computer-science skills on a day-to-day basis.

  • by mdmkolbe ( 944892 ) on Thursday August 09, 2007 @01:47PM (#20172337)
    From the last line of the abstract, "The proposed method is meant purely as a gedankenexperiment."

    Translation: "We know this won't ever work; we just think it's cool."

    Even better is in section five where they cite Wikipedia for the definition of a quantum computer.
  • by gr8_phk ( 621180 ) on Thursday August 09, 2007 @01:50PM (#20172385)
    They claim N^2 time complexity. Then they point out that the number of photons needed is N^N. There are physical limits to photon production rates, so I would say they're still looking at an N^N problem, unless they can produce an essentially unlimited number of photons instantly, and that would damage the equipment. It's an interesting method, but it doesn't actually improve the time complexity of the problem as they claim.
  • by Geoffrey.landis ( 926948 ) on Thursday August 09, 2007 @01:51PM (#20172393) Homepage
    It's an analog computer solution to the problem; note that analog computers are not subject to limits based on theorems relating to Turing machines (and related algorithmic computational devices). However, the resources required still scale exponentially; the computation (if you want to call it that) is done by photons, and the number of photons required scales as N^N. Essentially, they are trading time for computational resources, where in this case the computational resource is "photons".
  • by Cyclon ( 900781 ) on Thursday August 09, 2007 @02:00PM (#20172511)
    The proposed method is meant purely as a gedankenexperiment.

    Guess you missed this part.
  • by Anonymous Coward on Thursday August 09, 2007 @02:32PM (#20172939)
    opticsexpress.com
    I guess they were going for "optics express"
    I of course read it as "optic sex press"
    and there's no way you're getting me to click that link at work!
  • by natoochtoniket ( 763630 ) on Thursday August 09, 2007 @02:47PM (#20173113)

    Actually, the running time is not reduced by the algorithm disclosed in the article. The disclosed algorithm has running time at least $O(2^N)$. The algorithm uses photons as parallel processors, but the shortest running time for any of those photons is $O(2^N)$. This is because the algorithm uses a time delay in the apparatus representing city $i$ equal to $\alpha 2^i$, where $\alpha$ is strictly longer than the longest city-to-city delay in the problem. At city $N$, the time delay is $\alpha 2^N$. The algorithm uses these time delays to differentiate between valid solutions and erroneous solutions to the TSP. For every valid solution, the photon representing that solution must pass through each city $i$ and incur the corresponding delay. Hence, every valid solution is found only after time at least $\sum_{i=1}^N \alpha 2^i = \alpha (2^{N+1} - 2)$, which is $O(2^N)$.

    The article approaches a problem that Optics Express readers might not normally consider. And, it may represent a new application of optics technology (that is out of my field). But, the use of physical models to approach $NP$ problems is not new. And, the algorithm is not faster than other known algorithms for the same problem.
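
    The binary delay encoding described above can be illustrated with a toy model (the city count and delay unit here are illustrative, not taken from the paper):

    ```python
    # Each city i contributes a delay a*2^i, with a longer than any
    # inter-city travel time. A tour visiting every city exactly once
    # accumulates the unique total a*(2^(N+1) - 2), so valid tours can
    # be picked out of the interference pattern by arrival time alone:
    # no other multiset of N delays from {a*2^1, ..., a*2^N} sums to it.
    from itertools import permutations

    N = 4
    a = 100.0  # delay unit, chosen larger than any city-to-city time

    def tour_delay(tour):
        return sum(a * 2 ** i for i in tour)

    valid_total = a * (2 ** (N + 1) - 2)
    cities = range(1, N + 1)

    # Every permutation of all N cities hits the same total delay...
    assert all(tour_delay(p) == valid_total for p in permutations(cities))
    # ...while a tour that repeats a city and skips another does not.
    assert tour_delay((1, 1, 3, 4)) != valid_total
    ```

    The uniqueness comes from the binary representation: matching the target total forces each power of two, and hence each city, to appear exactly once.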

  • by kripkenstein ( 913150 ) on Thursday August 09, 2007 @03:14PM (#20173481) Homepage

    it seems they are using interference to get individual photons of light to traverse every pathway simultaneously. Even if I am only partially correct there, the photons in the experiment are only detected and are never used as an instrument for computation.
    I guess it depends what you mean by 'computation'. As the article itself says,

    So in total we will need [about] (5/64)*N^N photons.
    Unsurprisingly, they need an exponential number of photons (N^N, in fact, whereas the problem is known to be solvable in roughly 2^N time). In effect, each photon is used to perform one 'calculation', in some sense. Or at least that is what I understand from TFA; IANA Physicist.
  • by Michael Woodhams ( 112247 ) on Thursday August 09, 2007 @05:28PM (#20175193) Journal
    In their setup, each city has a delay line (i.e. optical fibre.) Each new city you add has to have a delay line twice as long as the previous one you added. The required amount of fibre grows exponentially with the number of cities.
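
    The fibre requirement is easy to total up; the 1-metre starting length below is an assumption for illustration:

    ```python
    # Each new city's delay line is twice the previous one, so total
    # fibre for N cities is a geometric series. Starting length of 1 m
    # is illustrative, not from the paper.
    def fibre_needed(n_cities, first_metres=1.0):
        return sum(first_metres * 2 ** k for k in range(n_cities))

    print(fibre_needed(10))  # 1023 m, about a kilometre
    print(fibre_needed(40))  # ~1.1e12 m, several astronomical units
    ```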

What is research but a blind date with knowledge? -- Will Harvey
