New Method Is the Fastest Way To Find the Best Routes (quantamagazine.org)

Computer scientists at Tsinghua University and Stanford have developed an algorithm that surpasses a fundamental speed limit that has constrained network pathfinding calculations since 1984. The team's approach to the shortest-path problem -- finding optimal routes from one point to all others in a network -- runs faster than Dijkstra's 1956 algorithm and its improvements by avoiding the sorting process that created the decades-old computational barrier.

Led by Ran Duan at Tsinghua, the researchers combined clustering techniques with selective application of the Bellman-Ford algorithm to identify influential nodes without sorting all paths by distance. The algorithm divides graphs into layers and uses Bellman-Ford to locate key intersection points before calculating paths to other nodes. The technique works on both directed and undirected graphs with arbitrary non-negative weights, solving a problem that stymied researchers after partial breakthroughs in the late 1990s and early 2000s applied only to specific weight conditions.
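For context, the classical baseline being improved on can be sketched in a few lines. This is a simplified illustration of Dijkstra's algorithm with a binary heap, not the paper's method; the graph representation and names are my own:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths (non-negative weights) with a binary heap.
    graph: dict mapping node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]                      # the heap is the implicit "sorting" step
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale entry; a shorter path was found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

The heap pops nodes in sorted order of tentative distance; that implicit sorting is exactly the barrier the new result routes around.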
Comments:
  • This is my jam! (Score:4, Interesting)

    by Anonymous Coward on Friday August 08, 2025 @12:11PM (#65575392)

    I love compsci and am thrilled to see that there is still progress being made on problems like this.
    There are other problems whose solutions are conjectured (but not proven) to be at their efficiency limit. One such problem is subset-sum, and of course there's 3-SAT (NP-complete).

    • we can be friends!
      This was one of my favourite CompSci side projects. I took an image-bitmap read/write project from a graphics class and combined it with a shortest-route algorithm from another class. Together they let me scan a city map, treat each pixel as a unit, and calculate the shortest route. It was so much fun.
    • Re:This is my jam! (Score:4, Interesting)

      by dvice ( 6309704 ) on Friday August 08, 2025 @05:06PM (#65576078)

      If that is your thing, I hope you have seen what AlphaEvolve has done "in 20% of cases, AlphaEvolve improved the previously best known solutions":
      https://deepmind.google/discov... [deepmind.google]

      I think we will be seeing a lot more of new discoveries made by AI in the near future.

  • Traveling salesmen will benefit from this research!
    • Maybe one day yet that salesman may be able to come home to his wife and kids.

    • Given that AFAIK there is no formal analytical solution for the traveling-salesman problem, only heuristics that work darned well, how much "better" does this approach do compared to the additional effort required? There's a benefit/cost issue.

      • by Anonymous Coward
        There's the brute-force solution.
      • by Sique ( 173459 )
        Actually, there is a formal analytical solution. It's just not polynomial. You can list all possible ways, which is O(n!), and then take the shortest one.
        • by Sique ( 173459 )
          (It's similar to other famous problems like Squaring the Circle. Yes, there is a solution, but it's not algebraic, as pi is non-algebraic.)
          • I don't think squaring the circle is a good example here, since the problem isn't so much a find-the-number problem as a find-the-answer-while-restricting-yourself-to-a-few-specific-operations problem. So the non-algebraic solution doesn't count, because it doesn't follow the rules. People knew what the answer was, as a number you can calculate, for ages; they just weren't sure you could reach it while following the arcane rules of ancient compass-and-straightedge constructions. It turns out that the answer is no: since pi is transcendental, the construction is impossible.

            • by Sique ( 173459 )
              It's the same with the Traveling Salesman. Right now, no polynomial solution is known, so it is not known to be in P. On the other hand, a non-deterministic automaton can find the shortest tour in polynomial time, hence it is in NP. If we could find a polynomial-time solution for a deterministic automaton, we would have a proof that P=NP.
      • by Anonymous Coward

        Not sure what you mean by darned well and empirical, but I'll give my understanding:
        There are TSP solvers that don't merely work darned well, they give an exact solution, using optimizations like early culling of permutations that are known not to be in the solution. It's still O(n!), but in practice, with optimizations, this is better than an exponential-time algorithm that needs 2^n space for large graphs. (Consider 2000 cities: an algorithm that requires 2^2000 bits of memory is impossible to execute.)
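That early-culling idea can be sketched as a toy (names are mine; a real solver prunes whole branches of the search tree rather than walking each finished permutation):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exact TSP over a distance matrix; tours start and end at city 0.
    Partial tours that already exceed the best complete tour are abandoned."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        length, ok, prev = 0, True, 0
        for city in perm:
            length += dist[prev][city]
            if length >= best_len:   # early culling: partial tour already too long
                ok = False
                break
            prev = city
        if ok:
            length += dist[prev][0]  # close the tour back to city 0
            if length < best_len:
                best_len, best_tour = length, (0,) + perm
    return best_len, best_tour
```

For a 3-city matrix like [[0,1,4],[1,0,2],[4,2,0]] this finds the optimal tour of length 7; the culling only pays off once a good tour has been found early.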

    • Pick-and-place robots for circuit board assembly use path minimization algorithms.

      The robot moves over a 2D PCB surface, placing components. Cutting a few seconds off the P&P time can mean big savings.

      FPGA and ASIC layout engines also use these algorithms.

    • by allo ( 1728082 )

      No.

      TSP assumes you know the optimal distances between each pair of cities and then tries to find the perfect ordering such that the total distance is minimized. The TSP itself does not involve the steps to compute the optimal route between two cities.
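The split described here can be illustrated with a sketch (Floyd-Warshall chosen arbitrarily as the shortest-path stage; the function name is mine): a shortest-path pass first turns the road network into the pairwise distance matrix, and only then does the TSP ordering problem begin.

```python
def all_pairs_shortest(w):
    """Floyd-Warshall: turn a road-network weight matrix (float('inf') where
    there is no direct road) into the pairwise distance matrix TSP consumes."""
    n = len(w)
    d = [row[:] for row in w]        # don't mutate the input
    for k in range(n):               # allow k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

With no road between cities 0 and 2 but roads 0-1 (weight 1) and 1-2 (weight 2), the resulting matrix correctly reports distance 3 between 0 and 2; that matrix, not the road network, is what a TSP solver takes as input.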

  • By how much will electricity demand fall short of forecasts made using old constraints on algos?

  • Nowhere in the article does it say how much faster this is compared to the previous method. Are we talking seconds faster? Milliseconds? How much?

    • The algorithm is for a network of paths with arbitrary (unitless) weights. It's not just for making your Google Maps journey faster!
    • Re:How much faster? (Score:5, Informative)

      by phantomfive ( 622387 ) on Friday August 08, 2025 @12:43PM (#65575458) Journal
      O(e * log^{2/3}(v)) for the new one, and O(e + v*log(v)) for the old one (where e is the number of edges and v is the number of vertices).
      • The new algorithm is obviously only faster for a subset of shortest path problems. It's not always faster, only when there aren't many edges. My math isn't good enough to figure out for what values of e and v it is faster.
        • Re:How much faster? (Score:5, Informative)

          by phantomfive ( 622387 ) on Friday August 08, 2025 @01:24PM (#65575570) Journal
          I figured it out: when there are (roughly) fewer than log^{1/3}(v) edges for each vertex, the new algorithm is faster; above that, Dijkstra's (the old algorithm) wins. So for a graph with 3 vertices, anything with more than five edges goes faster with Dijkstra's.

          For a graph with 13 vertices, anything with more than 50 edges is faster with Dijkstra's.

          For almost any algorithm, if you limit the input, you can create an algorithm that is faster on that subset of the input but slower on other inputs. (My math might still be wrong, don't publish it.)
        • Okay, I'll bite. I didn't spend a lot of time proofreading this and rethinking it, but I think it works. Anyway, it's sketchy to ask "for which" values of v and e, since the growth rates are asymptotic. But approximately, for any particular implementation, we can just set the growth rates equal.

          Taking the complexities from:

          https://arxiv.org/abs/2504.170... [arxiv.org]

          We get that the two equalize at:

          m\log(n)^{2/3} = m + n\log(n)

          If we switch m for y and n for x, and solve for y, we can write y as a function of x as:

          y = \frac{x\log x}{\log(x)^{2/3} - 1} \approx x\log(x)^{1/3}

          • I accidentally switched to y's; that should have read y = x\log(x)/\log(x)^{2/3} = x\log(x)^{1/3}, i.e. a crossover density of about \log(x)^{1/3} edges per vertex.

            • Cool work. I think you also need to use log base 2, which is the common case in computer science, not log base 10 (or base e).
              • I agree base 2 seems most natural in this context. Of course, if we use different bases it just re-scales the growth by a constant factor via the base-change formula, so I think it's probably just ignored for that reason (see first link). As for my notation, if you look under the section Using Common Logarithms in the OpenStax textbook section (see second link), you'll see a discussion of the meaning of "log" in the context of the common logarithm. Usually in math log is the common log, base 10, but in computer science it's typically read as base 2.

                • Yeah it's fine to leave it as log for writing purposes, but if you graph it in Desmos, you'll want to be more specific.
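Setting aside the constant factors that big-O hides, the crossover claim above can be sanity-checked numerically (log base 2 per the thread; the helper is my own sketch, not from the paper):

```python
import math

def faster(v, k):
    """Compare the two growth rates for a graph with v vertices and
    average degree k (so e = k*v edges), constants ignored."""
    e = k * v
    new = e * math.log2(v) ** (2 / 3)   # new algorithm: e * log^{2/3}(v)
    old = e + v * math.log2(v)          # Dijkstra + heap: e + v*log(v)
    return "new" if new < old else "old"

# Predicted crossover density is about log2(v)**(1/3); for v = 2**20
# that's 20**(1/3), roughly 2.7 edges per vertex.
```

For v = 2^20, an average degree of 2 favors the new bound and a degree of 4 favors the old one, bracketing that ~2.7 crossover.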
    • You don't typically measure this in seconds; you measure it in complexity: O(n), etc. The time it translates to depends on the hardware, and usually also on how large the input is (i.e. how many nodes on the map, in this case), so elapsed physical time wouldn't be a useful metric.

  • I wonder if this, or something similar, is already used in place-and-route tools for chips and PCBs?

    • Optimum is the keyword. P&R settles for less than optimum to achieve results in reasonable times. The revolutions in P&R tend to be about running faster while still achieving good results.
  • by Dan East ( 318230 ) on Friday August 08, 2025 @12:32PM (#65575428) Journal

    It's not clear from the article because it doesn't have hard data, but it does say this:

    And if you chop up the graph in the right way, it runs slightly faster than the best version of Dijkstra’s algorithm. It’s considerably more intricate, relying on many pieces that need to fit together just right. But curiously, none of the pieces use fancy mathematics.

    So it is "slightly faster" than the well-known algorithm, and only in specific conditions when "you chop up the graph in the right way". The real question... is it SLOWER than Dijkstra’s in more cases to the point that on average it is actually slower?

    The news here is this isn't necessarily better (the portion about the implementation being "considerably more intricate" doesn't sound great) or faster, but it is a new method to find routes.

    • Why isn't the takeaway that an old limit has been revealed not to be a limit, just as budget constraints imagined in the 1980s have been shattered without the predicted hyperinflationary-failed-state consequences?

        • To go from computer-science big-O complexity analysis to (American?) politics and economics has gotta be a troll, right? Like so many Slashdot (and other internet) conversations, this one got derailed onto irrelevant topics.
    • by alvinrod ( 889928 ) on Friday August 08, 2025 @02:52PM (#65575812)
      It seems like it might perform worse when the graph's vertices are maximally connected to each other (essentially a complete graph) and the number of edges is about the square of the number of vertices. Then you have O(v^2 * log^{2/3}(v)) vs. O(v^2 + v*log(v)). I don't think that's too common, though.
    • The implementation may be more complex, but that's exactly the kind of thing that ends up encapsulated in a library once the implementation settles down.

      What they've got here sounds likely to be an intermediate step. I'm reminded of Tim Peters's series of posts when he was developing what is now known as "Timsort". That implementation combines multiple sorting algorithms (similar to how this combines multiple shortest-path algorithms), and went through many iterations as it was tuned for various different kinds of input.

  • I look forward to 25 years of small and open-source players just using Bellman-Ford and Dijkstra instead because the RFC for this algorithm is unreadably opaque and the reference implementation is in FORTRAN.
  • Not a precisely perfect minimum cost, but an almost-solution that's cheaper to compute. Congrats, they've found another method.

"Nature is very un-American. Nature never hurries." -- William George Jordan

Working...