A New Algorithm Makes It Faster to Find the Shortest Paths
Briefly

"If you want to solve a tricky problem, it often helps to get organized. You might, for example, break the problem into pieces and tackle the easiest pieces first. But this kind of sorting has a cost. You may end up spending too much time putting the pieces in order. This dilemma is especially relevant to one of the most iconic problems in computer science: finding the shortest path from a specific starting point in a network to every other point."
"So if you want to design the fastest possible algorithm for the shortest-paths problem, it seems reasonable to start by finding the closest point, then the next-closest, and so on. But to do that, you need to repeatedly figure out which point is closest. You'll sort the points by distance as you go. There's a fundamental speed limit for any algorithm that follows this approach: You can't go any faster than the time it takes to sort."
Shortest-path algorithms operate on graphs: nodes connected by weighted edges, where the weights represent lengths or costs. Given a starting node, the goal is to compute the minimum distance to every other node. The intuitively fastest methods, exemplified by Dijkstra's algorithm, repeatedly settle the nearest remaining node and therefore effectively sort the nodes by distance, which ties their running time to the lower bound for sorting. Researchers ran into this sorting barrier decades ago, and it stalled further improvements for algorithms built on that approach. The recently developed algorithm avoids repeated sorting altogether, breaking the barrier and running faster than any method that must sort nodes by distance.
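For context, here is a minimal Python sketch of the classic sorting-bound approach described above (Dijkstra's algorithm with a priority queue), not the new algorithm from the article; the graph representation and example data are illustrative assumptions.

```python
import heapq

def dijkstra(graph, source):
    """Classic nearest-node-first shortest paths.

    graph: dict mapping node -> list of (neighbor, weight) pairs,
           with non-negative edge weights.
    Returns a dict of minimum distances from source.
    """
    dist = {source: 0}
    # The priority queue is what effectively "sorts" nodes by distance:
    # the next node settled is always the closest unsettled one.
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry; a shorter path was already found
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Illustrative example graph (hypothetical data)
graph = {
    "a": [("b", 2), ("c", 5)],
    "b": [("c", 1), ("d", 4)],
    "c": [("d", 1)],
    "d": [],
}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

Because each settled node is extracted in order of distance, any algorithm of this shape inherits sorting's time limit; the new result sidesteps that by not producing the distances in fully sorted order.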
Read at WIRED