The PhysOrg.com site has a report on a new algorithm, developed by computer scientists at Carnegie Mellon University [CMU], that promises a dramatic speed increase in solving a common class of large linear equation systems.
Computer scientists at Carnegie Mellon University have devised an innovative and elegantly concise algorithm that can efficiently solve systems of linear equations that are critical to such important computer applications as image processing, logistics and scheduling problems, and recommendation systems.
(The original CMU press release, which has a few more Web links, is here.)
You probably learned to solve simple linear systems in school. For example, if we consider the following system, which has two equations in two unknowns,
2 x + y = 10
x + 3 y = 15
a little manipulation will give us the answer x=3, y=4. Small systems like this are easy, but real applications can involve thousands or even millions of equations and variables.
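That "little manipulation" can be written out mechanically. Here is a minimal sketch in plain Python (no libraries) that eliminates x from the second equation and back-substitutes; the function name and argument layout are just for illustration:

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve the system  a*x + b*y = e,  c*x + d*y = f
    by eliminating x from the second equation, then back-substituting."""
    # Multiply the first equation by c/a and subtract it from the second,
    # leaving a single equation in y alone.
    factor = c / a
    y = (f - factor * e) / (d - factor * b)
    # Back-substitute y into the first equation to recover x.
    x = (e - b * y) / a
    return x, y

# The system from the text: 2x + y = 10, x + 3y = 15
print(solve_2x2(2, 1, 1, 3, 10, 15))  # (3.0, 4.0)
```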
One of the best general-purpose algorithms for solving linear systems is Gaussian elimination; its running time is O(n³). That is, for a problem with ‘n’ equations, the running time grows in proportion to n³ as n gets large. The new algorithm is specialized to a class of systems called symmetric diagonally dominant [SDD], but within that class it is much faster, with a running time of O(n (log n)²). This is an important result, since SDD systems turn up in many different application areas. For example, the movie recommendation system used by Netflix involves an SDD system, as do many applications in image processing and engineering. There are other algorithms that are faster than Gaussian elimination, but none as fast as this one.
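For concreteness, here is a textbook Gaussian elimination with partial pivoting in pure Python; this is the classical O(n³) method the new algorithm is being compared against, not the CMU algorithm itself. The three nested loops are where the n³ comes from: n pivot steps, each doing an O(n²) elimination sweep.

```python
def gaussian_elimination(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    A is a list of n rows of n floats; b is a list of n floats."""
    n = len(A)
    # Work on a copy of A augmented with b, so the inputs are not modified.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: bring up the row with the largest entry in column k.
        pivot = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[pivot] = M[pivot], M[k]
        # Eliminate column k from every row below the pivot row.
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

print(gaussian_elimination([[2.0, 1.0], [1.0, 3.0]], [10.0, 15.0]))  # [3.0, 4.0]
```

At n = 2 the cost is trivial, but doubling n multiplies the work by roughly eight, which is why O(n³) becomes prohibitive at the millions-of-equations scale mentioned above.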
The team’s approach to solving SDD systems is to first solve a simplified system that can be handled rapidly and then serves as a “preconditioner” guiding iterative steps toward the ultimate solution. To construct the preconditioner, the team uses new ideas from spectral graph theory, such as spanning trees and random sampling.
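The press release gives no code, but the general shape of preconditioned iterative solving can be sketched. Below is a preconditioned conjugate gradient loop in pure Python. It uses a simple diagonal (Jacobi) preconditioner as a stand-in for the team’s far more sophisticated spanning-tree-based construction, but the role it plays is the same: a cheap approximation of A that steers each iteration closer to the true solution.

```python
def pcg(A, b, tol=1e-10, max_iter=1000):
    """Preconditioned conjugate gradient for A x = b, where A is symmetric
    positive definite (SDD matrices with positive diagonals qualify).
    The preconditioner here is just the diagonal of A (Jacobi) -- a crude
    stand-in for the spanning-tree preconditioners in the CMU work."""
    n = len(A)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    precond = lambda v: [v[i] / A[i][i] for i in range(n)]  # solve M z = v, M = diag(A)
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    x = [0.0] * n
    r = b[:]                 # residual b - A x (x starts at zero)
    z = precond(r)           # preconditioned residual
    p = z[:]                 # search direction
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = precond(r)
        rz_new = dot(r, z)
        beta = rz_new / rz
        rz = rz_new
        p = [zi + beta * pi for zi, pi in zip(z, p)]
    return x

# A small SDD system: each diagonal entry dominates the rest of its row.
A = [[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]]
b = [3.0, 2.0, 3.0]
print(pcg(A, b))  # close to [1.0, 1.0, 1.0]
```

The better the preconditioner approximates A, the fewer iterations the loop needs; the CMU result is essentially a way to build a preconditioner that is both very cheap to apply and very close to A, which is what drives the running time down to nearly linear.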