Friday, March 08, 2024

New Breakthrough Brings Matrix Multiplication Closer to Ideal

Good news, and exciting news: matrix multiplication is at the very heart of machine learning & AI!

Even very incremental progress is better than no progress!

Will ML & AI find better methods in the future? You bet!

"... Now, three researchers — Ran Duan and Renfei Zhou of Tsinghua University and Hongxun Wu of the University of California, Berkeley — have taken a major step forward in attacking this perennial problem. Their new results, presented last November at the Foundations of Computer Science conference, stem from an unexpected new technique, Le Gall said. Although the improvement itself was relatively small, Le Gall called it “conceptually larger than other previous ones.” ...
The technique reveals a previously unknown and hence untapped source of potential improvements, and it has already borne fruit: A second paper, published in January, builds upon the first to show how matrix multiplication can be boosted even further. ...
In 1986, Strassen had another big breakthrough when he introduced what’s called the laser method for matrix multiplication. Strassen used it to establish an upper value for omega of 2.48. While the method is only one step in large matrix multiplications, it’s one of the most important because researchers have continued to improve upon it. ..."
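For a concrete sense of where sub-cubic matrix multiplication began, here is a minimal didactic sketch of Strassen's original 1969 recursion in Python (not the laser method mentioned above, which is far more involved). It multiplies 2×2 block matrices with 7 block products instead of the naive 8, and recursing gives O(n^log2(7)) ≈ O(n^2.807). The sketch assumes n is a power of two and is written for clarity, not speed:

```python
import numpy as np

def strassen(A, B):
    """Multiply square matrices (size a power of 2) using Strassen's
    7-multiplication scheme. A didactic sketch, not an optimized one."""
    n = A.shape[0]
    if n <= 64:  # fall back to the naive product on small blocks
        return A @ B
    h = n // 2
    A11, A12 = A[:h, :h], A[:h, h:]
    A21, A22 = A[h:, :h], A[h:, h:]
    B11, B12 = B[:h, :h], B[:h, h:]
    B21, B22 = B[h:, :h], B[h:, h:]

    # Seven recursive products instead of the naive eight.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)

    # Reassemble the four quadrants of the result.
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```

A quick sanity check: with A = np.random.rand(256, 256) and B likewise, np.allclose(strassen(A, B), A @ B) holds up to floating-point error.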

From the first paper's abstract (Duan, Wu & Zhou, FOCS 2023):
"Fast matrix multiplication is one of the most fundamental problems in algorithm research. The exponent of the optimal time complexity of matrix multiplication is usually denoted by ω. This paper discusses new ideas for improving the laser method for fast matrix multiplication. We observe that the analysis of higher powers of the Coppersmith-Winograd tensor [Coppersmith & Winograd 1990] incurs a "combination loss", and we partially compensate for it using an asymmetric version of CW's hashing method. By analyzing the eighth power of the CW tensor, we give a new bound of ω<2.371866, which improves the previous best bound of ω<2.372860 [Alman & Vassilevska Williams 2020]. Our result breaks the lower bound of 2.3725 in [Ambainis, Filmus & Le Gall 2015] because of the new method for analyzing component (constituent) tensors."

From the second paper's abstract (New Bounds for Matrix Multiplication: from Alpha to Omega):
"The main contribution of this paper is a new improved variant of the laser method for designing matrix multiplication algorithms. Building upon the recent techniques of [Duan, Wu, Zhou, FOCS 2023], the new method introduces several new ingredients that not only yield an improved bound on the matrix multiplication exponent ω, but also improve the known bounds on rectangular matrix multiplication by [Le Gall and Urrutia, SODA 2018]. In particular, the new bound on ω is ω ≤ 2.371552 (improved from ω ≤ 2.371866).
For the dual matrix multiplication exponent α defined as the largest α for which ω(1, α, 1) = 2, we obtain the improvement α ≥ 0.321334 (improved from α ≥ 0.31389). Similar improvements are obtained for various other exponents for multiplying rectangular matrices."
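For readers unfamiliar with the notation, which the excerpt does not define: in this literature, ω(a, b, c) denotes the exponent for multiplying an n^a × n^b matrix by an n^b × n^c matrix, so ω(1, α, 1) covers products of an n × n^α matrix with an n^α × n matrix. The bound α ≥ 0.321334 therefore says that such a product can be computed in n^(2+o(1)) time, essentially no longer than it takes to write down the n × n output.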

New Breakthrough Brings Matrix Multiplication Closer to Ideal | Quanta Magazine (a nice overview article): "By eliminating a hidden inefficiency, computer scientists have come up with a new way to multiply large matrices that's faster than ever."


New Bounds for Matrix Multiplication: from Alpha to Omega (the second paper quoted above)

Progress since 1970. Notice how progress has slowed since the 1990s.





