The paper by Volker Strassen presents an algorithm for computing the product of two square matrices \(A\) and \(B\) of order \(n\) with fewer than \(4.7 \cdot n^{\log_2 7}\) arithmetic operations, significantly fewer than the roughly \(2n^3\) operations of the standard algorithm. The same construction induces efficient methods for matrix inversion, solving systems of linear equations, and computing determinants. The author contrasts this result with Klyuyev and Kokovkin-Scherbak's finding that Gaussian elimination is optimal when one is restricted to operations on whole rows and columns, and with Winograd's modification, which reduces the number of multiplications at the cost of additional additions and subtractions.
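For a rough sense of scale (this comparison is not part of the paper itself), the following sketch evaluates the two operation counts quoted above for a few matrix orders, using only the stated constants:

```python
import math

def classical_ops(n):
    # Standard algorithm: roughly 2 * n**3 arithmetic operations
    return 2 * n ** 3

def strassen_bound(n):
    # Strassen's bound: fewer than 4.7 * n**(log2 7) arithmetic operations
    return 4.7 * n ** math.log2(7)

for n in (16, 64, 256, 1024, 4096):
    print(f"n = {n:5d}: classical ~ {classical_ops(n):.3e}, "
          f"Strassen bound ~ {strassen_bound(n):.3e}")
```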
Strassen defines a recursive algorithm \(\alpha_{m, k}\) for multiplying matrices of order \(m2^k\): at each level the factors are split into \(2 \times 2\) block matrices, and the product is formed from only seven block products (instead of eight) together with eighteen block additions and subtractions. By induction, he shows that \(\alpha_{m, k}\) computes the product with \(m^3 7^k\) multiplications and \((5 + m)m^2 7^k - 6(m 2^k)^2\) additions and subtractions. Embedding matrices of arbitrary order \(n\) into matrices of order \(m2^k\) for suitable \(m\) and \(k\) then yields the product of two \(n \times n\) matrices with fewer than \(4.7 \cdot n^{\log_2 7}\) arithmetic operations.
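A minimal sketch of this recursion, for matrices whose order is a power of two; the function name, the use of NumPy, and the `leaf` cutoff (which plays the role of the order-\(m\) base case) are illustrative choices, not the paper's notation:

```python
import numpy as np

def strassen(A, B, leaf=64):
    """Multiply square matrices of power-of-two order via Strassen's
    seven-product recursion (a sketch; `leaf` is a hypothetical cutoff
    below which the classical product is used)."""
    n = A.shape[0]
    if n <= leaf:
        return A @ B                      # classical product on small blocks
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # The seven block products
    M1 = strassen(A11 + A22, B11 + B22, leaf)
    M2 = strassen(A21 + A22, B11, leaf)
    M3 = strassen(A11, B12 - B22, leaf)
    M4 = strassen(A22, B21 - B11, leaf)
    M5 = strassen(A11 + A12, B22, leaf)
    M6 = strassen(A21 - A11, B11 + B12, leaf)
    M7 = strassen(A12 - A22, B21 + B22, leaf)
    # Recombine; together with the operand sums above this costs
    # eighteen block additions/subtractions per level
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

A = np.random.rand(256, 256)
B = np.random.rand(256, 256)
assert np.allclose(strassen(A, B), A @ B)
```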
For matrix inversion, Strassen defines a recursive algorithm \(\beta_{m, k}\) that inverts a matrix of order \(m2^k\) by block elimination on its \(2 \times 2\) block decomposition, using \(\alpha\) for the block products and ordinary Gaussian elimination at the base of the recursion. The same recursion yields the determinant and the solution of systems of linear equations, and the paper proves that all of these problems can be solved with on the order of \(n^{\log_2 7}\) arithmetic operations, provided the submatrices that must be inverted along the way are themselves invertible.
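A minimal sketch of this block-elimination recursion, reusing the `strassen` sketch above; the intermediate names, the `leaf` cutoff, and the fallback to `np.linalg.inv` at the base are illustrative assumptions, and invertibility of the leading block and of the Schur complement is assumed at every level:

```python
def invert(A, leaf=64):
    """Invert a matrix of power-of-two order by block elimination on its
    2x2 block decomposition (a sketch of the idea behind beta_{m,k})."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n <= leaf:
        return np.linalg.inv(A)           # direct inversion on small blocks
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    I   = invert(A11, leaf)               # recursive inversion of the leading block
    II  = strassen(A21, I, leaf)          # all multiplications use the fast product
    III = strassen(I, A12, leaf)
    IV  = strassen(A21, III, leaf)
    V   = IV - A22                        # minus the Schur complement
    VI  = invert(V, leaf)
    C12 = strassen(III, VI, leaf)
    C21 = strassen(VI, II, leaf)
    VII = strassen(III, C21, leaf)
    C = np.empty_like(A)
    C[:h, :h] = I - VII
    C[:h, h:] = C12
    C[h:, :h] = C21
    C[h:, h:] = -VI
    return C

X = np.random.rand(128, 128) + 128 * np.eye(128)   # well-conditioned test matrix
assert np.allclose(invert(X) @ X, np.eye(128))
```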