Department of Mathematics,
University of California San Diego
****************************
Math 278C: Optimization and Data Science Seminar
Jinjie Zhang
UCSD
Grothendieck Constant is Norm of Strassen Matrix Multiplication Tensor
Abstract:
Grothendieck's inequality guarantees that a certain discrete optimization problem (optimizing a bilinear form over ±1 inputs) is equivalent, up to a constant called Grothendieck's constant, to a continuous optimization problem (optimizing the same bilinear form over unit vectors in a Hilbert space). This is important for convex relaxation: the former contains NP-hard problems such as max-cut, whereas the latter is a semidefinite program and can be solved in polynomial time. A world apart from convex relaxation is algebraic computational complexity, where it is well known that the exponent of matrix multiplication is exactly the sharp lower bound for the (log of the) tensor rank of the Strassen matrix multiplication tensor. We show that Grothendieck's constant is the sharp upper bound on a tensor norm of the Strassen matrix multiplication tensor. Hence these two important quantities from disparate areas of theoretical computer science, Strassen's exponent of matrix multiplication and Grothendieck's constant, are just different measures of the same underlying tensor. This allows us to rewrite Grothendieck's inequality as a norm inequality for a 3-tensor, which in turn yields a family of natural generalized inequalities. We show that these are locally sharp and prove that Grothendieck's inequality is unique within this family.
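For reference, the inequality discussed in the abstract can be stated as follows; this is the standard real-case formulation (a sketch in conventional notation, not taken from the talk itself), with K_G denoting Grothendieck's constant:

% Grothendieck's inequality (real case): for every m x n real matrix (a_{ij})
% and every choice of unit vectors x_i, y_j in a real Hilbert space,
\[
  \max_{\|x_i\| = \|y_j\| = 1} \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}\,\langle x_i, y_j\rangle
  \;\le\; K_G \,\max_{\varepsilon_i,\,\delta_j \in \{-1,+1\}} \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}\,\varepsilon_i \delta_j .
\]
% K_G is independent of m, n, and the matrix. The right-hand maximum is the
% discrete problem (NP-hard in general); the left-hand maximum is the
% continuous relaxation, computable via a semidefinite program.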
Host: Jiawang Nie
October 24, 2018
3:00 PM
AP&M 5829
****************************