##### Department of Mathematics,

University of California San Diego

****************************

### Math 278B - Mathematics for Information and Data Science

## Jinjie Zhang

#### UCSD

## Grothendieck Constant, optimization and Strassen Matrix Multiplication Tensor

##### Abstract:

Grothendieck's inequality guarantees that a certain discrete optimization problem (optimizing a bilinear form over +1, -1 inputs) is equivalent, up to a constant called Grothendieck's constant, to a continuous optimization problem (optimizing the same bilinear form over unit vectors in a Hilbert space). This is important for convex relaxation, because the former contains NP-hard problems such as max-cut, whereas the latter is a semidefinite program and can be solved in polynomial time. A world apart from convex relaxation is algebraic computational complexity, where it is well known that the exponent of matrix multiplication is exactly the sharp lower bound for the (log of the) tensor rank of the Strassen matrix multiplication tensor. We show that Grothendieck's constant is the sharp upper bound on a tensor norm of the Strassen matrix multiplication tensor. Hence these two important quantities from disparate areas of theoretical computer science, Strassen's exponent of matrix multiplication and Grothendieck's constant, are just different measures of the same underlying tensor. This allows us to rewrite Grothendieck's inequality as a norm inequality for a 3-tensor, which in turn gives a family of natural generalized inequalities. We show that these are locally sharp and prove that Grothendieck's inequality is unique.
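For reference, the standard statement of Grothendieck's inequality (in the real case) that the abstract refers to is:

```latex
% For every real m x n matrix (a_{ij}), and for all unit vectors
% x_1, ..., x_m, y_1, ..., y_n in any real Hilbert space H,
\[
\max_{\substack{\|x_i\| = \|y_j\| = 1 \\ x_i,\, y_j \in H}}
\;\sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij}\, \langle x_i, y_j \rangle
\;\le\;
K_G \,
\max_{\substack{s_i \in \{-1, +1\} \\ t_j \in \{-1, +1\}}}
\;\sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij}\, s_i\, t_j,
\]
% where K_G is the (real) Grothendieck constant, the smallest
% constant for which the inequality holds for all matrices.
```

The left-hand side is the semidefinite relaxation (tractable in polynomial time); the right-hand side is the discrete problem, which captures NP-hard problems such as max-cut.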

### February 13, 2020

### 10:00 AM

### AP&M 2402

****************************