Department of Mathematics,
University of California San Diego

****************************

Math 278C: Optimization and Data Science

Prof. Yang Liu

Lawrence Berkeley Lab

New Matrix Completion Algorithms for Highly Oscillatory Operators in Seismic and Tomographic Applications

Abstract:

Low-rank representation-based matrix or tensor completion algorithms have been developed over the past two decades for a variety of scientific and data-science applications. Given an incomplete data matrix/tensor with missing or noisy entries but certain underlying algebraic structure, completion algorithms rely on optimization techniques to recover the full data matrix/tensor directly in a compressed representation. Various compression formats have been considered, such as the low-rank matrix format and the Tucker, CP, and tensor-train tensor formats. Moreover, different optimization algorithms have been exploited, including alternating least squares (ALS), alternating directions fitting (ADF), nuclear-norm-based optimization, Riemannian optimization, and adaptive moment estimation (ADAM). Despite the success of these completion algorithms, they become less effective when dealing with highly oscillatory operators, arising from, e.g., large-scale seismic or tomographic applications where physical or cost constraints limit the amount of data acquisition. This is largely due to the inability of the aforementioned compression formats to represent non-smooth operators. A more effective completion algorithm is therefore called for, as successful completion of the data matrix can significantly improve the quality of downstream algorithmic pipelines for these inverse or imaging problems.
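To make the classical setting concrete, the following is a minimal, hypothetical sketch of low-rank matrix completion via ALS in NumPy: the matrix is factored as M ≈ UVᵀ and each factor is updated in turn by solving small regularized least-squares problems over the observed entries. This illustrates only the standard low-rank/ALS approach mentioned above, not the butterfly completion method of the talk; the function name, regularization, and iteration count are illustrative choices.

```python
import numpy as np

def als_complete(M_obs, mask, rank, n_iters=50, reg=1e-3, seed=0):
    """Recover a low-rank matrix from observed entries via alternating
    least squares (ALS).

    M_obs: matrix with observed entries filled in (zeros elsewhere)
    mask:  boolean matrix, True where an entry is observed
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(n_iters):
        # Fix V; each row of U solves a rank-by-rank regularized
        # least-squares problem restricted to that row's observed entries.
        for i in range(m):
            idx = mask[i, :]
            Vi = V[idx, :]
            U[i, :] = np.linalg.solve(Vi.T @ Vi + reg * np.eye(rank),
                                      Vi.T @ M_obs[i, idx])
        # Fix U; update each row of V analogously from observed columns.
        for j in range(n):
            idx = mask[:, j]
            Uj = U[idx, :]
            V[j, :] = np.linalg.solve(Uj.T @ Uj + reg * np.eye(rank),
                                      Uj.T @ M_obs[idx, j])
    return U @ V.T
```

For a rank-r matrix this works well when roughly O(nr log n) random entries are observed; for the highly oscillatory operators discussed in the talk, the effective low-rank assumption fails, which is precisely the gap the butterfly format is meant to close.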

In this talk, I will present our recent work on new completion algorithms for highly oscillatory operators (arXiv:2510.17734). In a nutshell, we consider a different compression format, called butterfly, for the incomplete data matrix. Butterfly formats have proven highly effective for compressing highly oscillatory operators such as Green's functions for high-frequency wave equations, Fourier integral operators, and special function transforms, but they have not previously been investigated in the matrix completion context. Our work relies on a tensor reformulation of the butterfly format as a tensor network, and we consider a variety of optimization algorithms including ALS, ADF, and ADAM. Numerical results demonstrate that our butterfly completion algorithms can efficiently recover an n×n matrix representing Green's functions or Radon transforms from only O(n log n) observed entries in O(n log n) operations. I will also discuss the limitations of and future work on our proposed algorithms.


Biography: Yang Liu is a staff scientist in the Scalable Solvers Group of the Applied Mathematics and Computational Research Division at Lawrence Berkeley National Laboratory in Berkeley, California. Dr. Liu received his Ph.D. in electrical engineering from the University of Michigan in 2015. From 2015 to 2017, he was a postdoctoral fellow at the Radiation Laboratory, University of Michigan, and from 2017 to 2019 a postdoctoral fellow at Lawrence Berkeley National Laboratory. His main research interests are numerical linear and multilinear algebra, computational electromagnetics and plasma physics, scalable machine learning algorithms, and high-performance scientific computing. Dr. Liu is the lead developer of the linear solver package ButterflyPACK and the autotuning package GPTune, and a core developer of the linear solver packages SuperLU_DIST and STRUMPACK. He is the recipient or co-recipient of the ACES Early Career Award (2025), the PDSEC Best Paper Award (2025), the AT-AP RASC Young Scientists Award (2022), the APS Sergei A. Schelkunoff Transactions Prize (2018), the FEM first-place student paper award (2014), and the ACES second-place student paper award (2012).

Host: Jiawang Nie

March 11, 2026

4:00 PM

APM 5829 & Zoom (Meeting ID: 926 5846 1639 / Password: 278CWN26)

Research Areas

Optimization

****************************