2025/2026 SEMINARS

Seminar                            Fall                                      Winter                                    Spring
Math 208 - Algebraic Geometry      Oprea, Dragos                             Oprea, Dragos                             Oprea, Dragos
Math 209 - Number Theory           Bucur, Alina                              Bucur, Alina                              Bucur, Alina
Math 211A - Algebra                Golsefidy, Alireza                        Golsefidy, Alireza                        Golsefidy, Alireza
Math 211B - Group Actions          Frisch, Joshua                            Frisch, Joshua                            Frisch, Joshua
Math 218 - Biological Systems      Miller, Pearson                           Miller, Pearson                           Miller, Pearson
Math 243 - Functional Analysis     Ganesan, Priyanga & Vigdorovich, Itamar   Ganesan, Priyanga & Vigdorovich, Itamar   Vigdorovich, Itamar
Math 248 - Real Analysis           Bejenaru, Ioan                            Bejenaru, Ioan                            Bejenaru, Ioan
Math 258 - Differential Geometry   Spolaor, Luca                             Spolaor, Luca                             Spolaor, Luca
Math 268 - Logic                   TBD                                       TBD                                       TBD
Math 269 - Combinatorics           Rhoades, Brendon & Warnke, Lutz           Rhoades, Brendon & Warnke, Lutz           Rhoades, Brendon & Warnke, Lutz
Math 278A - CCoM                   Cheng, Li-Tien                            Cheng, Li-Tien                            Cheng, Li-Tien
Math 278B - Math of Info, Data     Cloninger, Alexander                      Cloninger, Alexander                      Cloninger, Alexander
Math 278C - Optimization           Nie, Jiawang                              Nie, Jiawang                              Nie, Jiawang
Math 288A - Probability            Peca-Medlin, John                         Peca-Medlin, John                         Peca-Medlin, John
Math 288B - Statistics             TBD                                       TBD                                       TBD
Math 292 - Topology Seminar        Chow, Bennett                             Chow, Bennett                             Chow, Bennett

Thu, Apr 16 2026
  • 11:00 am
    Dr. Florian Kogelbauer - ETH Zürich
    Hydrodynamic Manifolds for Kinetic Equations

    Math 248: Real Analysis Seminar

    APM 7321

    We discuss recent developments around Hilbert's sixth problem on the passage from kinetic models to macroscopic fluid equations. We employ the technique of slow spectral closure to rigorously establish the existence of hydrodynamic manifolds and derive new non-local fluid equations for rarefied flows, independent of Knudsen number. We show the singularity of certain scaled solutions, including the divergence of the Chapman--Enskog series for an explicit example, and apply neural nets to learn the optimal hydrodynamic closure from data. The new, dynamically optimal constitutive laws are applied to a rarefied flow problem, and we discuss the classical problem of the number of macroscopic rarefied fluid fields from a data-driven point of view.

    Bio: Florian Kogelbauer is a Senior Research Fellow at ETH Zürich’s Department of Mathematics, affiliated with RiskLab and the Finsure Tech Hub. His research centres on nonlinear dynamical systems, kinetic theory, and fluid dynamics, with recent work on hydrodynamic closures and spectral theory for kinetic equations. He previously held academic and research roles at the University of Vienna and AIST-Tohoku University in Japan, alongside consulting positions at KPMG Austria.

  • 4:00 pm
    Dr. Lihan Wang - California State University Long Beach
    What Can We Hear About the Boundary?

    Math 248: Real Analysis Seminar

    APM 7218

    In 1966, Mark Kac asked the famous question “Can one hear the shape of a drum?” In his article with this question as the title, he translated it into eigenvalue problems for planar domains. This question highlighted the relationship between eigenvalues and geometry. One can then ask how eigenvalues are related to the geometry of the boundary.

    In this talk, we consider a special type of eigenvalues, called Steklov eigenvalues, that are closely tied to boundary geometry. We will introduce Steklov eigenvalues and explain their basic background and applications. Then we will discuss our recent results on inequalities relating Steklov eigenvalues to the boundary area of compact manifolds.

Thu, Apr 23 2026
  • 11:00 am
    Haixiao Wang - University of Wisconsin
    Spectral Embeddings via Random Geometric Graphs for Noisy, High-Dimensional, and Nonlinear Datasets with Applications

    Math 288: Probability & Statistics

    APM 6402

    Clustering is one of the fundamental problems in statistics and machine learning. Classical generative models such as the Stochastic Block Model (SBM) and Gaussian Mixture Model (GMM) are widely used for synthetic data generation and theoretical evaluation, but much of the literature assumes linearly separable clusters---an assumption that can fail in the presence of nonlinear geometry. We study a nonlinear multi-manifold model in which disjoint manifolds represent different clusters and the observations are corrupted by high-dimensional noise. We propose a kernel-based spectral embedding algorithm, based on the Random Geometric Graph (RGG) constructed from the data. Following the framework established by Ding and Ma (2023), we show that the embedding converges to its noiseless counterpart when the signal-to-noise ratio is sufficiently large. For downstream tasks, the embedding can be used for community detection problems. When different manifolds are sufficiently separated, the procedure recovers the community structure with vanishing error. Based on joint work with Xiucai Ding.

  • 4:00 pm
    Shangjie Zhang
    Computations in equivariant stable homotopy theory

    PhD Defense

    APM 7218

    This dissertation consists of four papers that develop computational and structural results in equivariant stable homotopy theory. The results include the computation of the reduced ring of the $RO(C_2)$-graded $C_2$-equivariant stable stems, the construction of the first family of $C_{p^n}$-equivariant ``$v_1$''-self maps, the computation of the $C_{p^n}$-equivariant Mahowald invariants of all elements in the Burnside ring, extending the classical computations of Bredon--Landweber and Iriye, and the computation of the spoke-graded $C_3$-equivariant stable stems.

Tue, Apr 28 2026
  • 11:00 am
    Changying Ding - UCLA
    TBA

    Math 243: Functional Analysis Seminar

    APM 6402

Wed, Apr 29 2026
  • 11:00 am
    Dietmar Bisch - Vanderbilt University
    TBA

    Math 243: Functional Analysis Seminar

    APM 6402

Mon, May 4 2026
  • 3:00 pm
    Prof. Alexander Kiselev - Duke University
    Singularity suppression by fluid flow

    Math 295: Colloquium Seminar

    APM 6402

    Transport by fluid flow provides one of the less well understood regularization mechanisms in PDE. In this talk, I will focus on the 2D Keller-Segel equation for chemotaxis set on a general domain and coupled via buoyancy with a fluid obeying Darcy's law, a much-studied model of incompressible fluid flow in porous media. It is well known that solutions to the 2D Keller-Segel equation can form singularities in finite time if the mass of the initial data is larger than critical. It turns out that coupling the equation with fluid flow obeying Darcy's law via buoyancy completely regularizes the system, leading to globally regular solutions for arbitrarily large initial data. One of the key ingredients in the proof is a new generalized Nash inequality, which employs an anisotropic norm that is natural in the context of incompressible porous media flow. This talk is based on joint works with Kevin Hu, Naji Sarsam, and Yao Yao.

Tue, May 5 2026
  • 11:00 am
    Alonso Delfin - CU Boulder
    TBA

    Math 243: Functional Analysis Seminar

    APM 6402

Thu, May 7 2026
  • 2:30 pm
    David Stephens - UC San Diego
    A Simplified Proof of the Erdős Sumset Conjecture

    Undergraduate Honors Presentation

    APM 5829

    In this talk, we will discuss an ergodic proof of the Sumset Conjecture of Erdős, which asks whether every set $A \subseteq \mathbb{N}$ with positive density contains $B + C$ for some infinite $B, C \subseteq \mathbb{N}$. This result was originally proved by Moreira, Richter, and Robertson in 2019 using ultrafilters; here, however, we adapt the method of progressive measures recently developed by Kra, Moreira, Richter, and Robertson. We closely follow their proof, simplifying what we can along the way.

Mon, May 11 2026
  • 3:00 pm
    Professor Peter Bartlett - UC Berkeley
    Modern machine learning methods: large step-size optimization, implicit bias, and benign overfitting

    Murray and Adylin Rosenblatt Endowed Lectures in Applied Mathematics

    Kavli Auditorium, Tata Hall, UCSD

    The impressive performance of modern machine learning methods seems to arise through different mechanisms from those of classical statistical learning theory, mathematical statistics, and optimization theory. Simple gradient methods find excellent solutions to non-convex optimization problems, and without any explicit effort to control model complexity they exhibit excellent prediction performance in practice. This talk will describe recent progress in statistical learning theory and optimization theory that demonstrates the optimization benefits of step-sizes that are too large to allow gradient methods to be viewed as an accurate time discretization of a gradient flow differential equation, that characterizes the solutions that are favored by gradient optimization methods, and that illustrates when those solutions can overfit training data but still provide good predictive accuracy.

Tue, May 12 2026
  • 11:00 am
    Rufus Willett - University of Hawai'i
    TBA

    Math 243: Functional Analysis Seminar

    APM 6402