Department of Mathematics,
University of California San Diego

****************************

PhD Defense

Bharatha Rankothge

On Localizing Subcategories of the Derived Category of Smooth Mod-$p$ Representations of a $p$-Adic Lie Group

Abstract:

Understanding the category ${Mod}_k(G)$ of smooth representations of a $p$-adic Lie group $G$ over a field $k$ of characteristic $p$ is integral to developing the $p$-adic and mod-$p$ Langlands programs. However, the work of Peter Schneider, Matthew Emerton, and others suggests that we may need to shift our focus to the derived category $D(G)$ of ${Mod}_k(G)$ to make further progress. Noting that $D(G)$ is a tensor triangulated category, we follow a common practice in the study of tensor triangulated categories by attempting to classify the localizing subcategories of $D(G)$. In this talk, we present such a classification in the case where $G$ is an abelian $p$-adic Lie group with a Noetherian augmented Iwasawa algebra.

-

Zoom (Meeting ID: 933 2879 6478)

****************************

Advancement to Candidacy

Tongtong Liang
UCSD

Rethinking Generalization in Deep Learning: The Role of Data Geometry

Abstract:

We study how data geometry shapes generalization in overparameterized neural networks. The analysis focuses on solutions reached under stable training dynamics and the induced, data-dependent form of regularization. We link capacity to geometric features of the input distribution. This view explains when training prefers shared representations over memorization. We present a decomposition, based on depth-type notions, that separates regions where learning is data-rich from regions where activation is scarce. For the uniform distribution on the ball, the framework predicts the curse of dimensionality. For mixtures supported on low-dimensional subspaces, it predicts adaptation to the intrinsic dimension. Experiments on synthetic data and MNIST support these trends. The results provide a unified account of how stability and geometry interact to govern the effective capacity of GD-trained neural networks.

-

APM 6402

****************************

Siddharth Vishwanath
University of California, San Diego

A Statistical Framework for Multidimensional Scaling From Noisy Data

Abstract:

Multidimensional scaling (MDS) extracts meaningful information from pairwise dissimilarity data (e.g., distances between sensors or disagreement scores between individuals) by embedding these relationships into a Euclidean space. In practice, however, the observed dissimilarities are often subject to measurement error or corrupted by noise, yet the resulting embeddings are typically interpreted without accounting for this variation. This talk presents recent work on developing a principled statistical framework for MDS. We show that the classical MDS algorithm achieves minimax-optimal performance across a wide range of noise models and loss functions. Building on this, we develop a framework for constructing valid confidence sets for the embedded points obtained via MDS, enabling formal uncertainty quantification for geometric structure inferred from noisy relational data.
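For orientation, here is a minimal sketch of the classical (Torgerson) MDS algorithm the abstract refers to: double-center the squared dissimilarity matrix and take a truncated eigendecomposition. The function name and NumPy implementation are illustrative, not taken from the speaker's work.

```python
import numpy as np

def classical_mds(D, d=2):
    """Embed an n x n dissimilarity matrix D into R^d via classical MDS:
    double centering followed by a truncated eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:d]      # top-d eigenpairs
    vals_d = np.clip(vals[idx], 0, None)  # guard against tiny negative eigenvalues
    return vecs[:, idx] * np.sqrt(vals_d) # n x d embedding
```

On exact Euclidean distance data, the recovered configuration reproduces the input distances up to a rigid motion; the statistical question in the talk is what happens when D is observed with noise.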

-

APM 6402 & Zoom

****************************

Advancement to Candidacy

JD Flynn
UCSD

A Doubly Sparse Spiked Random Matrix Model

-

APM 6402 & Zoom

****************************

Math 243: Seminar in Functional Analysis

Juan Felipe Ariza Mejia
University of Iowa

McDuff superrigidity for group $\mathrm{II}_1$ factors

Abstract:

Developing new techniques at the interface of geometric group theory and von Neumann algebras, we identify the first examples of ICC groups $G$ whose von Neumann algebras are McDuff and exhibit a new rigidity phenomenon, termed McDuff superrigidity: any group $H$ satisfying $LG\cong LH$ must decompose as $H=G \times A$ for some ICC amenable group $A$. Our groups appear as infinite direct sums of $W^*$-superrigid wreath-like product groups with bounded cocycle. In this talk I will introduce this class of groups and a natural array into a weakly-$\ell^2$ representation of the group that witnesses the bound on the 2-cocycle. I will then show how this array leads to an interplay between two deformations of the group von Neumann algebra, and how these can be used to prove that this class of groups satisfies infinite product rigidity. This is joint work with Ionut Chifan, Denis Osin, and Bin Sun.

-

APM 6402

****************************