Department of Mathematics,
University of California San Diego
****************************
Advancement to Candidacy
Tongtong Liang
UCSD
Rethinking Generalization in Deep Learning: The Role of Data Geometry
Abstract:
We study how data geometry shapes generalization in overparameterized neural networks. The analysis focuses on solutions reached under stable training dynamics and the data-dependent form of regularization they induce, linking capacity to geometric features of the input distribution. This view explains when training prefers shared representations over memorization. We present a decomposition based on depth-type notions that separates regions where learning is data-rich from regions where activation is scarce. For the uniform distribution on the ball, the framework predicts the curse of dimensionality; for mixtures supported on low-dimensional subspaces, it predicts adaptation to the intrinsic dimension. Experiments on synthetic data and MNIST support these trends. The results provide a unified account of how stability and geometry interact to govern the effective capacity of neural networks trained by gradient descent.
Advisors: Prof. Alex Cloninger, Prof. Rahul Parhi (ECE), and Prof. Yu-Xiang Wang (HDSI).
December 9, 2025
12:00 PM
APM 6402
****************************

