Department of Mathematics,
University of California San Diego

****************************

Math 288 C00 - Stochastic Systems Seminar (via Zoom)

Eliza O'Reilly

Caltech

Random Tessellation Features and Forests

Abstract:

The Mondrian process in machine learning is a Markov partition process that recursively divides space with random axis-aligned cuts. This process is used to build random forests for regression and classification, as well as Laplace kernel approximations. The construction allows for efficient online algorithms, but the restriction to axis-aligned cuts does not capture dependencies between features. By viewing the Mondrian process as a special case of the stable under iteration (STIT) tessellation process in stochastic geometry, we resolve open questions about the generalization of cut directions. We utilize the theory of stationary random tessellations to show that STIT processes approximate a large class of stationary kernels and that STIT random forests achieve minimax rates for Lipschitz functions (forests and trees) and $C^2$ functions (forests only). This work opens many new questions at the intersection of stochastic geometry and statistical learning theory. Based on joint work with Ngoc Mai Tran.
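For readers unfamiliar with the construction mentioned in the abstract, the sketch below is a minimal illustrative implementation of the Mondrian process's recursive axis-aligned splitting, not code from the speakers; the function name `mondrian`, the lifetime parameter `budget`, and the box representation are assumptions made for illustration only.

```python
import random

def mondrian(box, budget, rng=random):
    """Recursively partition an axis-aligned box with the Mondrian process.

    box    -- list of (low, high) intervals, one per dimension
    budget -- remaining lifetime; larger budgets yield finer partitions
    Returns the list of leaf boxes (cells of the random tessellation).
    """
    lengths = [hi - lo for lo, hi in box]
    total = sum(lengths)
    # Time until the next cut is exponential with rate equal to the
    # linear dimension (sum of side lengths) of the current cell.
    wait = rng.expovariate(total) if total > 0 else float("inf")
    if wait > budget:
        return [box]  # lifetime exhausted: this cell is a leaf
    # Choose the cut axis with probability proportional to side length ...
    r, axis = rng.uniform(0, total), 0
    while r > lengths[axis]:
        r -= lengths[axis]
        axis += 1
    # ... and the cut location uniformly along that side (axis-aligned cut).
    lo, hi = box[axis]
    cut = rng.uniform(lo, hi)
    left = box[:axis] + [(lo, cut)] + box[axis + 1:]
    right = box[:axis] + [(cut, hi)] + box[axis + 1:]
    remaining = budget - wait
    return mondrian(left, remaining, rng) + mondrian(right, remaining, rng)

# Example: partition the unit square with lifetime 3.
cells = mondrian([(0.0, 1.0), (0.0, 1.0)], budget=3.0)
print(len(cells), "cells")
```

Because every cut here is perpendicular to a coordinate axis, the resulting cells cannot align with oblique dependencies between features; the STIT generalization discussed in the talk allows more general cut directions while retaining the recursive, stationary structure.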

Host: Ruth Williams

October 14, 2021

3:00 PM

Zoom info: Please email Professor Ruth Williams

****************************