PMML Reading Group
The Pure Mathematics in Machine Learning (PMML) research team studies over-parameterized neural networks with the aim of deriving mathematical guarantees for these learners. Our work focuses on spectral machine learning theory: understanding the empirical eigen-distribution of classes of matrices induced by these networks, such as the weight matrices and conjugate kernels, the neural tangent kernel (which governs gradient flow during gradient-based training), and the Hessian of the loss landscape (which describes its local geometry).
Understanding these mathematical guarantees provides a canonical method for building reproducible models, which in turn promotes fidelity in those models. The guarantees can be viewed as a foundational structure, and determining a structure theory for this class of machine learners would have a significant impact on how these models are developed, trained, and, ultimately, how well they generalize.
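The central object above, the empirical spectral distribution (ESD) of a network-induced matrix, can be illustrated numerically. The sketch below, a hedged example rather than PMML's actual methodology, computes the ESD of a Wigner matrix (a stand-in for the matrices named above), whose ESD famously converges to the semicircle law on [-2, 2]:

```python
import numpy as np

def empirical_spectral_distribution(A):
    """Return the sorted eigenvalues of a symmetric matrix A.

    The ESD places mass 1/n on each eigenvalue; the eigenvalues
    themselves form its support.
    """
    return np.sort(np.linalg.eigvalsh(A))

# Illustrative example: a symmetric Gaussian (Wigner) matrix with
# off-diagonal entries of variance 1/n, so the ESD concentrates on [-2, 2].
rng = np.random.default_rng(0)
n = 500
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2 * n)

eigs = empirical_spectral_distribution(W)
# A density histogram of `eigs` approximates the semicircle law for large n.
hist, edges = np.histogram(eigs, bins=30, range=(-2.5, 2.5), density=True)
```

In practice one would replace `W` with, e.g., a trained weight matrix or an empirical NTK Gram matrix and study how the resulting ESD deviates from such universal baselines.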
Statistical Inference and Classical Bias/Variance Tradeoffs
November 3, 2020
Anand Sarwate, Associate Professor, Rutgers, The State University of New Jersey
Random Matrices: Universality and Integrability
November 10, 2020
Ioana Dumitriu, Professor of Mathematics, University of California, San Diego
Model Fitting
November 17, 2020
Anand Sarwate, Associate Professor, Rutgers, The State University of New Jersey
Random Matrices: Basic Tools for Computing in the ESD
December 1, 2020
Ioana Dumitriu, Professor of Mathematics, University of California, San Diego
A Quick Introduction to Learning Theory
December 10, 2020
Anand Sarwate, Associate Professor, Rutgers, The State University of New Jersey