
Functional Dimension of ReLU Networks


Tuesday, February 28, 2023 4pm to 5pm

Elisenda Grigsby, Professor of Mathematics, Boston College

Feedforward neural networks with ReLU activation are a class of parameterized functions that have proven remarkably successful in supervised learning tasks. They do so even in regimes where classical notions of complexity like the parametric dimension indicate that they ought to be overfitting the training data.

In this talk, we argue that, contrary to conventional intuition, parametric dimension is a highly inadequate complexity measure for the class of ReLU neural network functions. We introduce the notion of the local functional dimension of a ReLU network parameter, discuss its relationship to the geometry of the underlying decomposition of the domain into linear regions, and present some preliminary experimental results suggesting that functional dimension is highly inhomogeneous for many architectures. Moreover, this inhomogeneity should have significant implications for the dynamics of training ReLU networks via gradient descent.
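To make the central notion concrete: one natural way to estimate the local functional dimension at a parameter vector is to take the rank of the Jacobian of the network's outputs (evaluated on a batch of sample inputs) with respect to the parameters. The sketch below is illustrative only, not code from the talk; the architecture, sample count, step size, and tolerance are all arbitrary choices made for the example.

```python
import numpy as np

# Illustrative sketch: estimate the local functional dimension of a small
# fully connected ReLU network at a parameter vector theta as the rank of
# the finite-difference Jacobian of outputs (on sample inputs) w.r.t. theta.
# All sizes and tolerances here are arbitrary, not from the talk.

rng = np.random.default_rng(0)
SIZES = (2, 8, 1)  # input dim, one hidden ReLU layer, scalar output

def relu_net(theta, X, sizes=SIZES):
    """Forward pass; theta is a flat vector of all weights and biases."""
    out, i = X, 0
    for layer, (d_in, d_out) in enumerate(zip(sizes[:-1], sizes[1:])):
        W = theta[i:i + d_in * d_out].reshape(d_in, d_out)
        i += d_in * d_out
        b = theta[i:i + d_out]
        i += d_out
        out = out @ W + b
        if layer < len(sizes) - 2:  # ReLU on hidden layers only
            out = np.maximum(out, 0.0)
    return out

def local_functional_dimension(theta, X, eps=1e-6, tol=1e-3):
    """Rank of d(outputs)/d(theta), estimated by forward differences."""
    base = relu_net(theta, X).ravel()
    J = np.empty((base.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (relu_net(t, X).ravel() - base) / eps
    return int(np.linalg.matrix_rank(J, tol=tol))

n_params = sum(di * do + do for di, do in zip(SIZES[:-1], SIZES[1:]))  # 33
theta = rng.normal(size=n_params)
X = rng.normal(size=(200, SIZES[0]))  # enough samples to expose the rank
fdim = local_functional_dimension(theta, X)
print(n_params, fdim)
# Typically fdim < n_params: rescaling a hidden neuron's incoming weights
# by c > 0 and its outgoing weight by 1/c leaves the function unchanged,
# so such directions contribute nothing to the rank.
```

The gap between the parameter count and the Jacobian rank is one concrete sense in which parametric dimension overstates the complexity of the function class: symmetry directions in parameter space (such as the positive rescaling of hidden neurons noted in the comment) move the parameter without moving the function.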

Some of this work is joint with Kathryn Lindsey, Rob Meyerhoff, and Chenxi Wu, and some is joint with Kathryn Lindsey and David Rolnick.
