C3.ai DTI's Digital Transformation Science Colloquium Series features free weekly talks from researchers who work with AI to benefit society. Join us on Zoom Thursday, February 25 at 3:00 p.m. to hear OpenStax Founder and Rice University Professor Richard Baraniuk give a talk titled "Mad Max: Affine Spline Insights into Deep Learning." The talk covers a spline-based theory of deep networks and how they organize signals. Registration is required to attend this event.
Abstract: We build a rigorous bridge between deep networks (DNs) and approximation theory via spline functions and operators. Our key result is that a large class of DNs can be written as a composition of max-affine spline operators (MASOs), which provide a powerful portal through which to view and analyze their inner workings. For instance, conditioned on the input signal, the output of a MASO DN can be written as a simple affine transformation of the input. This implies that a DN constructs a set of signal-dependent, class-specific templates against which the signal is compared via a simple inner product; we explore the links to the classical theory of optimal classification via matched filters and the effects of data memorization. The spline partition of the input signal space that is implicitly induced by a MASO directly links DNs to the theory of vector quantization and K-means clustering, which opens up a new geometric avenue for studying how DNs organize signals in a hierarchical and multiscale fashion.
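The affine-transformation claim in the abstract can be illustrated concretely: a ReLU network is piecewise affine, so conditioned on the input (i.e., on its ReLU activation pattern), its output equals A x + c for some input-dependent matrix A and vector c. The sketch below is purely illustrative, not the paper's implementation; the tiny two-layer network and its random weights are assumptions for the demo.

```python
import numpy as np

# A tiny ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2.
# Weights are random and purely illustrative.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)

def forward(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def local_affine(x):
    """Recover the input-conditioned affine map (A, c) with f(x) = A @ x + c,
    valid throughout the spline region (activation pattern) containing x."""
    mask = (W1 @ x + b1 > 0).astype(float)  # ReLU on/off pattern at x
    A = W2 @ (mask[:, None] * W1)           # compose layers with ReLU masked in
    c = W2 @ (mask * b1) + b2
    return A, c

x = rng.standard_normal(3)
A, c = local_affine(x)
# The network output matches the locally recovered affine map.
assert np.allclose(forward(x), A @ x + c)
```

The rows of A act as the signal-dependent templates mentioned in the abstract: the network's output is just their inner products with the input, plus a bias.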