Title: Gradient Flossing: Regularizing the Lyapunov Spectrum to Improve RNN Training
Abstract: The exploding and vanishing gradient problem in recurrent neural networks can be characterized by the Lyapunov spectrum of the network's dynamics. The products of Jacobians that arise in backpropagation through time are governed by these Lyapunov exponents, which measure the average exponential growth or decay of small perturbations to the network state.
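To make this connection explicit, a standard textbook formulation (my notation, not taken from the talk itself) relates the Lyapunov exponents to the singular values of the long-horizon Jacobian product:

$$
\lambda_i \;=\; \lim_{T\to\infty}\frac{1}{T}\,\log \sigma_i\!\left(\prod_{t=1}^{T} J_t\right),
\qquad J_t \;=\; \frac{\partial h_t}{\partial h_{t-1}},
$$

so the i-th singular value of the product scales roughly as $e^{\lambda_i T}$: gradients explode along directions with $\lambda_i > 0$ and vanish along directions with $\lambda_i < 0$.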
In this talk, I will introduce Gradient Flossing, a regularization method that directly stabilizes training by controlling the top Lyapunov exponents of the network. By adding a loss term that penalizes non-zero exponents, we can improve the condition number of the long-term Jacobian. I will show empirical results demonstrating that this technique allows RNNs to learn dependencies across significantly longer time horizons than standard training methods. This work offers a practical, dynamics-based tool for addressing a long-standing challenge in training recurrent models.
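As a concrete illustration, below is a minimal sketch of a flossing-style regularizer for a vanilla tanh RNN, written in PyTorch. It is an example under my own assumptions, not the speaker's reference implementation: finite-time Lyapunov exponents are estimated with a standard QR-based method along a short rollout, and their squared values are penalized so that they are pushed toward zero. The function name, tensor shapes, and the choice of the k leading exponents are illustrative.

```python
# Illustrative sketch of a gradient-flossing-style regularizer (assumed details,
# not the speaker's implementation). Finite-time Lyapunov exponents of a vanilla
# tanh RNN are estimated via QR reorthonormalization and penalized toward zero.
import torch

def flossing_loss(W_rec, W_in, h0, inputs, k=3):
    """Penalize the top-k finite-time Lyapunov exponents of an RNN rollout.

    W_rec: (n, n) recurrent weights, W_in: (n, m) input weights,
    h0: (n,) initial hidden state, inputs: (T, m) input sequence.
    """
    n = W_rec.shape[0]
    h = h0
    Q = torch.eye(n, device=W_rec.device)[:, :k]   # orthonormal basis for k leading directions
    log_r = torch.zeros(k, device=W_rec.device)    # accumulated log stretching factors
    T = inputs.shape[0]
    for t in range(T):
        h = torch.tanh(W_rec @ h + W_in @ inputs[t])
        J = (1.0 - h**2).unsqueeze(1) * W_rec      # state-to-state Jacobian of the tanh RNN
        Q, R = torch.linalg.qr(J @ Q)              # propagate and reorthonormalize directions
        log_r = log_r + torch.log(torch.abs(torch.diagonal(R)) + 1e-12)
    lyap = log_r / T                               # finite-time Lyapunov exponents
    return (lyap**2).sum()                         # push exponents toward zero
```

In training, such a term would typically be added to the task loss with a small weight, e.g. loss = task_loss + alpha * flossing_loss(W_rec, W_in, h0, inputs), and could be applied intermittently rather than at every optimization step.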
Speaker Bio: Rainer Engelken specializes in computational models of complex biological and engineering systems; stability and structure in network architectures; and nonlinear network dynamics. His research group seeks to bridge theories of learning in neuroscience with the design of robust and adaptive computational systems. Engelken holds a Ph.D. in Physics from the University of Göttingen in Germany.