The Anthony J Leggett Institute for Condensed Matter Special Theory Seminar, Veit Elser (Cornell), "Solving Problems with Projections: From phase retrieval to neural networks"

Abstract: Optimization, by minimizing a function with steps along the gradient, is by far the most widely used method for solving problems. When applied to the training of neural networks it is better known as “error back-propagation”. Even 40 years after it was introduced, the back-propagation formula is still the main consumer of power in AI data centers. Might there be something better?
This talk is about a little-known method that has so far mostly been used in phase retrieval: the reconstruction of a signal from its Fourier transform magnitudes and structural constraints. The structures in the Protein Data Bank would not exist without phase retrieval applied to X-ray diffraction intensity data. Around 2001 I realized that the leading algorithm for phase retrieval was doing something very different from gradient descent, and that this method could also be used for packing spheres, coloring graphs, finding spin-glass ground states, and even solving sudoku puzzles.
I will start by introducing the operations that replace gradient steps, called projections, and the iterative scheme that uses them to solve problems. After reconstructing an aperture from the intensity pattern in a diffraction experiment, the talk will move to the training of neural networks. Though there are many similarities with the existing technology, like the scaling of work with network and data size, the projection-based method has a distinct advantage in being able to use Boolean variables and logic-based representations for learning.
Zoom Link: https://illinois.zoom.us/my/icmt.seminar?pwd=ZU1KbnBLeXZLUmJKc0oyU205cDNDdz09
Meeting ID: 791 382 8328
Password: 106237
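To give a flavor of the projection idea mentioned in the abstract: a projection maps a trial point to the nearest point satisfying one constraint, and an iterative scheme applies projections onto different constraint sets in turn until a point satisfying all of them is found. The sketch below is a minimal toy illustration of this style of iteration (simple alternating projections between two convex sets), not the speaker's actual phase-retrieval algorithm; the sets, vectors, and tolerances are invented for the example.

```python
import numpy as np

# Hedged toy sketch: find a point that lies both on the hyperplane
# {x : a.x = b} and in the nonnegative orthant, by alternating
# projections onto each set. (Illustrative only; not the algorithm
# from the talk.)

def project_hyperplane(x, a, b):
    # Nearest point on the hyperplane a.x = b
    return x - (a @ x - b) / (a @ a) * a

def project_nonneg(x):
    # Nearest point with all coordinates >= 0
    return np.maximum(x, 0.0)

a = np.array([1.0, 2.0, -1.0])   # assumed example constraint
b = 3.0
x = np.array([-5.0, 0.5, 4.0])   # arbitrary starting point

for _ in range(200):
    x = project_nonneg(project_hyperplane(x, a, b))

# x now satisfies x >= 0 exactly and a.x = b to high accuracy
print(np.round(x, 6))
```

For convex sets like these, alternating projections is guaranteed to converge to a point in the intersection; the harder, nonconvex constraint sets arising in phase retrieval or sudoku require more sophisticated iterations (which is where the talk's method comes in).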