The Renormalization Group (RG) is a cornerstone of physics that quantifies the sensitivity of a theory to its various parameters as a function of scale. Conventionally, the scale appearing in RG is a physical length scale, with low-energy effective theories arising by integrating out degrees of freedom at short distances. However, the reliance of RG on notions of locality renders it ineffective for theories that are non-local or that appeal to no notion of physical proximity at all. This includes gravitational systems, in which local coarse-graining RG schemes are disqualified by the requirement of diffeomorphism invariance. In this talk, we introduce a generalization of standard RG in which coarse-graining is defined with respect to a chosen measure of information-theoretic distinguishability, rather than an immediate physical scale. This point of view allows the logic of renormalization to be applied successfully to a wider class of systems, while also reproducing standard RG schemes in contexts where spatially local coarse-graining is appropriate, such as local quantum field theories. To illustrate the usefulness of this more general class of RG schemes, we introduce a field theory for statistical inference, which may be regarded as a meta-theory over the parameters of any statistical model. In this case, the Fisher information metric provides an emergent notion of locality in the space of models that can readily be used as the scale for a Wilsonian-inspired RG scheme. We conclude by applying this RG scheme to perform an information-theoretic renormalization of a neural network, and discuss implications for data science and physics.
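As a concrete illustration of the geometry underlying this construction, the Fisher information metric on a parameter space is g_ij = E[∂_i log p(x|θ) ∂_j log p(x|θ)]. The sketch below (not code from the referenced papers, just a toy assumption of a 1D Gaussian family with parameters (μ, σ)) estimates this metric by Monte Carlo and recovers the exact answer diag(1/σ², 2/σ²):

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of g_ij = E[d_i log p * d_j log p]
    for the Gaussian family p(x | mu, sigma)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=n_samples)
    # Score vector: gradients of log p(x | mu, sigma) w.r.t. (mu, sigma).
    score_mu = (x - mu) / sigma**2
    score_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
    scores = np.stack([score_mu, score_sigma])   # shape (2, N)
    # Average outer product of scores -> 2x2 Fisher metric.
    return scores @ scores.T / n_samples

g = fisher_metric_gaussian(mu=0.0, sigma=2.0)
# Exact metric: diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
print(g)
```

Distances measured with this metric quantify statistical distinguishability between nearby models, which is precisely the role the physical length scale plays in a conventional Wilsonian scheme.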
Based on:
On the Dynamics of Inference and Learning by David Berman, Jonathan Heckman, and MK
The Inverse of Exact Renormalization Group Flows as Statistical Inference by David Berman and MK
Bayesian Renormalization by David Berman, MK, and Alexander Stapleton