Zoom URL: https://illinois.zoom.us/j/85236768481?pwd=RndkQzd2Vk5WMDNTdC9MQWZUemVjdz09
Meeting ID: 852 3676 8481
Speaker: Nima Anari
Title: Entropic Independence in Combinatorial Distributions
I will introduce a notion for combinatorial distributions that we call entropic independence. This notion is motivated by the desire to obtain tight mixing time bounds for natural random walks that sample from these distributions. As is widely known in Markov chain analysis, spectral analysis is often lossy (by polynomial factors) when the state space is exponentially large. Instead, Modified Log-Sobolev Inequalities (MLSI), which characterize the rate of entropy decay in a random walk, are a powerful family of inequalities that often yield the correct mixing time. We show that as long as the family of combinatorial distributions under study "accommodates weights", entropic independence can be obtained with no extra effort from the much more manageable notion of spectral independence; the latter notion has been key to many breakthroughs in counting and sampling and has unified many classic Markov chain analysis techniques. I will discuss natural distributions where our technique yields tight mixing time bounds for the first time: monomers in monomer-dimer systems, the high-temperature Ising model, the high-temperature hardcore model, and (non-symmetric) determinantal point processes. Time permitting, I will briefly mention two further algorithmic applications of entropic independence: accelerating sampling time for multiple samples, and parallel algorithms for sampling determinant-based distributions.
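For reference, a standard formulation of the MLSI mentioned above (this is the textbook definition, not taken from the abstract): a reversible Markov chain with stationary distribution $\pi$ and Dirichlet form $\mathcal{E}$ satisfies an MLSI with constant $\rho > 0$ if, for every nonnegative function $f$ on the state space,

$$\rho \cdot \operatorname{Ent}_\pi(f) \le \mathcal{E}(f, \log f), \qquad \text{where } \operatorname{Ent}_\pi(f) = \mathbb{E}_\pi[f \log f] - \mathbb{E}_\pi[f] \log \mathbb{E}_\pi[f].$$

This yields exponential entropy decay, $\operatorname{Ent}_\pi(H_t f) \le e^{-\rho t} \operatorname{Ent}_\pi(f)$ for the continuous-time semigroup $H_t$, and hence a mixing time of roughly $O\!\left(\tfrac{1}{\rho}\left(\log\log \tfrac{1}{\pi_{\min}} + \log \tfrac{1}{\varepsilon}\right)\right)$, whose doubly logarithmic dependence on $\pi_{\min}$ is what makes MLSI sharper than spectral-gap bounds on exponentially large state spaces.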