We present a recent line of work on estimating differential networks and conducting statistical inference about their parameters in a high-dimensional setting. First, we consider a Gaussian setting and show how to directly learn the difference between two graph structures. We present a debiasing procedure that yields an asymptotically normal estimator of the difference. Unlike existing approaches to differential network inference, which require sparsity of the individual precision matrices in both groups, we only require sparsity of the precision matrix difference. This allows for applications in which the individual networks are not sparse, for example networks containing hub nodes, but the differential network is. We discuss the method's theoretical properties, evaluate its performance in numerical studies, and highlight directions for future research.

Next, we discuss a methodology for performing valid statistical inference on the difference between parameters of general Markov networks in a high-dimensional setting. The approach builds on the regularized Kullback-Leibler Importance Estimation Procedure, which allows us to learn the parameters of the differential network directly, without requiring separate or joint estimation of the individual Markov network parameters. We prove that our estimator is regular and that its distribution is well approximated by a normal distribution under a wide range of data generating processes; in particular, it is not sensitive to model selection mistakes. Furthermore, we develop a new testing procedure for equality of Markov networks based on a max-type statistic, together with a valid bootstrap procedure that approximates the quantiles of the test statistic.
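To illustrate the direct-estimation idea in the Gaussian setting, the following minimal sketch (an assumption for exposition, not necessarily the exact estimator of the talk) minimizes a D-trace-style loss whose population minimizer is the precision-matrix difference. Note that only the two sample covariance matrices enter the computation; neither precision matrix is estimated on its own. The function names `direct_difference` and `soft_threshold` are hypothetical.

```python
import numpy as np

def soft_threshold(A, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def direct_difference(S1, S2, lam=0.05, n_iter=2000):
    # Proximal gradient descent on the convex loss
    #   0.5 * tr(S1 @ Delta @ S2 @ Delta) - tr(Delta @ (S1 - S2)) + lam * ||Delta||_1,
    # whose unconstrained population minimizer is Omega2 - Omega1, the
    # difference of the two precision matrices. Only the covariance
    # estimates S1 and S2 are needed; sparsity is imposed on the
    # difference Delta, not on the individual networks.
    p = S1.shape[0]
    Delta = np.zeros((p, p))
    # Step size 1/L with L = lambda_max(S1) * lambda_max(S2),
    # the Lipschitz constant of the smooth part of the loss.
    step = 1.0 / (np.linalg.eigvalsh(S1)[-1] * np.linalg.eigvalsh(S2)[-1])
    for _ in range(n_iter):
        grad = 0.5 * (S1 @ Delta @ S2 + S2 @ Delta @ S1) - (S1 - S2)
        Delta = soft_threshold(Delta - step * grad, step * lam)
    return Delta
```

With `lam = 0` and population covariances, the fixed point solves `S1 @ Delta @ S2 = S1 - S2`, i.e. `Delta = inv(S2) - inv(S1)`; the l1 penalty then trades exactness for a sparse estimate of the differential network. The debiasing step discussed above, which corrects this regularized estimate to obtain asymptotic normality, is omitted here.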
Mladen Kolar is Associate Professor of Econometrics and Statistics at the University of Chicago Booth School of Business. His research is focused on high-dimensional statistical methods, graphical models, varying-coefficient models, and data mining, driven by the need to uncover interesting and scientifically meaningful structures from observational data. Particular applications arise in studies of dynamic regulatory networks and social media analysis. His research has appeared in publications including the Journal of Machine Learning Research, the Annals of Statistics, the Annals of Applied Statistics, and the Electronic Journal of Statistics. He also regularly presents his research at top machine learning conferences, including Advances in Neural Information Processing Systems and the International Conference on Machine Learning. Kolar was awarded a prestigious Facebook Fellowship in 2010 for his work on machine learning and network models. He spent a summer with Facebook's ads optimization team working on a large-scale system for click-through rate prediction. His other past research includes work with INRIA Rocquencourt in Paris, France, and the Joint Research Centre in Ispra, Italy. Kolar earned his PhD in Machine Learning in 2013 from Carnegie Mellon University, as well as a diploma in Computer Engineering from the University of Zagreb. For his Ph.D. thesis, "Uncovering Structure in High-Dimensions: Networks and Multi-task Learning Problems," Kolar received an honorable mention for the 2014 SIGKDD Dissertation Award.