The Large Hadron Collider (LHC) continues to operate at unprecedented scales, aiming to collect roughly ten times the data recorded so far, at a slightly increased energy. Despite extensive analysis, no clues of New Physics have emerged from the data. As a result, two of the primary challenges in High Energy Physics have become: i) enhancing the experimental understanding and precision of the detector, and ii) extracting more information from the data than current methods achieve. While Machine Learning (ML) tools are extensively used to tackle these challenges, many depend heavily on simulation-derived patterns. Although simulations closely match the data in individual distributions, their accuracy in representing complex correlations can be limited. Neural Networks, with their ability to detect intricate correlations, might learn these simulated patterns as if they were real and subsequently search for them in actual data. In this talk, we explore how Bayesian ML tools and techniques offer a novel and promising avenue for extracting information from data. We present the principles and methodologies of this Bayesian craft, highlighting how to unfold a physical system into a probabilistic model so that relevant prior information can be injected into each corresponding latent variable, thereby improving the performance of the model. The approach is illustrated with the process pp > hh > bbbb at the LHC, where Bayesian methods show considerable promise for improved results.
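As a minimal sketch of the prior-injection idea described above (not the model used in the talk), consider a single latent variable: the signal fraction in a selected sample of hh > bbbb candidate events. A Beta prior encodes the expectation that signal is rare, and a conjugate Beta-Binomial update yields the posterior; all counts and prior parameters below are illustrative.

```python
import numpy as np


def posterior_signal_fraction(n_signal, n_background, alpha=2.0, beta=8.0):
    """Conjugate Beta-Binomial update for a latent signal fraction.

    The Beta(alpha, beta) prior injects the knowledge that signal events
    are expected to be rare; the observed counts then sharpen it.
    Returns the posterior mean and a 95% credible interval.
    """
    a_post = alpha + n_signal          # prior pseudo-counts + observed signal
    b_post = beta + n_background       # prior pseudo-counts + observed background
    mean = a_post / (a_post + b_post)  # analytic posterior mean of Beta(a, b)
    # Credible interval estimated by sampling the posterior Beta distribution.
    samples = np.random.default_rng(0).beta(a_post, b_post, 100_000)
    lo, hi = np.quantile(samples, [0.025, 0.975])
    return mean, (lo, hi)


mean, (lo, hi) = posterior_signal_fraction(n_signal=12, n_background=88)
print(f"posterior mean fraction: {mean:.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```

In a full analysis each latent variable of the probabilistic model (detector response, background composition, etc.) would receive its own physically motivated prior in the same spirit.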
References: arXiv:2404.01387, arXiv:2402.08001, and work in progress