PIRSA:23020061

Replacing neural networks by optimal predictive models for the detection of phase transitions

APA

Arnold, J. (2023). Replacing neural networks by optimal predictive models for the detection of phase transitions. Perimeter Institute for Theoretical Physics. https://pirsa.org/23020061

MLA

Arnold, Julian. "Replacing neural networks by optimal predictive models for the detection of phase transitions." Perimeter Institute for Theoretical Physics, 24 Feb. 2023, https://pirsa.org/23020061

BibTeX

@misc{scivideos_PIRSA:23020061,
  doi = {10.48660/23020061},
  url = {https://pirsa.org/23020061},
  author = {Arnold, Julian},
  keywords = {Other Physics},
  language = {en},
  title = {Replacing neural networks by optimal predictive models for the detection of phase transitions},
  publisher = {Perimeter Institute for Theoretical Physics},
  year = {2023},
  month = {feb},
  note = {PIRSA:23020061, see \url{https://scivideos.org/index.php/pirsa/23020061}}
}


Julian Arnold, Universität Basel

Talk number: PIRSA:23020061
Source Repository: PIRSA
Talk Type: Scientific Series
Subject: Other Physics

Abstract

In recent years, machine learning has been successfully used to identify phase transitions and classify phases of matter in a data-driven manner. Neural network (NN)-based approaches are particularly appealing due to the ability of NNs to learn arbitrary functions. However, the larger an NN, the more computational resources are needed to train it, and the more difficult it is to understand its decision making. As a result, we still understand little about the working principles of such machine learning approaches, when they fail or succeed, and how they differ from traditional approaches. In this talk, I will present analytical expressions for the optimal predictions of three popular NN-based methods for detecting phase transitions, which, at their core, rely on solving classification and regression tasks via supervised learning. These predictions are optimal in the sense that they minimize the target loss function. In practice, optimal predictive models are therefore well approximated by high-capacity predictive models, such as large NNs after ideal training. I will show that the analytical expressions we have derived provide a deeper understanding of a variety of previous NN-based studies and enable a more efficient numerical routine for detecting phase transitions from data.
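To make the key idea concrete: for a binary phase-classification task, the loss-minimizing (Bayes-optimal) prediction for a given sample is simply the posterior probability that the sample came from one of the two phases, which can be computed directly from the sample distributions without training any network. The sketch below illustrates this with a toy stand-in for physical data (a binomial "magnetization" model); the model, parameter values, and variable names are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for physical data (hypothetical example): at a given
# tuning parameter p, each "sample" is a magnetization-like integer
# drawn from a binomial distribution over n_spins spins.
def draw_samples(p, n_samples=10_000, n_spins=20):
    return rng.binomial(n_spins, p, size=n_samples)

# Samples from deep inside each phase (p = 0.2 vs. p = 0.8).
left = draw_samples(0.2)
right = draw_samples(0.8)

# Empirical probability of each distinct sample value in each phase.
n_values = 21  # possible magnetizations 0..20
p_left = np.bincount(left, minlength=n_values) / left.size
p_right = np.bincount(right, minlength=n_values) / right.size

# Bayes-optimal classifier output: the posterior probability that a
# sample came from the right phase (assuming equal priors). This is
# the quantity an ideally trained, high-capacity NN would approximate;
# values never observed in either phase are left at 0.5.
denom = p_left + p_right
opt_pred = np.divide(p_right, denom,
                     out=np.full(n_values, 0.5), where=denom > 0)
```

Because `opt_pred` is obtained in closed form from the estimated distributions, it sidesteps NN training entirely, which is the sense in which optimal predictive models can replace neural networks for detecting phase transitions.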

Zoom Link: https://pitp.zoom.us/j/91642481966?pwd=alkrWEFFcFBvRlJEbDRBZWV3MFFDUT09