PIRSA:19070012

Deep learning and density functional theory

APA

Tamblyn, I. (2019). Deep learning and density functional theory. Perimeter Institute for Theoretical Physics. https://pirsa.org/19070012

MLA

Tamblyn, Isaac. Deep learning and density functional theory. Perimeter Institute for Theoretical Physics, Jul. 11, 2019, https://pirsa.org/19070012

BibTeX

@misc{scivideos_PIRSA:19070012,
  doi       = {10.48660/19070012},
  url       = {https://pirsa.org/19070012},
  author    = {Tamblyn, Isaac},
  keywords  = {Quantum Matter},
  language  = {en},
  title     = {Deep learning and density functional theory},
  publisher = {Perimeter Institute for Theoretical Physics},
  year      = {2019},
  month     = {jul},
  note      = {PIRSA:19070012, see \url{https://scivideos.org/index.php/pirsa/19070012}}
}

Isaac Tamblyn University of Ottawa

Talk number: PIRSA:19070012
Source Repository: PIRSA
Talk Type: Conference

Abstract

Density functional theory is a widely used electronic structure method for simulating and designing nanoscale systems from first principles. I will outline our recent efforts to improve density functionals using deep learning. Improvement here means achieving higher accuracy, better scaling with respect to system size, improved computational parallelizability, and reliable transferability across different electronic environments. To this end, we have generated a large and diverse dataset of 2D simulations of electrons (http://clean.energyscience.ca/datasets) with varying numbers of electrons in confining potentials, computed with several (approximate) density functionals. As a proof of principle, we have used extensive deep neural networks to reproduce the results of these simulations to high accuracy at significantly reduced computational cost. By learning the screening length-scale of the electrons directly from the data, we are able to train on small-scale calculations yet perform inference at effectively arbitrary length-scales at only O(N) cost. This overcomes a key scaling limitation of Kohn-Sham DFT (which scales as O(N^3)), paving the way for accurate, large-scale, ab initio-enabled design of nanoscale components and devices.
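The mechanism behind the O(N) claim is locality: once the network's receptive field covers the electronic screening length, the total energy can be written as a sum of local contributions, each depending only on a finite neighborhood of the potential, so a model fitted on small cells can be evaluated on arbitrarily large grids. The toy sketch below (not the talk's actual architecture; the kernel is random rather than trained, and a real model would be a deep convolutional network) illustrates the two properties such an "extensive" model has by construction: linear cost in grid points, and size-extensivity under periodic boundaries.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_periodic(field, kernel):
    """Naive 'same' 2D convolution with periodic (wrap) padding.

    Cost is O(N) in the number of grid points for a fixed kernel size,
    mimicking a finite receptive field set by the screening length.
    """
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(field, ((ph, ph), (pw, pw)), mode="wrap")
    out = np.empty_like(field, dtype=float)
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def extensive_energy(potential, kernel):
    """Total energy as a sum of local contributions (toy nonlinearity)."""
    local = np.tanh(conv2d_periodic(potential, kernel))
    return local.sum()

# Hypothetical 5x5 "learned" filter; stands in for a trained network.
kernel = 0.1 * rng.normal(size=(5, 5))

v_small = np.ones((16, 16))   # "training-size" cell
v_large = np.ones((16, 32))   # twice the area, same local environment

e_small = extensive_energy(v_small, kernel)
e_large = extensive_energy(v_large, kernel)
# For a uniform potential, doubling the cell exactly doubles the energy:
# e_large ≈ 2 * e_small (size-extensivity).
```

Because the map from potential to local energy density is translation-equivariant, the total energy is also invariant under rigid shifts of the potential, which is one reason convolutional architectures are a natural fit for this problem.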