Search results from PIRSA
Differentiable Programming Tensor Networks and Quantum Circuits
Lei Wang Chinese Academy of Sciences
Differentiable programming makes the optimization of a tensor network much cheaper (in units of brain energy consumption) than before [e.g. arXiv:1903.09650]. This talk focuses mainly on the technical aspects of differentiable programming for tensor networks and quantum circuits with Yao.jl (https://github.com/QuantumBFS/Yao.jl). I will also show how quantum circuits can help with contracting and differentiating tensor networks.
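The core idea can be illustrated without Yao.jl: contract a tiny tensor "network" in a forward pass, then propagate an adjoint backward and check it against finite differences. The sketch below (numpy only, everything illustrative) uses the simplest possible network, f(A) = Tr(A B), whose analytic adjoint is B^T.

```python
import numpy as np

# Illustrative sketch (not Yao.jl): reverse-mode gradient of a tiny
# tensor-network contraction f(A) = Tr(A @ B). All sizes are arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

def contract(A):
    # Forward pass: contract the two-tensor network down to a scalar.
    return np.trace(A @ B)

def contract_grad(A):
    # Backward pass: d Tr(A B) / dA_ij = B_ji, i.e. the adjoint is B.T.
    return B.T

# Check the hand-written adjoint against central finite differences.
eps = 1e-6
num = np.zeros_like(A)
for i in range(4):
    for j in range(4):
        dA = np.zeros_like(A)
        dA[i, j] = eps
        num[i, j] = (contract(A + dA) - contract(A - dA)) / (2 * eps)

assert np.allclose(num, contract_grad(A), atol=1e-4)
```

Automatic-differentiation frameworks compose exactly such local adjoints through an arbitrary contraction sequence, which is what makes whole tensor-network optimizations differentiable end to end.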
-
Glassy and Correlated Phases of Optimal Quantum Control
Marin Bukov University of California, Berkeley
Modern Machine Learning (ML) relies on cost-function optimization to train model parameters. The non-convexity of cost-function landscapes results in local minima in which state-of-the-art gradient-descent optimizers get stuck. Similarly, in modern Quantum Control (QC), a key to understanding the difficulty of multi-qubit state preparation is the control landscape -- the mapping that assigns to every control protocol its cost-function value. Reinforcement Learning (RL) and QC both strive to find better local minima of the control landscape; the global minimum corresponds to the optimal protocol. Analyzing the decrease in the learning capability of our RL agent as we vary the protocol duration, we found rapid changes in the search for optimal protocols, reminiscent of phase transitions. These "control phase transitions" can be interpreted within Statistical Mechanics by viewing the cost function as "energy" and control protocols as "spin configurations". I will show that optimal qubit control exhibits continuous and discontinuous phase transitions familiar from macroscopic systems: correlated/glassy phases and spontaneous symmetry breaking. I will then present numerical evidence for a universal spin-glass-like transition controlled by the protocol duration. The glassy critical point is marked by a proliferation of protocols with close-to-optimal fidelity and a true optimum that appears exponentially difficult to locate. Using an ML-inspired framework based on the manifold-learning algorithm t-SNE, we visualize the geometry of the high-dimensional control landscape in an effective low-dimensional representation. Across the transition, the control landscape features an exponential number of clusters separated by extensive barriers, which bears a strong resemblance to random satisfiability problems.
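The "spin configuration" picture can be made concrete with a toy single-qubit version of such a landscape: a bang-bang protocol is a bit string of field values, and each string gets a fidelity. The sketch below (Hamiltonian, field values, duration, and step count are all illustrative assumptions, not the talk's actual model) exhaustively maps one small landscape.

```python
import numpy as np
from itertools import product

# Toy control landscape: a single qubit driven by H(h) = -sx + h*sz with
# a bang-bang field h(t) in {-4, +4}, trying to drive |0> to |1>.
# All parameter choices here are illustrative assumptions.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def step(h, dt):
    # exp(-i H dt) for the 2x2 Hermitian H via eigendecomposition.
    H = -sx + h * sz
    w, v = np.linalg.eigh(H)
    return v @ np.diag(np.exp(-1j * w * dt)) @ v.conj().T

def fidelity(protocol, T=2.0):
    dt = T / len(protocol)
    psi = np.array([1, 0], dtype=complex)     # start in |0>
    for h in protocol:
        psi = step(h, dt) @ psi
    return abs(psi[1]) ** 2                   # overlap with |1>

# Exhaustively evaluate all 2^8 bang-bang "spin configurations".
fids = {p: fidelity(p) for p in product((-4.0, 4.0), repeat=8)}
best = max(fids.values())
near_optimal = sum(f > 0.99 * best for f in fids.values())
print(f"best fidelity {best:.4f}, {near_optimal} protocols within 1%")
```

Counting how many protocols sit near the optimum as the duration T is varied is a brute-force analogue of the proliferation of near-optimal protocols at the glassy critical point described above.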
Neural Belief-Propagation Decoders for Quantum Error-Correcting Codes
Yehua Liu University of Sherbrooke
Belief-propagation (BP) decoders are responsible for the success of many modern coding schemes. While many classical coding schemes have been generalized to the quantum setting, the corresponding BP decoders are flawed by design in this setting. Inspired by an exact mapping between BP and deep neural networks, we train neural BP decoders for quantum low-density parity-check codes, with a loss function tailored for the quantum setting. Training substantially improves the performance of the original BP decoders. The flexibility and adaptability of the neural BP decoders make them suitable for low-overhead error correction in near-term quantum devices. Reference: arXiv:1811.07835 (to appear in PRL)
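A minimal classical min-sum BP decoder shows the structure being "neuralized": messages flow between variables and checks, and a neural BP decoder promotes the fixed message weight below into trainable parameters. This skeleton (plain classical min-sum on a 3-bit repetition code; the quantum-tailored loss function from the talk is not reproduced) is only an illustrative assumption of the setup.

```python
import numpy as np

# Classical min-sum belief propagation with a per-message weight; a
# "neural BP" decoder would train such weights (weight = 1.0 is plain BP).
H = np.array([[1, 1, 0],
              [0, 1, 1]])                     # 3-bit repetition code

def min_sum_decode(llr, H, iters=10, weight=1.0):
    m, n = H.shape
    c2v = np.zeros((m, n))                    # check-to-variable messages
    for _ in range(iters):
        # Variable-to-check: channel LLR plus all other incoming messages.
        v2c = np.where(H, llr + c2v.sum(0) - c2v, 0.0)
        # Check-to-variable: min-sum rule over the other edges of the check.
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:
                others = [v2c[i, k] for k in idx if k != j]
                sign = np.prod(np.sign(others))
                c2v[i, j] = weight * sign * min(abs(o) for o in others)
    return (llr + c2v.sum(0)) < 0             # hard decision per bit

# Codeword 000 sent over a binary symmetric channel, middle bit flipped.
p = 0.1
llr0 = np.log((1 - p) / p)                    # LLR > 0 favours bit 0
received = np.array([0, 1, 0])
llr = np.where(received == 0, llr0, -llr0)
decoded = min_sum_decode(llr, H)
print(decoded.astype(int))                    # -> [0 0 0]
```

Unrolling the iteration loop into layers and training the weights by gradient descent is precisely the BP-to-neural-network mapping the abstract refers to.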
Operational quantum tomography
Olivia Di Matteo TRIUMF (Canada's National Laboratory for Particle and Nuclear Physics)
As quantum processors become increasingly refined, benchmarking them in useful ways becomes a critical topic. Traditional approaches to quantum tomography, such as state tomography, suffer from self-consistency problems, requiring either perfectly pre-calibrated operations or measurements. This problem has recently been tackled by explicitly self-consistent protocols such as randomized benchmarking, robust phase estimation, and gate set tomography (GST). An undesired side-effect of self-consistency is the presence of gauge degrees of freedom, arising from the lack of fiducial reference frames, and leading to large families of gauge-equivalent descriptions of a quantum gate set which are difficult to interpret. We solve this problem by introducing a gauge-free representation of a quantum gate set inspired by linear inversion GST. This allows for the efficient computation of any experimental frequency without a gauge fixing procedure. We use this approach to implement a Bayesian version of GST using the particle filter approach, which was previously not possible due to the gauge. Within Bayesian GST, the prior information allows for inference on tomographically incomplete data sets, such as Ramsey experiments, without giving up self-consistency. We demonstrate the stability and generality of both our gauge-free representation and Bayesian GST by simulating a number of common characterization protocols, such as randomized benchmarking, as well as characterizing a trapped-ion qubit using experimental data.
Authors: Olivia Di Matteo (TRIUMF, Vancouver, BC, Canada and Microsoft Research, Redmond, WA, USA); John Gamble (Microsoft Research, Redmond, WA, USA); Chris Granade (Microsoft Research, Redmond, WA, USA); Kenneth Rudinger (Quantum Performance Laboratory, Sandia National Laboratories, Albuquerque, NM, USA); Nathan Wiebe (Microsoft Research, Redmond, WA, USA)
Sandia National Labs is managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a subsidiary of Honeywell International, Inc., for the U.S. Dept. of Energy's National Nuclear Security Administration under contract DE-NA0003525. The views expressed in this presentation do not necessarily represent the views of the DOE, the ODNI, or the U.S. Government. This material was funded in part by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research Quantum Testbed Program.
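The linear-inversion idea that gauge-free GST builds on is easiest to see in the simplest case, single-qubit state tomography: expectation values of the Pauli operators determine the density matrix exactly. The sketch below is only this baby case (real GST reconstructs whole gate sets and is where the gauge freedom discussed above appears).

```python
import numpy as np

# Linear-inversion single-qubit state tomography from exact Pauli
# expectation values: rho = (I + <X> X + <Y> Y + <Z> Z) / 2.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(ex, ey, ez):
    # Invert the map rho -> (<X>, <Y>, <Z>).
    return (I2 + ex * X + ey * Y + ez * Z) / 2

# "Measure" a known state exactly, then invert.
rho_true = np.array([[0.75, 0.25], [0.25, 0.25]], dtype=complex)
ex, ey, ez = (np.trace(rho_true @ P).real for P in (X, Y, Z))
rho_hat = reconstruct(ex, ey, ez)
assert np.allclose(rho_hat, rho_true)
```

The self-consistency problem arises because, in practice, the measurements of X, Y, and Z are themselves implemented by imperfect gates, which is what forces the move to GST and introduces the gauge.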
Machine learning phase discovery in quantum gas microscope images
Ehsan Khatami San Jose State University
Site resolution in quantum gas microscopes for ultracold atoms in optical lattices has transformed quantum simulations of many-body Hamiltonians. Statistical analysis of atomic snapshots can produce expectation values for various charge and spin correlation functions and has led to new discoveries for the Hubbard model in two dimensions. Conventional approaches, however, fail in general when the order parameter is not known or when an expected phase has no clear signatures in the density basis. In this talk, I will introduce our efforts in using machine learning techniques to overcome this challenge with snapshots of fermionic atoms. Collaborators: Richard Scalettar (UC Davis), Waseem Bakr (Princeton), and Juan Carrasquilla (Vector Institute)
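The snapshot-classification workflow can be caricatured with synthetic data: generate site-resolved occupation images for two "phases" and train a simple classifier on them. Everything in the sketch below is an illustrative assumption (random snapshots and a nearest-centroid linear rule; the actual work uses experimental images and neural networks).

```python
import numpy as np

# Toy phase classification of synthetic 8x8 occupation snapshots:
# "ordered" = nearly uniform filling, "disordered" = random filling.
rng = np.random.default_rng(1)

def snapshot(ordered):
    p = 0.9 if ordered else 0.5               # site-occupation probability
    return (rng.random((8, 8)) < p).astype(float).ravel()

X = np.array([snapshot(k % 2 == 0) for k in range(200)])
y = (np.arange(200) % 2 == 0).astype(float)   # 1 = ordered, 0 = disordered

# Nearest-centroid classifier: assign each snapshot to the nearer
# class mean (a linear decision rule learned from labelled data).
mu1, mu0 = X[y == 1].mean(0), X[y == 0].mean(0)
pred = (X - (mu1 + mu0) / 2) @ (mu1 - mu0) > 0
acc = (pred == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

The interesting regime in the talk is exactly where such hand-built linear rules fail, i.e. when no known order parameter separates the classes, motivating the learned feature extractors.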
Machine Learning Physics: From Quantum Mechanics to Holographic Geometry
Yi-Zhuang You University of California, San Diego
Inspired by the "third wave" of artificial intelligence (AI), machine learning has found rapid applications in various topics of physics research. Perhaps one of the most ambitious goals of machine learning physics is to develop novel approaches that ultimately allow AI to discover new concepts and governing equations of physics from experimental observations. In this talk, I will present our progress in applying machine learning techniques to reveal the quantum wave function of a Bose-Einstein condensate (BEC) and the holographic geometry of conformal field theories. In the first part, we apply machine translation to learn the mapping between potential and density profiles of the BEC and show how the concept of a quantum wave function can emerge in the latent space of the translator and how the Schrödinger equation is formulated as a recurrent neural network. In the second part, we design a generative model to learn the field theory configuration of the XY model and show how the machine can identify the holographic bulk degrees of freedom and use them to probe the emergent holographic geometry.
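The potential-to-density "translation pairs" the first part learns from can be generated by solving the Schrödinger equation directly. The sketch below produces one such pair on a 1D grid (grid size, units with hbar = m = 1, and the harmonic trap are illustrative choices, not the talk's actual setup).

```python
import numpy as np

# Generate a (potential, density) pair: solve the single-particle
# Schrödinger equation on a grid via finite differences (hbar = m = 1).
n, L = 400, 10.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

def ground_state_density(V):
    # H = -(1/2) d^2/dx^2 + V with a second-order finite-difference stencil.
    lap = (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
           + np.diag(np.full(n - 1, 1.0), 1)) / dx**2
    H = -0.5 * lap + np.diag(V)
    w, v = np.linalg.eigh(H)                  # ascending eigenvalues
    psi = v[:, 0]                             # ground state
    return psi**2 / (np.sum(psi**2) * dx)     # normalized |psi|^2

rho = ground_state_density(0.5 * x**2)        # harmonic trap
# For this trap the exact ground-state density is a Gaussian.
exact = np.exp(-x**2) / np.sqrt(np.pi)
assert np.max(np.abs(rho - exact)) < 1e-2
```

A "translator" trained on many such pairs never sees psi itself, which is what makes the emergence of a wave-function-like object in its latent space notable.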
Attention is all you get
Paul Ginsparg Cornell University
For the past decade, there has been a new major architectural fad in deep learning every year or two. One such fad for the past two years has been the transformer model, an implementation of the attention mechanism, which has superseded RNNs in most sequence learning applications. I'll give an overview of the model, with some discussion of non-physics applications, and intimate some possibilities for physics.
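The computational core of the transformer fits in a few lines: scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V. A minimal numpy sketch (shapes and values are illustrative):

```python
import numpy as np

# Scaled dot-product attention, the building block of the transformer.
def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))   # numerically stable
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)     # pairwise query-key similarity
    weights = softmax(scores)         # each row sums to one
    return weights @ V, weights       # weighted mix of value vectors

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))       # 4 query positions, width 8
K = rng.standard_normal((6, 8))       # 6 key/value positions
V = rng.standard_normal((6, 8))
out, w = attention(Q, K, V)
assert out.shape == (4, 8)
```

Stacking this block with learned projections, multiple heads, and position information gives the full architecture; the pairwise scores are also why the naive cost grows quadratically with sequence length.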
Deep learning and density functional theory
Isaac Tamblyn University of Ottawa
Density functional theory is a widely used electronic structure method for simulating and designing nanoscale systems based on first principles. I will outline our recent efforts to improve density functionals using deep learning. Improvement would mean achieving higher accuracy, better scaling (with respect to system size), improved computational parallelizability, and reliable performance transferability across different electronic environments. To this end, we have generated a large and diverse dataset of 2D simulations of electrons (http://clean.energyscience.ca/datasets) with a varying number of electrons in confining potentials for several (approximate) density functionals. As a proof-of-principle, we have used extensive deep neural networks to reproduce the results of these simulations to high accuracy at significantly reduced computational cost. By learning the screening length-scale of the electrons directly from the data, we are able to train on small-scale calculations, yet perform inference at effectively arbitrary length-scales at only O(N) cost. This overcomes a key scaling limitation of Kohn-Sham DFT (which scales as O(N^3)), paving the way for accurate, large-scale ab initio design of nanoscale components and devices.
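The train-small, infer-large trick rests on locality: if the density at a site depends only on the potential within a screening length, a model fit on windows of a small system transfers to any system size at O(N) cost. The toy below makes that exact by assuming the "true" response is a short-ranged kernel (an illustrative stand-in, not real DFT data).

```python
import numpy as np

# Toy size-transferable local model: the "true" density responds to the
# potential only through a short-ranged kernel (assumed screening length).
kernel = np.array([0.2, 0.6, 0.2])

def density(V):
    # Ground truth: strictly local response of the density to V.
    return np.convolve(V, kernel, mode="same")

def windows(V, half=1):
    # One (2*half+1)-site potential window per site, zero-padded at edges.
    Vp = np.pad(V, half)
    return np.array([Vp[i:i + 2 * half + 1] for i in range(len(V))])

rng = np.random.default_rng(0)
V_small = rng.standard_normal(32)             # small training system
W = windows(V_small)
w_fit, *_ = np.linalg.lstsq(W, density(V_small), rcond=None)

V_big = rng.standard_normal(1024)             # 32x larger at inference
pred = windows(V_big) @ w_fit                 # one dot product per site: O(N)
err = np.max(np.abs(pred - density(V_big)))
print(f"max inference error on the large system: {err:.2e}")
```

In the real problem the window-to-density map is nonlinear and learned by a deep network, but the size-extensivity argument is the same: only the window enters, so system size never does.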
Machine learning ground-state energies and many-body wave function
Sebastiano Pilati University of Camerino
In the first part of this presentation, I will present supervised machine-learning studies of the low-lying energy levels of disordered quantum systems. We address single-particle continuous-space models that describe cold atoms in speckle disorder, and also 1D quantum Ising glasses. Our results show that a sufficiently deep feed-forward neural network (NN) can be trained to accurately predict low-lying energy levels. Considering the long-term prospect of using cold-atom quantum simulators to train neural networks to solve computationally intractable problems, we consider the effect of random noise in the training data, finding that the NN model is remarkably resilient. We explore the use of convolutional NNs to build scalable models and to accelerate the training process via transfer learning. In the second part, I will discuss how generative stochastic NNs, specifically restricted and unrestricted Boltzmann machines, can be used as variational Ansatz for ground-state many-body wave functions. In particular, we show how to employ them to boost the efficiency of projective quantum Monte Carlo (QMC) simulations, and how to automatically train them within the projective QMC simulation itself.
References:
SP, P. Pieri, Scientific Reports 9, 5613 (2019)
E. M. Inack, G. Santoro, L. Dell'Anna, SP, Physical Review B 98, 235145 (2018)
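The RBM wave-function Ansatz from the second part has a compact closed form once the hidden units are traced out: psi(s) = exp(sum_i a_i s_i) * prod_j 2cosh(b_j + sum_i W_ij s_i) for spins s_i = +-1. The sketch below evaluates it for a tiny system with random (unoptimized, purely illustrative) parameters, normalizing by brute force.

```python
import numpy as np
from itertools import product

# RBM variational wave function for 4 spins with 3 hidden units;
# parameters are random illustrative values, not optimized.
rng = np.random.default_rng(0)
n_vis, n_hid = 4, 3
a = 0.1 * rng.standard_normal(n_vis)
b = 0.1 * rng.standard_normal(n_hid)
W = 0.1 * rng.standard_normal((n_vis, n_hid))

def amplitude(s):
    # psi(s) = exp(a.s) * prod_j 2cosh(b_j + (s W)_j), hidden units summed out.
    s = np.asarray(s, dtype=float)
    return np.exp(a @ s) * np.prod(2 * np.cosh(b + s @ W))

# For 4 spins, normalize exactly over all 2^4 configurations.
configs = list(product((-1.0, 1.0), repeat=n_vis))
norm2 = sum(amplitude(s) ** 2 for s in configs)
probs = [amplitude(s) ** 2 / norm2 for s in configs]
assert np.isclose(sum(probs), 1.0)
```

For larger systems this brute-force normalization is intractable, which is exactly why the talk embeds the Ansatz in (projective) quantum Monte Carlo, where only amplitude ratios between configurations are ever needed.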