Cayley path and quantum supremacy: Average-case #P-hardness of random circuit sampling
Ramis Movassagh MIT-IBM Watson AI Lab
Given the large push by academia and industry (e.g., IBM and Google), quantum computers with hundreds of qubits are on the brink of existence, with the promise of outperforming any classical computer. Demonstrating the computational advantage of noisy near-term quantum computers over classical computers is an imperative near-term goal. The foremost candidate task for showing this is Random Circuit Sampling (RCS): the task of sampling from the output distribution of a random circuit. This is exactly the task that Google recently performed experimentally on 53 qubits.
Stockmeyer's theorem implies that efficient sampling allows for the estimation of probability amplitudes; therefore, hardness of probability estimation implies hardness of sampling. We prove that estimating probabilities to within small errors is #P-hard on average (i.e., for random circuits), and we place this result in the context of previous work.
Among the ingredients developed to make this proof possible are: the construction of the Cayley path, a rational-function-valued unitary path that interpolates between two arbitrary unitaries; an extension of the Berlekamp-Welch algorithm that efficiently and exactly interpolates rational functions; and the construction of probability distributions over unitaries that are arbitrarily close to the Haar measure.
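As a rough illustration of the first ingredient, here is a minimal numpy sketch of a Cayley-style path between two unitaries: a Hermitian generator is recovered from W = U0†U1 by the inverse Cayley transform, and rescaling it by θ gives a unitary path whose matrix entries are rational in θ. The function names are mine, and this shows only the basic idea, not the precise construction used in the proof.

```python
import numpy as np
from scipy.stats import unitary_group

def inverse_cayley(W):
    """Hermitian H with (I - iH)(I + iH)^{-1} = W; assumes -1 is not an
    eigenvalue of W (true with probability 1 for Haar-random W)."""
    I = np.eye(W.shape[0])
    return 1j * np.linalg.solve(W + I, W - I)   # H = i (W + I)^{-1} (W - I)

def cayley_path(U0, U1, theta):
    """Unitary path with U(0) = U0 and U(1) = U1 whose entries are rational
    functions of the real parameter theta."""
    I = np.eye(U0.shape[0])
    H = inverse_cayley(U0.conj().T @ U1)
    return U0 @ np.linalg.solve(I + 1j * theta * H, I - 1j * theta * H)

U0, U1 = unitary_group.rvs(4), unitary_group.rvs(4)
for theta in (0.0, 0.3, 0.7, 1.0):
    U = cayley_path(U0, U1, theta)
    assert np.allclose(U @ U.conj().T, np.eye(4))   # unitary along the path
assert np.allclose(cayley_path(U0, U1, 1.0), U1)    # endpoint reached
```
-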
Classical algorithms, correlation decay, and complex zeros of partition functions of quantum many-body systems
Mehdi Soleimanifar California Institute of Technology (Caltech)
Basic statistical properties of quantum many-body systems in thermal equilibrium can be obtained from their partition function. In this talk, I will present a quasi-polynomial time classical algorithm that estimates the partition function of quantum many-body systems at temperatures above the thermal phase transition point. It is known that in the worst case, the same problem is NP-hard below this temperature. This shows that the transition in the phase of a quantum system is also accompanied by a transition in the computational hardness of estimating its statistical properties. The key to this result is a characterization of the phase transition and the critical behavior of the system in terms of the complex zeros of the partition function. I will also discuss the relation between these complex zeros and another signature of the thermal phase transition, namely, the exponential decay of correlations. I will show that in a system of n particles above the phase transition point, where the complex zeros are far from the real axis, the correlation between two observables whose distance is at least log(n) decays exponentially. This is based on joint work with Aram Harrow and Saeed Mehraban.
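As a toy illustration of the connection between complex partition-function zeros and the physics above, the following sketch (my own, not from the talk) diagonalizes a small transverse-field Ising chain and scans the complex inverse-temperature plane for points where |Z(β)| nearly vanishes:

```python
import numpy as np

def tfim(n, J=1.0, g=1.5):
    """Dense Hamiltonian of an open transverse-field Ising chain."""
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.array([[1., 0.], [0., -1.]])
    def op(m, i):
        out = np.eye(1)
        for j in range(n):
            out = np.kron(out, m if j == i else np.eye(2))
        return out
    H = -J * sum(op(Z, i) @ op(Z, i + 1) for i in range(n - 1))
    H -= g * sum(op(X, i) for i in range(n))
    return H

evals = np.linalg.eigvalsh(tfim(6))

def Z(beta):                       # partition function at complex beta
    return np.sum(np.exp(-beta * evals))

# Near a complex zero, |Z| dips toward 0; scan a grid in the upper half plane
# (zeros come in conjugate pairs since the spectrum is real).
re = np.linspace(0.05, 2.0, 150)
im = np.linspace(0.05, 3.0, 150)
absZ = np.array([[abs(Z(b + 1j * t)) for b in re] for t in im])
i, j = np.unravel_index(absZ.argmin(), absZ.shape)
print("smallest |Z| on the grid at beta ≈", complex(re[j], im[i]))
```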
-
Ultracold Molecules: From Quantum Chemistry to Quantum Computing
Alan Jamison Institute for Quantum Computing (IQC)
Cooling atomic gases to ultracold temperatures revolutionized the field of atomic physics, connecting with and impacting many other areas in physics. Advances in producing ultracold molecules suggest similarly dramatic discoveries are on the horizon. First, I will review the physics of ultracold molecules, including our work bringing a new class of molecules to nanokelvin temperatures. Chemistry at these temperatures has a very different character than at room temperature. One striking effect is our recent result using spin states of reactants to control chemical reaction pathways. I will also describe how the strong electric dipole moments of ultracold molecules present an exciting new tool for quantum information and quantum computing.
-
Toward a Quantum-Safe Future
Michele Mosca Institute for Quantum Computing (IQC)
There has been tremendous progress in the many layers needed to realize large-scale quantum computing, from the hardware layers to high-level software. There has also been a vast increase in the exploration of potentially useful applications of quantum computers, which will drive the desire to build quantum computers and make them available to users. I will describe some of my research in quantum algorithmics and quantum compiling.
The knowledge and tools developed for these positive applications give us insight into the cost of implementing quantum cryptanalysis of today's cryptographic algorithms, which is a key factor in estimating when quantum computers will be cryptographically relevant (the "collapse time"). In addition to my own estimates, I will summarize the estimates of 22 other thought leaders in quantum computing.
What quantum cryptanalysis means to an organization or a sector depends not only on the collapse time, but also on the time to migrate to quantum-safe algorithms as well as the shelf-life of information assets being protected. In recent years, we have gained increasing insight into the challenges of a wide-scale migration of existing systems. We must also be proactive as we deploy new systems. Open-source platforms, like OpenQuantumSafe and OpenQKDNetwork, are valuable resources in helping meet many of these challenges.
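This dependence is often summarized by Mosca's inequality: if the shelf-life x of protected information plus the migration time y exceeds the collapse time z, some of that information will be exposed. A trivial sketch of the arithmetic (the numbers are placeholders, not estimates from the talk):

```python
def exposure_years(shelf_life, migration, collapse):
    """Mosca's inequality x + y > z: years during which information that must
    stay confidential is readable by a quantum-capable adversary."""
    return max(0, shelf_life + migration - collapse)

# Placeholder numbers, not actual estimates:
print(exposure_years(shelf_life=10, migration=7, collapse=15))  # -> 2
```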
While awareness of the challenges and the path forward has increased immensely, there is still a long road ahead as we work together with additional stakeholders not only to prepare our digital economy to be resilient to quantum attacks, but also to make us more resilient to other threats that emerge.
-
The Emperor's New Crown: What Covid-19 Reveals
Brandon Ogbunu Brown University
As of late March 2020, Covid-19 has already secured its status among the most expansive pandemics of the last century. Covid-19 is caused by a coronavirus, SARS-CoV-2, which produces severe respiratory disease in a fraction of those infected and is typified by several important features: the ability to infect cells of various kinds, contagiousness prior to the onset of symptoms, and widely varying experiences of disease across patient demographics.
In this seminar, I discuss the many lessons that the scientific community has learned from Covid-19, including insights from molecular evolution, cell biology, and epidemiology. I discuss the role of mathematical and computational modeling efforts in understanding the trajectory of the epidemic, and I highlight recent findings and potential research questions at the interface of virology and materials science. I will also introduce areas of inquiry that might be of interest to the physics community.
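For readers unfamiliar with the modeling efforts mentioned above, a minimal compartmental (SIR) model captures the basic trajectory arithmetic. The sketch below uses the textbook equations with purely illustrative parameters; it is not a model from the talk.

```python
def sir_step(s, i, r, beta, gamma, dt):
    """Forward-Euler step of dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
    dR/dt = gamma*I (all quantities are population fractions)."""
    new_inf = beta * s * i * dt
    new_rec = gamma * i * dt
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.999, 0.001, 0.0
beta, gamma, dt = 0.3, 0.1, 0.1     # illustrative only; R0 = beta/gamma = 3
peak = 0.0
for _ in range(int(300 / dt)):      # simulate 300 days
    s, i, r = sir_step(s, i, r, beta, gamma, dt)
    peak = max(peak, i)
print(f"peak infected fraction ~ {peak:.2f}, never infected ~ {s:.2f}")
```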
-
Single-Shot Decoding with High Thresholds in LDPC Quantum Codes with Constant Encoding Rate
Nikolas Breuckmann University College London
It is believed that active quantum error correction will be an essential ingredient to build a scalable quantum computer. The currently favored scheme is the surface code due to its high decoding threshold and efficient decoding algorithm. However, it suffers from large overheads which are even more severe when parity check measurements are subject to errors and have to be repeated. Furthermore, the number of encoded qubits in the surface code does not grow with system size, leading to a sub-optimal use of the physical qubits.
Finally, the decoding algorithm, while efficient, has non-trivial complexity, and it is not clear whether the classical processing can be implemented in hardware fast enough to keep up.
We present a class of low-density parity-check (LDPC) quantum codes which fix all three of the concerns mentioned above. They were first proposed in [1] and called 4D hyperbolic codes, as their definition is based on four-dimensional, curved geometries. They have the remarkable property that the number of encoded qubits grows linearly with system size, while their distance grows polynomially with system size, i.e. d ~ n^a with 0.1 < a < 0.3. This is remarkable since it was previously conjectured that such codes could not exist [1]. Their structure allows for decoders which can deal with erroneous syndrome measurements, a property called single-shot error correction [2], as well as local decoding schemes [3].
Although [1] analyzed the encoding rate and distance of this code family abstractly, it is a non-trivial task to actually construct them. There is no known efficient deterministic procedure for obtaining small examples, and only single examples of reasonable size had been obtained previously [4]. These previous examples were part of different code families, so it was not possible to determine a threshold. We succeeded in constructing several small examples by utilizing a combination of randomized search and algebraic tools. We analyze the performance of these codes under several different local decoding procedures via Monte Carlo simulations. The decoders all share the property that they can be executed in parallel in O(1) time. Under the phenomenological noise model, including syndrome errors, we obtain a threshold of ~5%, which to our knowledge is the highest threshold among all local decoding schemes.
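To give a flavor of this kind of Monte Carlo threshold estimation, here is a deliberately simple sketch using a classical repetition code with majority-vote decoding rather than the 4D hyperbolic codes of the talk; the crossing of the curves as the block size grows is the threshold signature (at p = 1/2 for this toy code).

```python
import numpy as np

rng = np.random.default_rng(0)

def logical_error_rate(n, p, trials=20000):
    """Fraction of trials in which i.i.d. bit flips at rate p overwhelm
    majority-vote decoding of an n-bit repetition code."""
    flips = rng.random((trials, n)) < p
    return (flips.sum(axis=1) > n // 2).mean()

for p in (0.40, 0.48, 0.52):
    rates = [logical_error_rate(n, p) for n in (11, 51, 201)]
    print(p, [f"{r:.3f}" for r in rates])
# Below threshold (p < 0.5) larger codes do better; above it they do worse.
```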
[1] L. Guth, A. Lubotzky, Journal of Mathematical Physics 55, 082202 (2014).
[2] H. Bombin, Physical Review X 5 (3), 031043 (2015).
[3] M. Hastings, QIC 14, 1187 (2014).
[4] V. Londe, A. Leverrier, arXiv:1712.08578 (2017).
-
Area law of non-critical ground states in 1D long-range interacting systems
Tomotaka Kuwahara RIKEN
The area law for entanglement provides one of the most important connections between information theory and quantum many-body physics. It is related not only to the universality of quantum phases, but also to efficient numerical simulations of the ground state (i.e., the lowest-energy state). Various numerical observations have led to a strong belief that the area law holds for every non-critical phase in short-range interacting systems [1]. The so-called area-law conjecture states that the entanglement entropy of a subsystem is proportional to the area of its boundary whenever the ground state is non-critical (or gapped).
However, the area law for long-range interacting systems has remained elusive, as long-range interactions produce correlation patterns similar to those in critical phases. Here, we show that for generic non-critical one-dimensional ground states, the area law robustly holds without any corrections, even under long-range interactions [2]. Our result guarantees an efficient description of ground states by matrix-product states in experimentally relevant long-range systems, which justifies the density-matrix renormalization group algorithm. In the present talk, I will give an overview of the results and sketch the ideas of the proof if time allows.
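As a small self-contained illustration of the area-law statement (my own toy example, not the proof technique of [2]), the following exact-diagonalization sketch computes the half-chain entanglement entropy of a gapped Ising chain with power-law couplings; the entropy should stay of order one as the chain grows.

```python
import numpy as np

def long_range_ising(n, g=2.0, alpha=3.0):
    """Gapped Ising chain with power-law couplings J_ij = |i - j|^(-alpha)."""
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.array([[1., 0.], [0., -1.]])
    def op(m, i):
        out = np.eye(1)
        for j in range(n):
            out = np.kron(out, m if j == i else np.eye(2))
        return out
    H = -sum(op(Z, i) @ op(Z, j) / (j - i) ** alpha
             for i in range(n) for j in range(i + 1, n))
    H -= g * sum(op(X, i) for i in range(n))
    return H

def half_chain_entropy(n):
    _, vecs = np.linalg.eigh(long_range_ising(n))
    gs = vecs[:, 0]                                  # ground state
    lam = np.linalg.svd(gs.reshape(2 ** (n // 2), -1),
                        compute_uv=False) ** 2       # Schmidt weights
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

for n in (6, 8, 10):
    print(n, round(half_chain_entropy(n), 4))        # saturates: 1D area law
```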
[1] J. Eisert, M. Cramer, and M. B. Plenio, "Colloquium: Area laws for the entanglement entropy," Rev. Mod. Phys. 82, 277–306 (2010).
[2] T. Kuwahara and K. Saito, "Area law of non-critical ground states in 1D long-range interacting systems," arXiv:1908.11547 (2019).
-
Fundamental Constraints for Fundamental Theories
Rachel Rosen Carnegie Mellon University
As our understanding of the universe and its fundamental building blocks extends to shorter and shorter distances, experiments capable of probing these scales are becoming increasingly difficult to construct. Fundamental particle physics faces a potential crisis: an absence of data at the shortest possible scales. Yet remarkably, even in the absence of experimental data, the requirement of theoretical consistency puts stringent constraints on viable models of fundamental particles and their interactions. In this talk I’ll discuss a variety of criteria that constrain theories of particles in flat spacetime and de Sitter. Such criteria have the possibility to address questions such as: What low energy theories admit consistent UV completions? Which massive particles are allowed in an interacting theory? Is string theory the unique weakly coupled UV completion of General Relativity?
-
PSI 2019/2020 - Computational Physics - Lecture 15
Erik Schnetter Perimeter Institute for Theoretical Physics
-
The Second Mission of Supersymmetry: Miracles in Uncharted Waters
Mikhail Shifman University of Minnesota
In our four-dimensional world, supersymmetry is the only extension of the classical Poincaré invariance that laid the foundation of modern physics at the beginning of the 20th century. Supersymmetry, a new geometric symmetry extending Poincaré invariance, was discovered in 1970; it had been overlooked for decades because of its quantum nature. Over the following decade or so, supersymmetry assumed the role of a universal framework in which new models for natural phenomena and regularities (e.g., the concept of naturalness) were developed. It gave rise to a powerful stream of theoretical phenomenology.
The fact that the LHC at CERN has produced no evidence for low-energy supersymmetry (or for naturalness) was a powerful blow. However, despite its absence in experiments, the lesser-known second mission of supersymmetry has been highly successful, with remarkable advances occurring on a regular basis. Supersymmetry has proved its power and uniqueness for those who address hard questions in strongly coupled field theories, including Yang-Mills theories. Some supersymmetry-based exact results obtained in four dimensions are the main topics of my talk. In the past, one could hardly have dreamed that such results were possible.
-
PSI 2019/2020 - Computational Physics - Lecture 14
Erik Schnetter Perimeter Institute for Theoretical Physics
-
Entanglement entropy of highly excited eigenstates of many-body lattice Hamiltonians
Marcos Rigol Pennsylvania State University
The average entanglement entropy of subsystems of random pure states is (nearly) maximal. In this talk, we discuss the average entanglement entropy of subsystems of highly excited eigenstates of integrable and nonintegrable many-body lattice Hamiltonians. For translationally invariant quadratic models (or spin models mappable to them) we prove that, when the subsystem size is not a vanishing fraction of the entire system, the average eigenstate entanglement entropy exhibits a leading volume-law term that is different from that of random pure states. We argue that such a leading term is likely universal for translationally invariant (noninteracting and interacting) integrable models. For random pure states with a fixed particle number (random canonical states) away from half filling and normally distributed real coefficients, we prove that the deviation from the maximal value grows with the square root of the system's volume when the size of the subsystem is one half of that of the system. We then show that the average entanglement entropy of highly excited eigenstates of a particle number conserving quantum chaotic model is the same as that of random canonical states.
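A quick numerical check of the first claim (a standard computation, not specific to the talk): sampling Haar-random pure states of 10 qubits and comparing the average half-system entanglement entropy with the Page value ln(dA) - dA/(2 dB), which sits just below the maximal value ln(dA).

```python
import numpy as np

rng = np.random.default_rng(1)

def half_system_entropy(psi, dA, dB):
    """Entanglement entropy of the first factor of a pure state on C^dA x C^dB."""
    lam = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False) ** 2
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

dA = dB = 2 ** 5                       # 10 qubits split into two halves
ent = []
for _ in range(200):
    psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
    psi /= np.linalg.norm(psi)         # Haar-random pure state
    ent.append(half_system_entropy(psi, dA, dB))

print("sample average:", np.mean(ent))
print("Page value    :", np.log(dA) - dA / (2 * dB))
print("maximal ln dA :", np.log(dA))
```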