Opportunities and challenges in precision physics at the LHC
Lorenzo Tancredi
Since the discovery of the Higgs boson in 2012, the Large Hadron Collider (LHC) at CERN has turned from a discovery machine into a precision machine. The highly boosted events measured by the LHC experiments are, for the first time, providing us with a window on the details of the electroweak symmetry breaking mechanism. A crucial condition to maximise the reach of these studies is a profound understanding of the theoretical implications of perturbative Quantum Field Theory, and in particular of Quantum Chromodynamics (QCD), for the physics of hadronic collisions at the LHC. In this talk, I will provide an account of the opportunities and the challenges that precision physics at the LHC can offer, focusing in particular on recent developments in our understanding of higher-order calculations in perturbative Quantum Field Theory and how they can help us understand the Higgs sector of the Standard Model.
-
Probing GRB physics through high-energy observations with Fermi
Elisabetta Bissaldi
The Fermi Gamma-ray Space Telescope has provided unique insights into the Universe's biggest explosions over the past 12 years. With thousands of gamma-ray bursts (GRBs) detected by the Gamma-ray Burst Monitor (GBM) and hundreds by the Large Area Telescope (LAT), we have learned about the broad properties of these populations of events and gained unique insight into their emission mechanisms, environments, and physical properties. In this seminar, I'll review highlights of GRB science from the Fermi mission at low (keV) and high (GeV) energies, as well as the recent discovery of very-high-energy (TeV) emission from GRB 180720B and GRB 190114C, observed by the Cherenkov telescopes of the H.E.S.S. and MAGIC experiments, respectively.
-
Reinforcement Learning assisted Quantum Optimization
Matteo Wauters SISSA International School for Advanced Studies
We propose a reinforcement learning (RL) scheme for feedback quantum control within the quantum approximate optimization algorithm (QAOA). QAOA requires a variational minimization over states constructed by applying a sequence of unitary operators, depending on parameters living in a high-dimensional space. We reformulate this minimum search as a learning task, where an RL agent chooses the control parameters for the unitaries, given partial information on the system. We show that our RL scheme learns a policy converging to the optimal adiabatic solution for QAOA found by Mbeng et al. (arXiv:1906.08948) for the translationally invariant quantum Ising chain. In the presence of disorder, we show that our RL scheme allows the training to be performed on small samples and transferred successfully to larger systems. Finally, we discuss QAOA on the p-spin model and how its robustness is enhanced by reinforcement learning. Despite the possibility of finding the ground state with polynomial resources even in the presence of a first-order phase transition, local optimizations in the p-spin model suffer from the presence of many minima in the energy landscape. RL helps to find regular solutions that can be generalized to larger systems and makes the optimization less sensitive to noise.
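For readers unfamiliar with the QAOA setup referenced above: the variational state is built by alternating evolutions generated by the cost Hamiltonian H_z and a mixing Hamiltonian H_x, and the 2p angles are tuned to minimize the energy. The toy script below is an illustrative sketch only (a 4-site Ising chain, with a naive random local search standing in for the learned RL policy described in the abstract), assuming numpy and scipy are available.

# Toy QAOA on a 4-site ferromagnetic Ising chain (open boundaries).
# Illustration only: a naive random local search stands in for the
# reinforcement-learning policy described in the abstract.
import numpy as np
from scipy.linalg import expm

n, p = 4, 3                  # number of spins, number of QAOA layers
dim = 2 ** n

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(site_ops):
    """Tensor product placing the given single-site operators on their sites."""
    out = np.array([[1.0 + 0j]])
    for s in range(n):
        out = np.kron(out, site_ops.get(s, np.eye(2, dtype=complex)))
    return out

# Cost Hamiltonian H_z = -sum_i Z_i Z_{i+1}, mixer H_x = -sum_i X_i
Hz = -sum(op_on({i: Z, i + 1: Z}) for i in range(n - 1))
Hx = -sum(op_on({i: X}) for i in range(n))

psi0 = np.ones(dim, dtype=complex) / np.sqrt(dim)    # |+>^n initial state

def qaoa_energy(gammas, betas):
    """Energy of the QAOA state prod_k exp(-i b_k H_x) exp(-i g_k H_z) |+>^n."""
    psi = psi0
    for g, b in zip(gammas, betas):
        psi = expm(-1j * b * Hx) @ (expm(-1j * g * Hz) @ psi)
    return float(np.real(psi.conj() @ (Hz @ psi)))

# Naive random local search over the 2p angles (stand-in for the RL agent).
rng = np.random.default_rng(0)
angles = rng.uniform(0, np.pi, size=2 * p)
best = qaoa_energy(angles[:p], angles[p:])
for step in range(2000):
    trial = angles + rng.normal(scale=0.05, size=2 * p)
    e = qaoa_energy(trial[:p], trial[p:])
    if e < best:
        angles, best = trial, e

print("best variational energy:", best)
print("exact ground-state energy:", np.linalg.eigvalsh(Hz)[0])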
-
Quantum homeopathy works: Efficient unitary designs with a system-size independent number of non-Clifford gates
Ingo Roth Freie Universität Berlin
Many quantum information protocols require the implementation of random unitaries. Because it takes exponential resources to produce Haar-random unitaries drawn from the full n-qubit unitary group, one often resorts to t-designs. Unitary t-designs mimic the Haar measure up to t-th moments. It is known that Clifford operations can implement at most 3-designs. In this work, we quantify the non-Clifford resources required to break this barrier. We find that it suffices to inject O(t^4 log^2(t) log(1/ε)) many non-Clifford gates into a polynomial-depth random Clifford circuit to obtain an ε-approximate t-design. Strikingly, the number of non-Clifford gates required is independent of the system size: asymptotically, the density of non-Clifford gates is allowed to tend to zero. We also derive novel bounds on the convergence time of random Clifford circuits to the t-th moment of the uniform distribution on the Clifford group. Our proofs exploit a recently developed variant of Schur-Weyl duality for the Clifford group, as well as bounds on restricted spectral gaps of averaging operators. Joint work with J. Haferkamp, F. Montealegre-Mora, M. Heinrich, J. Eisert, and D. Gross.
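As a reminder of the object being approximated (standard definition; conventions for quantifying the approximation error vary between papers), an ensemble \nu of unitaries is a unitary t-design when its t-th moment operator matches that of the Haar measure,

\mathbb{E}_{U \sim \nu}\left[ U^{\otimes t} X (U^{\dagger})^{\otimes t} \right] = \int U^{\otimes t} X (U^{\dagger})^{\otimes t} \, \mathrm{d}U_{\mathrm{Haar}} \quad \text{for all operators } X,

and an ε-approximate t-design is an ensemble whose moment operator lies within ε of the Haar one in a suitable norm (for example, the diamond norm of the associated t-fold twirling channels).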
-
Cayley path and quantum supremacy: Average case #P-Hardness of random circuit sampling
Ramis Movassagh MIT-IBM Watson AI Lab
Given the large push by academia and industry (e.g., IBM and Google), quantum computers with hundred(s) of qubits are on the brink of existence, with the promise of outperforming any classical computer. Demonstrating a computational advantage of noisy near-term quantum computers over classical computers is an imperative near-term goal. The foremost candidate task for showing this is Random Circuit Sampling (RCS), which is the task of sampling from the output distribution of a random circuit. This is exactly the task that Google recently performed experimentally on 53 qubits.
Stockmeyer's theorem implies that efficient sampling allows for estimation of probability amplitudes. Therefore, hardness of probability estimation implies hardness of sampling. We prove that estimating probabilities to within small errors is #P-hard on average (i.e. for random circuits), and put the results in the context of previous works.
Some of the ingredients developed to make this proof possible are: the construction of the Cayley path, a rational-function-valued unitary path that interpolates between two arbitrary unitaries; an extension of the Berlekamp-Welch algorithm that efficiently and exactly interpolates rational functions; and the construction of probability distributions over unitaries that are arbitrarily close to the Haar measure.
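For orientation, one natural way to realize such an interpolation (a sketch consistent with the description above; the exact conventions of the paper may differ) uses the Cayley transform of a Hermitian matrix h, which is unitary for every real θ:

f(\theta h) = (\mathbb{1} + i\theta h)(\mathbb{1} - i\theta h)^{-1}, \qquad U(\theta) = f(\theta h)\, U_0 .

Then U(0) = U_0, and choosing h so that f(h) = U_1 U_0^{\dagger} gives U(1) = U_1, while every matrix entry of U(\theta) is a rational function of \theta, which is what makes a Berlekamp-Welch-style interpolation applicable.
-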
Classical algorithms, correlation decay, and complex zeros of partition functions of quantum many-body systems
Mehdi Soleimanifar California Institute of Technology (Caltech)
Basic statistical properties of quantum many-body systems in thermal equilibrium can be obtained from their partition function. In this talk, I will present a quasi-polynomial time classical algorithm that estimates the partition function of quantum many-body systems at temperatures above the thermal phase transition point. It is known that, in the worst case, the same problem is NP-hard below this temperature. This shows that the thermal phase transition of a quantum system is accompanied by a transition in the computational hardness of estimating its statistical properties. The key to this result is a characterization of the phase transition and the critical behavior of the system in terms of the complex zeros of the partition function. I will also discuss the relation between these complex zeros and another signature of the thermal phase transition, namely, the exponential decay of correlations. I will show that in a system of n particles above the phase transition point, where the complex zeros are far from the real axis, the correlation between two observables whose distance is at least log(n) decays exponentially. This is based on joint work with Aram Harrow and Saeed Mehraban.
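Concretely (standard conventions, sketching the objects discussed above), the partition function at complex inverse temperature is

Z(\beta) = \mathrm{Tr}\, e^{-\beta H}, \qquad \beta \in \mathbb{C},

and if Z(\beta) is zero-free in a complex neighbourhood of an interval of real inverse temperatures, then \log Z(\beta), and hence the free energy and its derivatives, is analytic there, so no thermal phase transition occurs in that interval; conversely, criticality is signalled by complex zeros approaching the real \beta axis.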
-
Ultracold Molecules: From Quantum Chemistry to Quantum Computing
Alan Jamison Institute for Quantum Computing (IQC)
Cooling atomic gases to ultracold temperatures revolutionized the field of atomic physics, connecting with and impacting many other areas in physics. Advances in producing ultracold molecules suggest similarly dramatic discoveries are on the horizon. First, I will review the physics of ultracold molecules, including our work bringing a new class of molecules to nanokelvin temperatures. Chemistry at these temperatures has a very different character than at room temperature. One striking effect is our recent result using spin states of reactants to control chemical reaction pathways. I will also describe how the strong electric dipole moments of ultracold molecules present an exciting new tool for quantum information and quantum computing.
-
Toward a Quantum-Safe Future
Michele Mosca Institute for Quantum Computing (IQC)
There has been tremendous progress in the many layers needed to realize large-scale quantum computing, from the hardware layers to the high-level software. There has also been vastly increased exploration of the potentially useful applications of quantum computers, which will drive the desire to build quantum computers and make them available to users. I will describe some of my research in quantum algorithmics and quantum compiling.
The knowledge and tools developed for these positive applications give us insight into the cost of implementing quantum cryptanalysis of today's cryptographic algorithms, which is a key factor in estimating when quantum computers will be cryptographically relevant (the "collapse time"). In addition to my own estimates, I will summarize the estimates of 22 other thought leaders in quantum computing.
What quantum cryptanalysis means to an organization or a sector depends not only on the collapse time, but also on the time to migrate to quantum-safe algorithms as well as the shelf-life of information assets being protected. In recent years, we have gained increasing insight into the challenges of a wide-scale migration of existing systems. We must also be proactive as we deploy new systems. Open-source platforms, like OpenQuantumSafe and OpenQKDNetwork, are valuable resources in helping meet many of these challenges.
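The interplay of the three timescales above (collapse time, migration time, and shelf-life) is often summarized by a simple rule of thumb, paraphrasing Mosca's well-known inequality: with x the security shelf-life of the information being protected, y the time needed to migrate to quantum-safe cryptography, and z the collapse time,

x + y > z \;\Longrightarrow\; \text{information that must remain protected will be exposed to quantum cryptanalysis,}

so migration must begin early enough that x + y \le z.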
While awareness of the challenges and the path forward has increased immensely, there is still a long road ahead as we work together with additional stakeholders not only to prepare our digital economy to be resilient to quantum attacks, but also to make us more resilient to other threats that emerge.
-
The Emperor's New Crown: What Covid-19 Reveals
Brandon Ogbunu Brown University
As of late March 2020, Covid-19 has already secured its status among the most expansive pandemics of the last century. Covid-19 is caused by a coronavirus, SARS-CoV-2, which produces severe respiratory disease in a fraction of those infected and is typified by several important features: the ability to infect cells of various kinds, contagiousness prior to the onset of symptoms, and a widely varying experience of disease across patient demographics.
In this seminar, I discuss the many lessons that the scientific community has learned from Covid-19, including insight from molecular evolution, cell biology, and epidemiology. I discuss the role of mathematical and computational modeling efforts in understanding the trajectory of the epidemic, and highlight modern findings and potential research questions at the interface of virology and materials science. I will also introduce areas of inquiry that might be of interest to the physics community.
-
Single-Shot-Decoding with High Thresholds in LDPC Quantum Codes with Constant Encoding Rate
Nikolas Breuckmann University College London
It is believed that active quantum error correction will be an essential ingredient to build a scalable quantum computer. The currently favored scheme is the surface code due to its high decoding threshold and efficient decoding algorithm. However, it suffers from large overheads which are even more severe when parity check measurements are subject to errors and have to be repeated. Furthermore, the number of encoded qubits in the surface code does not grow with system size, leading to a sub-optimal use of the physical qubits.
Finally, the decoding algorithm, while efficient, has non-trivial complexity, and it is not clear whether the classical processing can be implemented in hardware fast enough to keep up with the quantum device.
We present a class of low-density parity-check (LDPC) quantum codes which fix all three of the concerns mentioned above. They were first proposed in [1] and called 4D hyperbolic codes, as their definition is based on four-dimensional, curved geometries. They have the remarkable property that the number of encoded qubits grows linearly with system size, while their distance grows polynomially with system size, i.e. d ~ n^a with 0.1 < a < 0.3. This is remarkable since it was previously conjectured that such codes could not exist [1]. Their structure allows for decoders which can deal with erroneous syndrome measurements, a property called single-shot error correction [2], as well as local decoding schemes [3].
Although [1] analyzed the encoding rate and distance of this code family abstractly, it is a non-trivial task to actually construct its members. There is no known efficient deterministic procedure for obtaining small examples, and only single examples of reasonable size had been obtained previously [4]. These previous examples belonged to different code families, so it was not possible to determine a threshold. We succeeded in constructing several small examples by utilizing a combination of randomized search and algebraic tools. We analyze the performance of these codes under several different local decoding procedures via Monte Carlo simulations. The decoders all share the property that they can be executed in parallel in O(1) time. Under the phenomenological noise model, including syndrome errors, we obtain a threshold of ~5%, which to our knowledge is the highest threshold among all local decoding schemes.
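To illustrate the kind of Monte Carlo threshold estimate referred to above, here is a generic sketch (an illustration only: a classical repetition code with majority-vote decoding stands in for the hyperbolic codes and local decoders of the talk): one sweeps the physical error rate for several code sizes and looks for the crossing point of the logical error curves.

# Generic Monte Carlo threshold estimation, illustrated on a classical
# repetition code with majority-vote decoding (a stand-in only; not the
# 4D hyperbolic codes or decoders discussed in the abstract).
import numpy as np

rng = np.random.default_rng(1)

def logical_error_rate(d, p, trials=20000):
    """Fraction of trials in which majority-vote decoding of a distance-d
    repetition code fails under i.i.d. bit-flip noise of rate p."""
    flips = rng.random((trials, d)) < p      # which bits were flipped
    failures = flips.sum(axis=1) > d // 2    # decoding fails if a majority flipped
    return failures.mean()

# Sweep the physical error rate for several distances; the curves cross
# near the threshold (p = 0.5 for this toy code).
for d in (11, 21, 41):
    rates = [logical_error_rate(d, p) for p in (0.40, 0.45, 0.50, 0.55)]
    print("d =", d, ["%.3f" % r for r in rates])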
[1] L. Guth, A. Lubotzky, Journal of Mathematical Physics 55, 082202 (2014).
[2] H. Bombin, Physical Review X 5 (3), 031043 (2015).
[3] M. Hastings, QIC 14, 1187 (2014).
[4] V. Londe, A. Leverrier, arXiv:1712.08578 (2017).
-
Area law of non-critical ground states in 1D long-range interacting systems
Tomotaka Kuwahara RIKEN
The area law for entanglement provides one of the most important connections between information theory and quantum many-body physics. It is related not only to the universality of quantum phases, but also to efficient numerical simulations of the ground state (i.e., the lowest-energy state). Various numerical observations have led to a strong belief that the area law is true for every non-critical phase in short-range interacting systems [1]. The so-called area-law conjecture states that the entanglement entropy is proportional to the surface area of the subsystem if the ground state is non-critical (or gapped).
However, the area law for long-range interacting systems is still elusive, as the long-range interactions result in correlation patterns similar to those in critical phases. Here, we show that for generic non-critical one-dimensional ground states, the area law robustly holds without any corrections even under long-range interactions [2]. Our result guarantees an efficient description of ground states by matrix-product states in experimentally relevant long-range systems, which justifies the density-matrix renormalization group algorithm. In this talk, I will give an overview of the results and sketch the ideas of the proof if time allows.
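For reference (standard definitions, restating the conjecture quoted above), the quantity in question is the von Neumann entanglement entropy of a subsystem A, and the area law asserts that it scales with the boundary rather than the volume of A:

S(\rho_A) = -\mathrm{Tr}\left[\rho_A \log \rho_A\right], \qquad S(\rho_A) = O(|\partial A|) .

In one dimension the boundary of an interval is of constant size, so the bound is a constant independent of the subsystem length; it is this constant bound that underlies the efficient matrix-product-state description mentioned above.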
[1] J. Eisert, M. Cramer, and M. B. Plenio, ``Colloquium: Area laws for the entanglement entropy,'' Rev. Mod. Phys. 82, 277–306 (2010).
[2] T. Kuwahara and K. Saito, ``Area law of non-critical ground states in 1d long-range interacting systems,'' arXiv preprint arXiv:1908.11547 (2019).
-
Fundamental Constraints for Fundamental Theories
Rachel Rosen Carnegie Mellon University
As our understanding of the universe and its fundamental building blocks extends to shorter and shorter distances, experiments capable of probing these scales are becoming increasingly difficult to construct. Fundamental particle physics faces a potential crisis: an absence of data at the shortest possible scales. Yet remarkably, even in the absence of experimental data, the requirement of theoretical consistency puts stringent constraints on viable models of fundamental particles and their interactions. In this talk I’ll discuss a variety of criteria that constrain theories of particles in flat spacetime and de Sitter. Such criteria have the possibility to address questions such as: What low energy theories admit consistent UV completions? Which massive particles are allowed in an interacting theory? Is string theory the unique weakly coupled UV completion of General Relativity?