Search results in Quantum Physics from PIRSA
Quantum Clocks
Time in quantum mechanics has rightly received a lot of attention over the years. The perfect clocks proposed so far, which can turn a particular interaction on or off at a precise time, exist only in infinite dimensions and have unphysical Hamiltonians (their spectra are unbounded from below). It was this observation that led many to conclude that a time operator cannot exist in quantum mechanics. Here, we prove rigorous results about the accuracy of finite-dimensional clocks and show that, under the right conditions, they can closely approximate their infinite-dimensional counterparts.
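The unboundedness claim in this abstract is Pauli's classic argument; a brief sketch for context (my own summary, not part of the talk):

```latex
% Suppose T were self-adjoint with [T, H] = i\hbar. Since the
% commutator is a c-number, the Baker--Campbell--Hausdorff series
% truncates and
%   e^{i\epsilon T/\hbar}\, H\, e^{-i\epsilon T/\hbar} = H - \epsilon ,
% so for any eigenstate H|E\rangle = E|E\rangle and any real \epsilon,
H \, e^{-i\epsilon T/\hbar} |E\rangle
  = e^{-i\epsilon T/\hbar} (H - \epsilon) |E\rangle
  = (E - \epsilon) \, e^{-i\epsilon T/\hbar} |E\rangle .
% The spectrum of H is invariant under arbitrary downward shifts,
% hence unbounded from below.
```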
-
Spectral graph theory applied to simulating stoquastic adiabatic optimization
Michael Jarret George Mason University
Quantum adiabatic optimization (QAO) slowly varies an initial Hamiltonian with an easy-to-prepare ground state into a final Hamiltonian whose ground state encodes the solution to some optimization problem. Currently, little is known about the performance of QAO relative to classical optimization algorithms, as we still lack strong analytic tools for analyzing its performance. In this talk, I will unify the problem of bounding the runtime of one such class of Hamiltonians, so-called stoquastic Hamiltonians, with questions about functions on graphs, heat diffusion, and classical sub-stochastic processes. I will introduce new tools for bounding the spectral gap of stoquastic Hamiltonians and, by exploiting heat diffusion, show that one of these techniques also provides an optimal and previously unknown gap bound for particular classes of graphs. Using this intuition and combining heat diffusion with classical sub-stochastic processes, I will offer a classical adiabatic algorithm that exhibits behavior typically considered "quantum", such as tunneling.
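As a toy illustration of the graph-spectral viewpoint (my own minimal example, not the speaker's construction): the graph Laplacian of a path graph is a simple stoquastic Hamiltonian, and its spectral gap has a closed form against which a numerical computation can be checked.

```python
import numpy as np

def path_laplacian(n):
    """Graph Laplacian L = D - A of the path graph on n vertices.

    L is stoquastic: all of its off-diagonal entries are non-positive.
    """
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

def spectral_gap(H):
    """Difference between the two lowest eigenvalues of H."""
    evals = np.linalg.eigvalsh(H)
    return evals[1] - evals[0]

# The path-graph Laplacian has eigenvalues 2(1 - cos(k*pi/n)),
# so its gap is 2(1 - cos(pi/n)).
n = 8
assert abs(spectral_gap(path_laplacian(n)) - 2 * (1 - np.cos(np.pi / n))) < 1e-10
```

The gap here shrinks as O(1/n^2), the kind of scaling that controls the runtime of an adiabatic algorithm.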
-
An axiomatic avenue to AdS/CFT
Cédric Bény Leibniz University Hannover
I will review a recent proposal for a top-down approach to AdS/CFT by A. Schwarz, which has the advantage of requiring few assumptions or extraneous knowledge, and may be of benefit to information theorists interested in the connections with tensor networks. I will also discuss ways to extend this approach from the Euclidean formalism to a real-time picture, and potential relationships with MERA.
-
Error-correction in non-abelian anyon models
Courtney Brell University College London
Scalable anyonic topological quantum computation requires the error-correction of non-abelian anyon systems. In contrast to abelian topological codes such as the toric code, the design, modelling, and simulation of error-correction protocols for non-abelian anyon codes is still in its infancy. Using a phenomenological noise model, we adapt abelian topological decoding protocols to the non-abelian setting and simulate their behaviour. We also show how to simulate error-correction in universal anyon models by exploiting the special structure of typical noise patterns.
-
Technology as Foundation, from the Quantum Informational Viewpoint: The Future (and some Past) of Quantum Theory after the Higgs Boson
Arkady Plotnitsky Purdue University
The talk first offers a brief assessment of the realist and nonrealist understandings of quantum theory, in relation to the role of probability and statistics there, from the perspective of quantum information theory, in part in view of several recent developments in quantum information theory in the work of M. G. D’Ariano and L. Hardy, among others. It then argues that what most essentially defines quantum theory, both quantum mechanics and quantum field theory, including as concerns realism or the lack thereof and its probability and statistics, is a new (vs. classical physics or relativity) role of technology in quantum physics. This role was first considered by Bohr in his analysis of the fundamental role of measuring instruments in the constitution of quantum phenomena, which, he argued, is responsible for the difficulties of providing a realist description of quantum objects and their behavior and, correlatively, for the irreducibly probabilistic or statistical nature of all quantum predictions. In this paper, I mean “technology” in a broader sense, akin to what the ancient Greeks called “tekhne” (“technique”). It refers to the means by which we create new mental and material constructions, such as mathematical, scientific, or philosophical theories, works of art and architecture, or machines, and through which we interact with the world. I shall consider three forms of technology: mathematical, experimental, and digital. The relationships among them were crucial to the discovery of the Higgs boson and, I argue, are likely to remain equally crucial, indeed unavoidable, in the future of physics, especially quantum physics.
-
What the Reeh-Schlieder theorem tells us about relativistic causality, or, Can experimenters in a lab on Earth create a Taj Mahal on the back of the moon?
Wayne Myrvold Western University
The Reeh-Schlieder theorem says, roughly, that, in any reasonable quantum field theory, for any bounded region of spacetime R, any state can be approximated arbitrarily closely by operating on the vacuum state (or any state of bounded energy) with operators formed by smearing polynomials in the field operators with functions having support in R. This strikes many as counterintuitive, and Reinhard Werner has glossed the theorem as saying that “By acting on the vacuum with suitable operations in a terrestrial laboratory, an experimenter can create the Taj Mahal on (or even behind) the Moon!” This talk has two parts. First, I hope to convince listeners that the theorem is not counterintuitive, and that it follows immediately from facts that are already familiar fare to anyone who has digested the opening chapters of any standard introductory textbook of QFT. In the second, I will discuss what we can learn from the theorem about how relativistic causality is implemented in quantum field theories.
-
Fault-tolerant error correction with the gauge color code
Benjamin Brown University of Sydney
The gauge color code is a quantum error-correcting code with local syndrome measurements that, remarkably, admits a universal transversal gate set without the need for resource-intensive magic state distillation. A result of recent interest, proposed by Bombin, shows that the subsystem structure of the gauge color code admits an error-correction protocol that achieves tolerance to noisy measurements without the need for repeated measurements, so-called single-shot error correction. Here, we demonstrate the promise of single-shot error correction by designing a two-part decoder and investigating its performance. We simulate fault-tolerant error correction with the gauge color code by repeatedly applying our proposed error-correction protocol to deal with errors that occur continuously to the underlying physical qubits of the code over the duration that quantum information is stored. We estimate a sustainable error rate, i.e., the threshold in the long-time limit, of ~0.31% for a phenomenological noise model using a simple decoding algorithm.
-
Learning quantum models for physical and non-physical data
In this talk I address the problem of simultaneously inferring unknown quantum states and unknown quantum measurements from empirical data. This task goes beyond state tomography because we are not assuming anything about the measurement devices. I am going to talk about the time and sample complexity of the inference of states and measurements, and I am going to talk about the robustness of the minimal Hilbert space dimension. Moreover, I will describe a simple heuristic algorithm (alternating optimization) to fit states and measurements to empirical data. For this algorithm the dataset does not need to be quantum. Hence, the proposed algorithm enables us to interpret general datasets from a quantum perspective. By analyzing movie ratings, we demonstrate the power of quantum models in the context of item recommendation which is a key discipline in machine learning. We observe that quantum models can compete with state-of-the-art algorithms for item recommendation. Based on joint work with Aram Harrow. Relevant preprints: arXiv:1412.7437 and arXiv:1510.02800.
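The alternating-optimization idea can be conveyed with a purely classical stand-in (all names and dimensions below are my own illustration, not the authors' algorithm): alternately fit "states" and "effects" to a probability table of exact rank d by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "empirical" table p[i, j], generated by hidden d-dimensional
# state vectors s_i and effect vectors m_j.
d, n_states, n_meas = 3, 20, 15
S_true = rng.random((d, n_states))
M_true = rng.random((d, n_meas))
P = S_true.T @ M_true

# Alternating optimization: fix the states, solve for the effects by
# least squares, then swap roles, and repeat.
S = rng.random((d, n_states))
M = rng.random((d, n_meas))
for _ in range(50):
    M = np.linalg.lstsq(S.T, P, rcond=None)[0]    # fit effects
    S = np.linalg.lstsq(M.T, P.T, rcond=None)[0]  # fit states

# Because P has exact rank d, the alternation recovers a perfect fit.
assert np.linalg.norm(S.T @ M - P) < 1e-6
```

The quantum version additionally constrains the factors (positivity, normalization), but each alternating step keeps the same fix-one-solve-the-other structure.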
-
C*-algebras as cosheaves with self-action
Cecilia Flori Libera Università Internazionale degli Studi Sociali Guido Carli
In this talk we will discuss how C*-algebras can be identified and characterised in terms of certain cosheaves with self-action. The reason we are interested in such a study is to try to give a rigorous mathematical derivation of the axioms of quantum theory. In particular, many of the standard axioms for C*-algebras have unclear physical and operational meaning, but by defining an equivalence of categories between C*-algebras and cosheaves with self-action, we believe that these axioms can acquire a clear operational meaning. So far, however, we have only managed to show that every C*-algebra becomes a cosheaf with self-action, but we don't know whether the converse holds as well. This work is done in collaboration with Tobias Fritz.
-
Entanglement can be made robust [Joint work with Aram Harrow]
Lior Eldar Massachusetts Institute of Technology (MIT)
The accumulated intuition from the last decades of research on quantum entanglement is that this phenomenon is highly non-robust and very hard to maintain in the presence of decohering noise at non-zero temperatures. In recent years, however, motivated in part by the quest for a quantum analog of the PCP theorem, researchers have tried to establish, at least in theory, whether we can preserve quantum entanglement at "constant" temperatures that are independent of system size. This would imply that any quantum state with energy at most, say, 0.05 of the total available energy of the Hamiltonian would be highly entangled.
A conjecture formalizing this notion, called NLTS, was put forward by Freedman and Hastings: it stipulates the existence of locally defined quantum systems that retain long-range entanglement even at high temperatures. Such a conjecture not only presents a necessary condition for quantum PCP but also poses a fundamental question about the nature of entanglement itself. To date, no such systems have been found; moreover, it has become evident that even embedding local Hamiltonians on robust, albeit "non-physical", topologies, namely expanders, does not guarantee entanglement robustness.
In this study, we refute the intuition that entanglement is inherently fragile: we show that locally defined quantum systems can, in fact, retain long-range entanglement at high temperatures. To do this, we construct an explicit family of 7-local Hamiltonians and prove that for such local Hamiltonians ANY low-energy state is hard to even approximately simulate by quantum circuits of depth o(log n). In particular, this resolves the NLTS conjecture in the affirmative and suggests the existence of quantum systems whose low-energy states are not only highly entangled but also "usefully" entangled, in the complexity-theoretic sense.
-
Characterizing the coherence of errors
Joel Wallman Institute for Quantum Computing (IQC)
I will introduce the unitarity, a parameter quantifying the coherence of a channel, and show that it is useful for two reasons. First, it can be efficiently estimated via a variant of randomized benchmarking. Second, it captures useful information about the channel, such as the optimal fidelity achievable with unitary corrections and an improved bound on the diamond distance.
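For a trace-preserving qubit channel, the unitarity reduces to the squared norm of the unital block of the Pauli transfer matrix, u = Tr(Tu^T Tu)/(d^2 - 1). A minimal numerical sketch of that formula (my own code, following the standard definition, not the speaker's):

```python
import numpy as np

# Pauli basis for one qubit.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

def ptm(kraus):
    """Pauli transfer matrix T_ij = Tr[P_i E(P_j)] / 2 of a qubit channel."""
    T = np.zeros((4, 4))
    for i, Pi in enumerate(paulis):
        for j, Pj in enumerate(paulis):
            out = sum(K @ Pj @ K.conj().T for K in kraus)
            T[i, j] = np.real(np.trace(Pi @ out)) / 2
    return T

def unitarity(kraus):
    """u = Tr(Tu^T Tu) / (d^2 - 1), with Tu the unital block of the PTM."""
    Tu = ptm(kraus)[1:, 1:]
    return np.trace(Tu.T @ Tu) / 3

# Depolarizing channel rho -> p*rho + (1-p)*I/2 has unitarity p^2,
# while any unitary channel has unitarity 1.
p = 0.9
dep = [np.sqrt((1 + 3 * p) / 4) * I,
       np.sqrt((1 - p) / 4) * X,
       np.sqrt((1 - p) / 4) * Y,
       np.sqrt((1 - p) / 4) * Z]
assert abs(unitarity(dep) - p ** 2) < 1e-10
assert abs(unitarity([X]) - 1) < 1e-10
```

The randomized-benchmarking estimator discussed in the talk accesses this same quantity without ever reconstructing the transfer matrix.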
-
Self-guided quantum systems
Chris Ferrie University of Waterloo
I’ll present new approaches to the problems of quantum control and quantum tomography wherein no classical simulation is required. The experiment itself performs the simulation (in situ) and, in a sense, guides itself to the correct solution. The algorithm is iterative and makes use of ideas from stochastic optimization theory.
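Self-guided schemes of this kind commonly build on simultaneous perturbation stochastic approximation (SPSA), which needs only two objective evaluations per step, no matter the dimension. A purely classical sketch of the update rule (the objective below is a deterministic toy standing in for an infidelity measured in situ; gains and names are my own choices):

```python
import numpy as np

def spsa_minimize(f, x0, n_iter=500, a=0.2, c=0.1, seed=1):
    """Simultaneous perturbation stochastic approximation (SPSA).

    Each step estimates the gradient from just two evaluations of f,
    perturbing all coordinates at once -- the kind of update a
    self-guided experiment can perform on measured values.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602          # standard SPSA gain schedules
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        grad = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck) * delta
        x -= ak * grad
    return x

# Toy objective standing in for an infidelity measured in situ.
target = np.array([0.3, -0.7])
infidelity = lambda x: np.sum((x - target) ** 2)
x = spsa_minimize(infidelity, [0.0, 0.0])
assert np.linalg.norm(x - target) < 0.05
```

Because the cost of a step is independent of the number of parameters, the experiment can "guide itself" toward the optimum without any classical simulation of its own dynamics.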