Search results in Quantum Physics from PIRSA
Direct Detection of Classically Undetectable Dark Matter through Quantum Decoherence
Jess Riedel NTT Research
Although various pieces of indirect evidence about the nature of dark matter have been collected, its direct detection has eluded experimental searches despite extensive effort. If the mass of dark matter is below 1 MeV, it is essentially imperceptible to conventional detection methods because negligible energy is transferred to nuclei during collisions. Here I propose directly detecting dark matter through the quantum decoherence it causes rather than its classical effects such as recoil or ionization. I show that quantum spatial superpositions are sensitive to low-mass dark matter that is inaccessible to classical techniques. This provides new independent motivation for matter interferometry with large masses, especially on spaceborne platforms. The apparent dark matter wind we experience as the Sun travels through the Milky Way ensures interferometers and related devices are directional detectors, and so are able to provide unmistakable evidence that decoherence has galactic origins.
Quantum Mechanics as Classical Physics
Charles Sebens University of Michigan–Ann Arbor
On the face of it, quantum physics is nothing like classical physics. Despite its oddity, work in the foundations of quantum theory has provided some palatable ways of understanding this strange quantum realm. Most of our best theories take that story to include the existence of a very non-classical entity: the wave function. Here I offer an alternative which combines elements of Bohmian mechanics and the many-worlds interpretation to form a theory in which there is no wave function. According to this theory, all there is at the fundamental level are particles interacting via Newtonian forces. In this sense, the theory is classical. However, it is still undeniably strange as it posits the existence of many worlds. Unlike the many worlds of the many-worlds interpretation, these worlds are fundamental, not emergent, and are interacting, not causally isolated. The theory will be presented as a fusion of the many-worlds interpretation and Bohmian mechanics, but can also be seen as a foundationally clear version of quantum hydrodynamics. A key strength of this theory is that it provides a simple and compelling story about the connection between the amplitude-squared of the wave function and probability. The theory also gives a natural explanation of the way the wave function transforms under time reversal and Galilean boosts.
Homological Product Codes
Sergey Bravyi IBM (United States)
Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. Here we propose the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most √n qubits, where n is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. Our proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight n^a for any a>0. Finally, we apply the homological product to construct new small codes with low-weight stabilizers. This is joint work with Matthew Hastings.
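The homological product in the abstract is defined at the level of chain complexes; a minimal, numerically checkable cousin is the hypergraph product of Tillich and Zémor, which likewise turns classical parity-check matrices into commuting CSS stabilizers. A sketch (illustrative only, not the construction from the talk):

```python
import numpy as np

def hypergraph_product(H1, H2):
    """Hypergraph product of two classical parity-check matrices over GF(2).

    Returns CSS check matrices (Hx, Hz) satisfying Hx @ Hz.T = 0 (mod 2),
    i.e. the X- and Z-type stabilizers commute.
    """
    m1, n1 = H1.shape
    m2, n2 = H2.shape
    Hx = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                    np.kron(np.eye(m1, dtype=int), H2.T)]) % 2
    Hz = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                    np.kron(H1.T, np.eye(m2, dtype=int))]) % 2
    return Hx, Hz

# Classical [3,1,3] repetition code as both input codes.
H = np.array([[1, 1, 0],
              [0, 1, 1]])
Hx, Hz = hypergraph_product(H, H)

# CSS commutation condition holds term-by-term: H1 (x) H2^T appears twice
# in the product Hx @ Hz.T and cancels mod 2.
assert not np.any((Hx @ Hz.T) % 2)
print(Hx.shape, Hz.shape)  # 13 physical qubits: 3*3 + 2*2
```

Each row of `Hx` and `Hz` is sparse whenever the classical matrices are, which is what makes product constructions natural sources of low-weight stabilizers.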
Bounding the Elliptope of Quantum Correlations & Proving Separability in Mixed States
Elie Wolfe Perimeter Institute for Theoretical Physics
We present a method for determining the maximum possible violation of any linear Bell inequality permitted by quantum mechanics. Essentially this amounts to a constrained optimization problem over an observable's eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door to an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting QM with superficially similar general probabilistic theories. We use such marginal-involving quantum bounds to estimate the volume of all possible quantum statistics in the complete 8-dimensional probability space of the Bell-CHSH scenario, measured relative to both local hidden variable models and general no-signaling theories. See arXiv:1106.2169. Time permitting, we'll also discuss how one might go about trying to prove that a given mixed state is, in fact, not entangled. (The converse problem of certifying non-zero entanglement has received extensive treatment already.) Instead of directly asking whether any separable representation exists for the state, we suggest simply checking whether it "fits" some particular known-separable form. We demonstrate how a surprisingly valuable sufficient separability criterion follows merely from considering a highly generic separable form. The criterion we obtain for diagonally symmetric mixed states appears to be completely tight, i.e. both necessary and sufficient. We use integration to quantify the "volume" of states captured by our criterion, and show that it is as large as the volume of states associated with the PPT criterion; this simultaneously proves our criterion to be necessary and the PPT criterion to be sufficient on this family of states. The utility of a sufficient separability criterion is evidenced by categorically rejecting Dicke-model superradiance as an entanglement-generation scheme. See arXiv:1307.5779.
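The eigenvalue-optimization viewpoint mentioned at the start of the abstract can be illustrated on the simplest case (a toy sketch, not the authors' method): for CHSH, the maximal quantum value is the largest eigenvalue of the Bell operator, which for the optimal measurement settings reaches Tsirelson's bound 2√2.

```python
import numpy as np

# Pauli observables
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Optimal CHSH settings: Alice measures Z, X; Bob measures (Z±X)/sqrt(2).
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

# Bell operator for <A0 B0> + <A0 B1> + <A1 B0> - <A1 B1>
bell = (np.kron(A0, B0) + np.kron(A0, B1)
        + np.kron(A1, B0) - np.kron(A1, B1))

# Largest eigenvalue = maximal quantum value, Tsirelson's bound 2*sqrt(2),
# exceeding the local-hidden-variable bound of 2.
tsirelson = np.linalg.eigvalsh(bell).max()
print(tsirelson)  # ~2.828
```

The eigenvector attaining the maximum is the singlet-like Bell state; the abstract's contribution is extending this kind of bound to arbitrary linear Bell inequalities with non-random marginals.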
Quantum Adversary (Upper) Bound
Shelby Kimmel Massachusetts Institute of Technology (MIT)
I discuss a technique, the quantum adversary upper bound, that uses the structure of quantum algorithms to gain insight into the quantum query complexity of Boolean functions. Using this bound, I show that there must exist an algorithm for a certain Boolean formula that uses a constant number of queries. Since the method is non-constructive, it gives no information about the form of the algorithm. After describing the technique and applying it to a class of functions, I will outline quantum algorithms that match the non-constructive bound.
Light and matter: towards macroscopic quantum systems
Jacob Taylor Office of Science and Technology Policy
Advances in quantum engineering and material science are enabling new approaches for building systems that behave quantum mechanically on long time scales and large length scales. I will discuss how microwave and optical technologies in particular are leading to new domains of many-body physics, both classical and quantum, using photons and phonons as the constituent particles. Furthermore, I will highlight practical consequences of these advances, including improved force and acceleration sensing, efficient signal transduction, and topologically robust photonic circuits. Finally, I will consider how such large quantum systems may help us measure and constrain theories of quantum gravity and gravity-induced decoherence.
Bulk-boundary correspondence in PEPS
Ignacio Cirac Max Planck Institute of Quantum Optics (MPQ)
TBA
Quantum mechanics as an operationally time symmetric probabilistic theory
Ognyan Oreshkov Université Libre de Bruxelles
The standard formulation of quantum mechanics is operationally asymmetric with respect to time reversal: in the language of compositions of tests, tests in the past can influence the outcomes of tests in the future but not the other way around. The question of whether this represents a fundamental asymmetry or is an artifact of the formulation is not a new one, but even though various arguments in favor of an inherent symmetry have been made, no complete time-symmetric formulation expressed in rigorous operational terms has been proposed. Here, we discuss such a possible formulation based on a generalization of the usual notion of test. We propose to regard as a test any set of events between an input and an output system which can be obtained by an autonomously defined laboratory procedure. This includes standard tests, as well as proper subsets of the complete set of outcomes of standard tests, whose realization may require post-selection in addition to pre-selection. In this approach, tests are not expected to be operations that are up to the choices of agents; the theory simply says which circuits of tests may occur and what the probabilities for their outcomes would be, given that they occur. By virtue of the definition of test, the probabilities for the outcomes of past tests can depend on tests that take place in the future. Such theories have been previously called non-causal, but here we revisit that notion of causality. Using the Choi-Jamiolkowski isomorphism, every test in this formulation, commonly regarded as inducing a transformation from an input to an output system, becomes equivalent to a passive detection measurement applied jointly on two input systems, one from the past and one from the future. This is closely related to the two-state vector formalism, but it comes with a conceptual revision: every measurement is a joint measurement on two separate systems and not on one system described by states in the usual Hilbert space and its dual.
We thus obtain a static picture of quantum mechanics in space-time or more general structures, in which every experiment is a local measurement on a global quantum state that generalizes the recently proposed quantum process matrix. The existence of two types of systems in the proposed formalism allows us to define causation in terms of correlations without invoking the idea of intervention, offering a possible answer to the problem of the meaning of causation. The framework is naturally compatible with closed time-like curves and other exotic causal structures.
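The Choi-Jamiolkowski isomorphism invoked in the abstract can be made concrete in a few lines. A generic sketch (not specific to the speaker's process-matrix formalism): build the Choi matrix of a single-qubit depolarizing channel and verify the standard facts that complete positivity corresponds to positive semidefiniteness and trace preservation to the partial trace over the output being the identity.

```python
import numpy as np

def choi(kraus_ops, d=2):
    """Choi matrix J(E) = sum_ij |i><j| (x) E(|i><j|)."""
    J = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            Eij = np.zeros((d, d), dtype=complex)
            Eij[i, j] = 1.0
            out = sum(K @ Eij @ K.conj().T for K in kraus_ops)
            J += np.kron(Eij, out)
    return J

# Kraus operators of a single-qubit depolarizing channel, p = 0.3.
p = 0.3
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kraus = [np.sqrt(1 - 3 * p / 4) * I2] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]

J = choi(kraus)
# Complete positivity <=> the Choi matrix is positive semidefinite.
assert np.linalg.eigvalsh(J).min() > -1e-12
# Trace preservation <=> partial trace over the output system is the identity.
tr_out = np.einsum('iaja->ij', J.reshape(2, 2, 2, 2))
assert np.allclose(tr_out, I2)
```

Under the isomorphism, applying the channel and then measuring becomes a joint measurement on the input together with half of an entangled pair, which is the picture the abstract generalizes to future as well as past systems.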
A universal Hamiltonian simulator: the full characterization
Gemma De Las Cuevas Universität Innsbruck
We show that if the ground state energy problem of a classical spin model is NP-hard, then there exists a choice of parameters of the model such that its low energy spectrum coincides with the spectrum of *any* other model and, furthermore, the corresponding eigenstates match on a subset of its spins. This implies that all spin physics, for example all possible universality classes, arises in a single model. The latter property was recently introduced and called "Hamiltonian completeness", and it was shown that several different models have this property. We thus show that Hamiltonian completeness is essentially equivalent to the more familiar complexity-theoretic notion of NP-completeness. We also show that Hamiltonian completeness implies that the partition functions are the same. These results allow us to prove that the 2D Ising model with fields is Hamiltonian complete, a model substantially simpler than the previous examples of complete Hamiltonians. Joint work with Toby Cubitt.
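To fix ideas about the object of the completeness claim: the 2D Ising model with fields assigns each spin configuration s ∈ {-1,+1}^n the energy E(s) = -Σ_{<ij>} J_ij s_i s_j - Σ_i h_i s_i, and the hard problem is finding its ground state energy. A brute-force sketch on a tiny lattice (illustrative only; the NP-hardness and the completeness construction are about how this scales):

```python
import itertools
import numpy as np

def ising_energy(spins, J, h):
    """E(s) = -sum_{i<j} J[i,j] s_i s_j - sum_i h[i] s_i, with s_i in {-1,+1}."""
    s = np.array(spins)
    return -0.5 * s @ J @ s - h @ s  # J symmetric with zero diagonal

# 2x2 lattice, sites numbered 0 1 / 2 3, with nearest-neighbour bonds.
n = 4
J = np.zeros((n, n))
for i, j in [(0, 1), (2, 3), (0, 2), (1, 3)]:  # horizontal + vertical bonds
    J[i, j] = J[j, i] = 1.0
h = np.array([0.5, 0.2, 0.1, 0.4])  # local fields (arbitrary example values)

# Exhaustive search over all 2^n configurations.
ground = min(itertools.product([-1, 1], repeat=n),
             key=lambda s: ising_energy(s, J, h))
print(ground, ising_energy(ground, J, h))  # all spins up for these ferromagnetic couplings
```

The completeness result says that suitably chosen couplings and fields of this one model can reproduce the low-energy spectrum of any other classical spin model.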
A note on emergent classical behavior and approximations to decoherence functionals
Henrique Gomes University of Oxford
Although it has arguably become consequential only in the study of quantum cosmology, the question "Why do we observe a classical world?" has been one of the central preoccupations of quantum foundations. In the consistent histories formalism, the question is shifted to an analysis of the telltale sign of quantum mechanics: superposition of states. In that formalism, histories of the system which "decohere", i.e. fall out of superposition or have negligible interference, can be assigned a notion of classical probability. In this paper we use an extension of Kirchhoff's diffraction formula for wave functions on configuration spaces to give a different analysis and an approximation of decoherence. The Kirchhoff diffraction formula lies conveniently midway between path integrals, wave equations, and classical behavior. By using it, we formulate an approximate damping of the amplitude of superposed histories. The damping acts on each middle element c_n of the fine-grained history {c_α}, and is a function of the angle formed between the segments {c_{n-1}, c_n} and {c_n, c_{n+1}}, viewed as classical trajectories in configuration space. As an example we apply the formalism to a modified gravity theory in the ADM gravitational conformal superspace.
Asymptotically Optimal Topological Quantum Compiling
Vadym Kliuchnikov University of Waterloo
In a topological quantum computer, universality is achieved by braiding, and quantum information is natively protected from small local errors. We address the problem of compiling single-qubit quantum operations into braid representations for non-abelian quasiparticles described by the Fibonacci anyon model. We develop a probabilistic polynomial-time algorithm that outputs a braid pattern approximating a given single-qubit unitary to a desired precision. We also classify the single-qubit unitaries that can be implemented exactly by a Fibonacci anyon braid pattern and present an efficient algorithm to produce their braid patterns. Our techniques produce braid patterns that meet the uniform asymptotic lower bound on the compiled circuit depth and thus are asymptotically depth-optimal. Our compiled circuits are significantly shorter than those output by prior state-of-the-art methods, with improvements in depth by factors ranging from 20 to 1000 for precisions between 10^{−10} and 10^{−30}.
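For context on what is being compiled: the two elementary braids act on the single-qubit fusion space as explicit 2x2 unitaries built from the Fibonacci F and R matrices. A sketch in one common convention (phase conventions vary across the literature; this is background, not the compilation algorithm of the talk):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2   # golden ratio
tau = 1 / phi                # satisfies tau^2 + tau = 1

# F-matrix (change of fusion basis) and R-matrix (braiding phases)
F = np.array([[tau, np.sqrt(tau)],
              [np.sqrt(tau), -tau]])
R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

sigma1 = R          # braid the first pair of anyons
sigma2 = F @ R @ F  # F is its own inverse, so this is F R F^{-1}

# Both generators are unitary ...
for s in (sigma1, sigma2):
    assert np.allclose(s @ s.conj().T, np.eye(2))
# ... and satisfy the braid (Yang-Baxter) relation.
assert np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2)
```

Compiling a target unitary means finding a short word in sigma1, sigma2 and their inverses that approximates it; the abstract's result is that this can be done with asymptotically optimal word length.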
Simulating Hamiltonian evolution with quantum computers
Richard Cleve Institute for Quantum Computing (IQC)
In 1982, Richard Feynman proposed the concept of a quantum computer as a means of simulating physical systems that evolve according to the Schrödinger equation. I will explain various quantum algorithms that have been proposed for this simulation problem, including my recent work (jointly with Dominic Berry and Rolando Somma) that significantly improves the running time as a function of the precision of the output data.
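The baseline these algorithms improve on is the first-order Trotter product formula, which is easy to demonstrate numerically. A generic sketch (not the Berry-Cleve-Somma algorithm, whose better scaling in the precision is the point of the talk):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy two-qubit Hamiltonian split into two non-commuting terms.
H1 = np.kron(X, X)
H2 = np.kron(Z, np.eye(2))

def evolve(H, t):
    """Exact unitary e^{-iHt} for Hermitian H, via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

t = 1.0
exact = evolve(H1 + H2, t)

def trotter(r):
    """First-order product formula: (e^{-i H1 t/r} e^{-i H2 t/r})^r."""
    step = evolve(H1, t / r) @ evolve(H2, t / r)
    return np.linalg.matrix_power(step, r)

# The error shrinks as O(t^2 / r): ten times more steps,
# roughly ten times smaller error.
errs = [np.linalg.norm(trotter(r) - exact, 2) for r in (10, 100, 1000)]
print(errs)
```

The linear-in-1/r convergence is exactly what makes high-precision simulation expensive for Trotter methods, motivating algorithms whose cost scales only polylogarithmically with the inverse precision.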