Search results in Quantum Physics from PIRSA
Playing the quantum harp: from quantum metrology to quantum computing with harmonic oscillators
Olivier Pfister University of Virginia
The "frequency comb" defined by the eigenmodes of an optical resonator is a naturally large set of exquisitely well defined quantum systems, as in the broadband mode-locked lasers which have redefined time/frequency metrology and ultraprecise measurements in recent years. High coherence can therefore be expected in the quantum version of the frequency comb, in which nonlinear interactions couple different cavity modes, as can be modeled by different forms of graph states. We show that it is thereby possible to generate states of interest to quantum metrology and computing, such as multipartite entangled cluster and Greenberger-Horne-Zeilinger states.
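For readers who want a reminder of the state families named above (a gloss added for context, not part of the abstract), written here in qubit form; the continuous-variable versions relevant to frequency combs are defined by analogous quadrature correlations:

\[
|\mathrm{GHZ}_N\rangle = \frac{1}{\sqrt{2}}\left(|0\rangle^{\otimes N} + |1\rangle^{\otimes N}\right),
\qquad
|C_G\rangle = \prod_{(i,j)\in E(G)} \mathrm{CZ}_{ij}\,|+\rangle^{\otimes N},
\]

where the graph G records which modes are coupled and CZ denotes the controlled-phase gate.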
Experimental Quantum Error Correction
Raymond Laflamme Institute for Quantum Computing (IQC)
The Achilles' heel of quantum information processors is the fragility of quantum states and processes. Without a method to control the imperfection and imprecision of quantum devices, the probability that a quantum computation succeeds decreases exponentially in the number of gates it requires. In the last ten years, building on the discovery of quantum error correction, accuracy threshold theorems were proved showing that errors can be controlled using a reasonable amount of resources as long as the error rate is smaller than a certain threshold. We thus have a scalable theory describing how to control quantum systems. I will briefly review some of the assumptions of the accuracy threshold theorems and comment on experiments that have been done, and should be done, to turn quantum error correction into an experimental reality.
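For orientation (a standard back-of-the-envelope form of the threshold theorem, added here for context; the precise statements in the talk depend on the code and noise model), concatenating a distance-3 code k times suppresses the logical error rate as

\[
p_L \;\approx\; p_{\mathrm{th}}\left(\frac{p}{p_{\mathrm{th}}}\right)^{2^{k}},
\]

so for a physical error rate p below the threshold p_th the failure probability falls doubly exponentially in the concatenation level, while the resource overhead grows only polylogarithmically in the target accuracy.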
Hamiltonian Quantum Cellular Automata in 1D
Pawel Wocjan University of Central Florida
We construct a simple translationally invariant, nearest-neighbor Hamiltonian on a chain of 10-dimensional qudits that makes it possible to realize universal quantum computing without any external control during the computational process, requiring only initial product state preparation. Both the quantum circuit and its input are encoded in an initial canonical basis state of the qudit chain. The computational process is then carried out by the autonomous Hamiltonian time evolution. After a time greater than a polynomial in the size of the quantum circuit has passed, the result of the computation can be obtained with high probability by measuring a few qudits in the computational basis. This result also implies that there cannot exist efficient classical simulation methods for generic translationally invariant nearest-neighbor Hamiltonians on qudit chains, unless quantum computers can be efficiently simulated by classical computers (or, put in complexity-theoretic terms, unless BPP = BQP). This is joint work with Daniel Nagaj.
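Schematically (our notation, not the authors'), the protocol described above amounts to

\[
|\psi(t)\rangle = e^{-iHt}\,\bigl|x_{\mathrm{circuit}},\,x_{\mathrm{input}}\bigr\rangle, \qquad t \gtrsim \mathrm{poly}\bigl(|C|\bigr),
\]

after which measuring a few designated qudits in the computational basis yields the circuit's output with high probability.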
Entanglement Renormalization, Quantum Criticality and Topological Order
Guifre Vidal Alphabet (United States)
The renormalization group (RG) is one of the conceptual pillars of statistical mechanics and quantum field theory, and a key theoretical element in the modern formulation of critical phenomena and phase transitions. RG transformations are also the basis of numerical approaches to the study of low energy properties and emergent phenomena in quantum many-body systems. In this colloquium I will introduce the notion of "entanglement renormalization" and use it to define a coarse-graining transformation for quantum systems on a lattice [G. Vidal, Phys. Rev. Lett. 99, 220405 (2007)]. The resulting real-space RG approach is able to numerically address 1D and 2D lattice systems with thousands of quantum spins using only very modest computational resources. From the theoretical point of view, entanglement renormalization sheds new light on the structure of correlations in the ground state of extended quantum systems. I will discuss how it leads to a novel, efficient representation for the ground state of a system at a quantum critical point or with topological order.
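As a toy illustration of the coarse-graining step (a minimal sketch under assumptions of our own, omitting the disentanglers that full entanglement renormalization also uses, and not code from the talk), the snippet below builds an isometry mapping two spin-1/2 sites to one effective site and renormalizes a two-site operator:

    import numpy as np

    d, chi = 2, 2                      # physical and coarse-grained local dimensions
    rng = np.random.default_rng(0)

    # Random isometry w: (d*d) -> chi with w @ w.conj().T = identity on the coarse level
    q, _ = np.linalg.qr(rng.normal(size=(d * d, d * d)))
    w = q[:, :chi].T                   # shape (chi, d*d)
    assert np.allclose(w @ w.conj().T, np.eye(chi))

    # A two-site operator to coarse-grain, e.g. the Ising coupling Z (x) Z
    Z = np.diag([1.0, -1.0])
    o_two_site = np.kron(Z, Z)

    # Renormalized one-site operator on the coarse-grained lattice
    o_coarse = w @ o_two_site @ w.conj().T
    print(o_coarse)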
How Difficult is Quantum Many-Body Theory?
Matt Hastings Los Alamos National Laboratory
The basic problem of much of condensed matter and high energy physics, as well as quantum chemistry, is to find the ground state properties of some Hamiltonian. Many algorithms have been invented to deal with this problem, each with different strengths and limitations. Ideas such as entanglement entropy from quantum information theory and quantum computing enable us to understand the difficulty of various problems. I will discuss recent results on area laws and use these to prove that matrix product states can efficiently represent the ground states of one-dimensional systems with a spectral gap, while certain other one-dimensional problems, without the gap assumption, almost certainly admit no efficient classical representation of the ground state. I will also discuss recent results on higher-dimensional matrix product states, in an attempt to extend the remarkable success of matrix product algorithms beyond one dimension.
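To make the matrix product state representation concrete (an illustrative sketch in notation of our own, not the algorithms discussed in the talk), the N-qubit GHZ state can be written as an MPS with bond dimension 2, and any amplitude is obtained by multiplying small matrices:

    import numpy as np

    N = 8
    # One chi x chi matrix A[s] per local basis state s; chi = 2 suffices for GHZ.
    A = {0: np.array([[1.0, 0.0], [0.0, 0.0]]),
         1: np.array([[0.0, 0.0], [0.0, 1.0]])}
    left = np.array([1.0, 1.0]) / np.sqrt(2)   # superposes the all-0 and all-1 branches
    right = np.array([1.0, 1.0])

    def amplitude(bits):
        """Return <bits|GHZ_N> by contracting the matrix product from left to right."""
        v = left
        for s in bits:
            v = v @ A[s]
        return v @ right

    print(amplitude([0] * N))                  # 1/sqrt(2)
    print(amplitude([1] * N))                  # 1/sqrt(2)
    print(amplitude([0, 1] + [0] * (N - 2)))   # 0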
Distinguishability of Quantum Operations
Zhengfeng Ji University of Technology Sydney
In this talk, we will investigate the distinguishability of quantum operations from both discrete and continuous points of view. In the discrete case, the main topic is how we can identify quantum measurement apparatuses by considering the patterns of measurement outcomes. In the continuous case, we will focus on the efficiency of parameter estimation of quantum operations. We will discuss several methods that can achieve the Heisenberg limit and prove in some other cases the impossibility of surpassing the standard quantum limit. The general treatment of estimation of quantum operations also allows an investigation of the effect of noise on estimation efficiency.
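As a reminder of the two scaling regimes mentioned above (standard definitions, added for context), estimating a phase φ with N probes gives an uncertainty of

\[
\Delta\varphi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}
\qquad\text{versus}\qquad
\Delta\varphi_{\mathrm{HL}} \sim \frac{1}{N},
\]

the former for uncorrelated probes (the standard quantum limit), the latter the best scaling allowed by quantum mechanics, typically requiring entangled probes (the Heisenberg limit).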
Charting the Shape of Hilbert Space: A Bit of Quantum Foundations at PI
Chris Fuchs University of Massachusetts Boston
As physicists, we have become accustomed to the idea that a theory's content is always most transparent when written in coordinate-free language. But sometimes the choice of a good coordinate system is very useful for settling deep conceptual issues. Think of how Eddington-Finkelstein coordinates settled the longstanding question of whether the event horizon of a Schwarzschild black hole corresponds to a real spacetime singularity or not. We believe the same holds for an information-oriented or Bayesian approach to quantum foundations: one good coordinate system may (eventually!) be worth more than a hundred blue-in-the-face arguments. This talk will motivate and chronicle the search for one such candidate coordinate system, the so-called symmetric informationally complete (SIC) measurement, which has caught the attention of a handful of us here at PI and a handful of our visitors.
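A minimal sketch of the object in question, for the simplest case d = 2 (an illustration under our own conventions, not code from the talk): the four states whose Bloch vectors point to the vertices of a regular tetrahedron form a symmetric informationally complete measurement, with every pairwise overlap equal to 1/(d+1) = 1/3.

    import numpy as np
    from itertools import combinations

    # Pauli matrices
    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    # Bloch vectors of a regular tetrahedron
    vecs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)

    # Rank-1 projectors rho_i = (I + r . sigma) / 2
    rhos = [(I + r[0] * X + r[1] * Y + r[2] * Z) / 2 for r in vecs]

    # SIC condition: Tr(rho_i rho_j) = 1/(d+1) = 1/3 for all i != j
    for i, j in combinations(range(4), 2):
        print(i, j, np.trace(rhos[i] @ rhos[j]).real)   # each is 1/3

    # The POVM elements E_i = rho_i / d resolve the identity
    print(np.allclose(sum(rhos) / 2, I))                # True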
Backward causation models for quantum correlations
Steve Weinstein University of Waterloo
Bell's theorem is commonly understood to show that EPR correlations are not explainable via a local hidden variable theory. But Bell's theorem assumes that the initial state of the particles is independent of the final detector settings. It has been proposed that this independence assumption might be undermined by a relativistically allowed form of "backward causation", thereby allowing construction of a local hidden-variable model after all. In this talk, I will show that there is no backward causation model which yields the desired correlations. However, there are other physical scenarios yielding nontrivial nonlocal correlations which violate Bell's independence assumption. I will present two.
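For reference (our own illustration of the correlations at issue, not material from the talk), the singlet-state correlation E(a, b) = -cos(a - b) violates the CHSH form of Bell's inequality, which bounds |S| by 2 for any local hidden-variable model satisfying the independence assumption:

    import numpy as np

    def E(a, b):
        """Singlet-state correlation for measurement angles a and b (radians)."""
        return -np.cos(a - b)

    # Standard CHSH settings
    a1, a2 = 0.0, np.pi / 2
    b1, b2 = np.pi / 4, 3 * np.pi / 4

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S), 2 * np.sqrt(2))   # both ~2.828, exceeding the local bound of 2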
Relating Entanglement to Quantum Communication
Jonathan Oppenheim University College London
Roughly speaking, the more Alice is entangled with Bob, the harder it is for her to send her state to Charlie. In particular, it will be shown that the squashed entanglement, a well known entanglement measure, gives the fastest rate at which a quantum state can be sent between two parties who share arbitrary side information. Likewise, the entanglement of formation and the entanglement cost are shown to give the fastest rate at which a quantum state can be sent when the parties have access to side information which is maximally correlated. A further restriction on the type of side information implies that the rate of state transmission is given by the quantum mutual information. This suggests a new paradigm for understanding entanglement and other correlations in terms of quantum Shannon theory. Different types of side information correspond to different types of correlations, with the squashed entanglement and the mutual information being two extremes. Furthermore, there is a dual paradigm: if one distributes the side information as maliciously as possible, so as to make the sending of the state as difficult as possible, one finds maximum rates which give interpretations to known quantities as well as new ones.
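For reference (a standard definition, added for context), the squashed entanglement of a bipartite state ρ_AB is

\[
E_{\mathrm{sq}}(\rho_{AB}) \;=\; \tfrac{1}{2}\,\inf_{\rho_{ABE}} I(A;B|E)_{\rho},
\]

where the infimum runs over all extensions ρ_ABE of ρ_AB and I(A;B|E) is the quantum conditional mutual information.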
Timeless Questions in the Decoherent Histories Approach to Quantum Theory
Petros Wallden University of Athens
In any attempt to construct a Quantum Theory of Gravity, one has to deal with the fact that Time in Quantum Mechanics appears to be very different from Time in General Relativity. This is the famous (or actually notorious!) "Problem of Time", and it gives rise to both conceptual and technical problems. The decoherent histories approach to quantum theory is an alternative formulation of quantum theory specially designed to deal with closed systems (with no external observer or environment). This approach has been considered particularly promising for dealing with the problem of time, since it puts space and time on an equal footing (unlike standard QM). This talk develops a particular implementation of these expectations: we construct a general set of "Class Operators" corresponding to questions that appear to be "Timeless" (independent of the parameter time) but nevertheless correspond to physically interesting questions. This is similar to finding a sufficiently general set of timeless observables in the evolving constants approach to the problem of time.
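As a reminder of the formalism referred to above (standard notation, added for orientation), a history α is represented by a class operator, and interference between histories is measured by the decoherence functional:

\[
C_{\alpha} = P^{t_n}_{\alpha_n}\cdots P^{t_1}_{\alpha_1},
\qquad
D(\alpha,\beta) = \mathrm{Tr}\!\left[C_{\alpha}\,\rho\,C_{\beta}^{\dagger}\right],
\]

where the P^{t_k}_{\alpha_k} are Heisenberg-picture projectors; probabilities are assigned to a set of histories only when the off-diagonal terms D(α, β), α ≠ β, (approximately) vanish.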
Quantum, classical & coarse-grained measurements
Johannes Kofler University of Vienna
The descriptions of the quantum realm and the macroscopic classical world differ significantly not only in their mathematical formulations but also in their foundational concepts and philosophical consequences. When and how physical systems cease to behave quantum mechanically and begin to behave classically is still heavily debated in the physics community and is the subject of theoretical and experimental research. Conceptually different from existing models, we have developed a novel theoretical approach to understanding this transition from the quantum to a macrorealistic world. It neither needs to refer to the environment of a system (decoherence) nor to change the quantum laws themselves (collapse models), but puts the stress on the limits of observability of quantum phenomena imposed by our measurement apparatuses. First, we demonstrate that for unrestricted measurement accuracy a system's time evolution cannot be described classically, not even if the system is arbitrarily large and macroscopic. Under realistic conditions in everyday life, however, we are only able to perform coarse-grained measurements and do not resolve individual quantum levels of the macroscopic system. As we show, this mere restriction to fuzzy measurements is sufficient to see the natural emergence of macroscopic realism and even the classical Newtonian laws out of the full quantum laws: the system's time evolution governed by the Schrödinger equation and the state projection induced by measurements. This resolves the apparent impossibility of classical realism and deterministic laws emerging out of fundamentally random quantum events. We find a sufficient condition for these classical evolutions for isolated systems under coarse-grained measurements. We then demonstrate that there nevertheless exist non-classical Hamiltonians which are in conflict with macroscopic realism. Thus, though at every instant of time the quantum state appears as a classical mixture, its time evolution cannot be understood classically. We argue why such Hamiltonians are unlikely to be realized in nature.
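A schematic way to write the coarse-graining described above (our notation, not the talk's): instead of projectors onto individual quantum levels, only "slot" projectors that bin many neighbouring levels are experimentally resolved,

\[
P_{\bar m} \;=\; \sum_{m \,\in\, \text{slot } \bar m} |m\rangle\langle m|,
\]

and macrorealism emerges when only such binned observables are accessible.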
Can Classical Description of Physical Reality Be Considered Complete?
Gabriel Catren Centre de Recherche en Epistémologie Appliquée
A conceptual framework is proposed for understanding the relationship between observables and operators in mechanics. We claim that the transformations generated by the objective properties of a physical system must be strictly interpreted as gauge transformations. It will be shown that this postulate cannot be consistently implemented in the framework of classical mechanics. We argue that the uncertainty principle is a consequence of the mutual intertwining between objective properties and gauge-dependent properties. Hence, in classical mechanics gauge-dependent properties are wrongly considered objective. It follows that the quantum description of objective physical states is not incomplete, but rather that the classical notion is overdetermined.