Homological Product Codes
Sergey Bravyi IBM (United States)
Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. Here we propose the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most √n qubits, where n is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. Our proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight n^a for any a>0. Finally, we apply the homological product to construct new small codes with low-weight stabilizers. This is joint work with Matthew Hastings. -
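A minimal sketch of the homological product of two 2-term GF(2) chain complexes, the operation named in the abstract. The repetition-code input and helper names below are illustrative choices, not the authors' randomized construction; the sketch only shows why the product of two boundary maps yields a valid CSS code.

```python
import numpy as np

def homological_product(dA, dB):
    """Boundary maps of the product of two 2-term GF(2) chain complexes.

    dA : C1_A -> C0_A and dB : C1_B -> C0_B are binary matrices.
    Returns (d2, d1) for the 3-term product complex
      D2 = A1(x)B1  ->  D1 = A0(x)B1 (+) A1(x)B0  ->  D0 = A0(x)B0.
    """
    a0, a1 = dA.shape
    b0, b1 = dB.shape
    d2 = np.vstack([np.kron(dA, np.eye(b1, dtype=int)),
                    np.kron(np.eye(a1, dtype=int), dB)]) % 2
    d1 = np.hstack([np.kron(np.eye(a0, dtype=int), dB),
                    np.kron(dA, np.eye(b0, dtype=int))]) % 2
    return d2, d1

# parity-check matrix of the classical [3,1] repetition code as a toy boundary map
d = np.array([[1, 1, 0],
              [0, 1, 1]])
d2, d1 = homological_product(d, d)
assert ((d1 @ d2) % 2 == 0).all()       # boundary of a boundary vanishes mod 2
Hz, Hx = d2.T % 2, d1                   # CSS code: qubits live on the middle term D1
assert ((Hx @ Hz.T) % 2 == 0).all()     # X- and Z-stabilizers commute
```

The identity ∂1∂2 = ∂_A⊗∂_B + ∂_A⊗∂_B = 0 (mod 2) is exactly the commutation condition Hx·Hzᵀ = 0 needed for a CSS stabilizer code.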
Bounding the Elliptope of Quantum Correlations & Proving Separability in Mixed States
Elie Wolfe Perimeter Institute for Theoretical Physics
We present a method for determining the maximum possible violation of any linear Bell inequality permitted by quantum mechanics. Essentially, this amounts to a constrained optimization problem over an observable's eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door to an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting QM with superficially similar general probabilistic theories. We use such marginal-involving quantum bounds to estimate the volume of all possible quantum statistics in the complete 8-dimensional probability space of the Bell-CHSH scenario, measured relative to both local hidden variable models and general no-signaling theories. See arXiv:1106.2169. Time permitting, we'll also discuss how one might go about trying to prove that a given mixed state is, in fact, not entangled. (The converse problem of certifying non-zero entanglement has already received extensive treatment.) Instead of directly asking whether any separable representation exists for the state, we suggest simply checking whether it "fits" some particular known-separable form. We demonstrate how a surprisingly valuable sufficient separability criterion follows merely from considering a highly generic separable form. The criterion we generate for diagonally-symmetric mixed states is apparently completely tight, i.e. both necessary and sufficient. We use integration to quantify the "volume" of states captured by our criterion, and show that it is as large as the volume of states associated with the PPT criterion; this simultaneously proves our criterion to be necessary and the PPT criterion to be sufficient on this family of states. The utility of a sufficient separability criterion is evidenced by categorically ruling out Dicke-model superradiance for entanglement generation schemes. See arXiv:1307.5779. -
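For the CHSH case, the eigenvalue-optimization viewpoint can be sketched with the standard Tsirelson-saturating observables. The particular operator choice below is an illustrative textbook example, not the paper's general method.

```python
import numpy as np

# Maximal quantum CHSH value as the largest eigenvalue of the Bell operator,
# evaluated at the standard optimal observables (an assumed, illustrative choice).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

bell = (np.kron(A0, B0) + np.kron(A0, B1)
        + np.kron(A1, B0) - np.kron(A1, B1))
tsirelson = np.linalg.eigvalsh(bell).max()
print(tsirelson)   # 2*sqrt(2) ≈ 2.828, above the local-hidden-variable bound of 2
```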
Quantum Adversary (Upper) Bound
Shelby Kimmel Massachusetts Institute of Technology (MIT)
I discuss a technique - the quantum adversary upper bound - that uses the structure of quantum algorithms to gain insight into the quantum query complexity of Boolean functions. Using this bound, I show that there must exist an algorithm for a certain Boolean formula that uses a constant number of queries. Since the method is non-constructive, it does not give information about the form of the algorithm. After describing the technique and applying it to a class of functions, I will outline quantum algorithms that match the non-constructive bound. -
Light and matter: towards macroscopic quantum systems
Jacob Taylor Office of Science and Technology Policy
Advances in quantum engineering and material science are enabling new approaches for building systems that behave quantum mechanically on long time scales and large length scales. I will discuss how microwave and optical technologies in particular are leading to new domains of many-body physics, both classical and quantum, using photons and phonons as the constituent particles. Furthermore, I will highlight practical consequences of these advances, including improved force and acceleration sensing, efficient signal transduction, and topologically robust photonic circuits. Finally, I will consider how such large quantum systems may help us measure and constrain theories of quantum gravity and gravity-induced decoherence. -
Bulk-boundary correspondence in PEPS
Ignacio Cirac Max Planck Institute for Gravitational Physics - Albert Einstein Institute (AEI)
TBA -
Quantum mechanics as an operationally time symmetric probabilistic theory
Ognyan Oreshkov Université Libre de Bruxelles
The standard formulation of quantum mechanics is operationally asymmetric with respect to time reversal---in the language of compositions of tests, tests in the past can influence the outcomes of tests in the future, but not the other way around. The question of whether this represents a fundamental asymmetry or is merely an artifact of the formulation is not a new one, but even though various arguments in favor of an inherent symmetry have been made, no complete time-symmetric formulation expressed in rigorous operational terms has been proposed. Here, we discuss such a possible formulation based on a generalization of the usual notion of test. We propose to regard as a test any set of events between an input and an output system that can be obtained by an autonomously defined laboratory procedure. This includes standard tests, as well as proper subsets of the complete set of outcomes of standard tests, whose realization may require post-selection in addition to pre-selection. In this approach, tests are not expected to be operations that are up to the choices of agents---the theory simply says what circuits of tests may occur and what the probabilities for their outcomes would be, given that they occur. By virtue of the definition of test, the probabilities for the outcomes of past tests can depend on tests that take place in the future. Such theories have previously been called non-causal, but here we revisit that notion of causality. Using the Choi-Jamiolkowski isomorphism, every test in this formulation, commonly regarded as inducing transformations from an input to an output system, becomes equivalent to a passive detection measurement applied jointly on two input systems---one from the past and one from the future. This is closely related to the two-state vector formalism, but it comes with a conceptual revision: every measurement is a joint measurement on two separate systems, and not on one system described by states in the usual Hilbert space and its dual. 
We thus obtain a static picture of quantum mechanics in space-time or more general structures, in which every experiment is a local measurement on a global quantum state that generalizes the recently proposed quantum process matrix. The existence of two types of systems in the proposed formalism allows us to define causation in terms of correlations without invoking the idea of intervention, offering a possible answer to the problem of the meaning of causation. The framework is naturally compatible with closed time-like curves and other exotic causal structures. -
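The Choi-Jamiolkowski correspondence invoked above can be made concrete in a few lines: a channel's action on states is recovered from a single positive matrix by a joint trace formula. The dephasing channel and helper names below are assumed examples for illustration, not content of the talk.

```python
import numpy as np

def choi(kraus_ops, d):
    """Choi matrix J = (I (x) L)(|Omega><Omega|), |Omega> = sum_i |ii> (unnormalized)."""
    omega = np.eye(d, dtype=complex).reshape(d * d)
    vecs = [np.kron(np.eye(d), K) @ omega for K in kraus_ops]
    return sum(np.outer(v, v.conj()) for v in vecs)

def apply_via_choi(J, rho, d):
    """Recover the channel action: L(rho) = Tr_1[(rho^T (x) I) J]."""
    M = (np.kron(rho.T, np.eye(d)) @ J).reshape(d, d, d, d)
    return np.einsum('iaib->ab', M)   # partial trace over the first factor

# dephasing channel with Kraus operators sqrt(p)*I and sqrt(1-p)*Z (illustrative)
p = 0.7
Z = np.diag([1.0, -1.0]).astype(complex)
ks = [np.sqrt(p) * np.eye(2), np.sqrt(1 - p) * Z]
rho = np.array([[0.6, 0.2], [0.2, 0.4]], dtype=complex)

direct = sum(K @ rho @ K.conj().T for K in ks)
via_choi = apply_via_choi(choi(ks, 2), rho, 2)
assert np.allclose(direct, via_choi)   # dynamical and static descriptions agree
```

The point of the sketch is the structural one made in the abstract: the "transformation" is fully encoded in a static object J on two systems, probed by a joint measurement.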
A universal Hamiltonian simulator: the full characterization
Gemma De Las Cuevas Universität Innsbruck
We show that if the ground state energy problem of a classical spin model is NP-hard, then there exists a choice of parameters of the model such that its low energy spectrum coincides with the spectrum of \emph{any} other model and, furthermore, the corresponding eigenstates match on a subset of its spins. This implies that all spin physics, for example all possible universality classes, arises in a single model. The latter property was recently introduced under the name ``Hamiltonian completeness'', and several different models were shown to have it. We thus show that Hamiltonian completeness is essentially equivalent to the more familiar complexity-theoretic notion of NP-completeness. Additionally, we show that Hamiltonian completeness implies that the partition functions coincide. These results allow us to prove that the 2D Ising model with fields is Hamiltonian complete, a substantially simpler example than the previously known complete Hamiltonians. Joint work with Toby Cubitt. -
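The ground state energy problem at the center of this reduction can be stated very concretely. A brute-force sketch for a toy 2x2 grid follows; the couplings and fields are our own illustrative parameters, not those of the completeness construction.

```python
import itertools
import numpy as np

def ising_ground_energy(J, h):
    """Brute-force ground-state energy of H = -sum_ij J_ij s_i s_j - sum_i h_i s_i
    (J symmetric with zero diagonal; the 0.5 undoes the double counting)."""
    n = len(h)
    best = np.inf
    for spins in itertools.product([-1, 1], repeat=n):
        s = np.array(spins)
        best = min(best, -0.5 * s @ J @ s - h @ s)
    return best

# 2x2 grid of the 2D Ising model with uniform ferromagnetic couplings and fields
J = np.zeros((4, 4))
for i, j in [(0, 1), (2, 3), (0, 2), (1, 3)]:   # grid edges
    J[i, j] = J[j, i] = 1.0
h = np.full(4, 0.5)
print(ising_ground_energy(J, h))   # -6.0: all four spins aligned up
```

The exhaustive loop takes time 2^n, which is exactly why NP-hardness of this problem is the currency the abstract trades in.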
A note on emergent classical behavior and approximations to decoherence functionals
Henrique Gomes University of Oxford
Although arguably it became consequential only with the study of quantum cosmology, the question ``Why do we observe a classical world?'' has been one of the biggest preoccupations of quantum foundations. In the consistent histories formalism, the question is shifted to an analysis of the telltale sign of quantum mechanics: superposition of states. There, histories of the system which ``decohere'', i.e. fall out of superposition or have negligible interference, can be subjected to a notion of classical probability. In this paper we use an extension of Kirchhoff's diffraction formula for wave functions on configuration spaces to give a different analysis and an approximation of decoherence. The Kirchhoff diffraction formula lies conveniently midway between path integrals, wave equations, and classical behavior. Using it, we formulate an approximate damping of the amplitude of superpositions of histories. The damping acts on each middle element of the fine-grained history {c_\alpha}, and is a function of the angle formed between {c_{n-1},c_n} and {c_n,c_{n+1}}, regarded as classical trajectories in configuration space. As an example we apply the formalism to a modified gravity theory in the ADM gravitational conformal superspace. -
Asymptotically Optimal Topological Quantum Compiling
Vadym Kliuchnikov University of Waterloo
In a topological quantum computer, universality is achieved by braiding, and quantum information is natively protected from small local errors. We address the problem of compiling single-qubit quantum operations into braid representations for non-abelian quasiparticles described by the Fibonacci anyon model. We develop a probabilistic polynomial-time algorithm that outputs a braid pattern approximating a given single-qubit unitary to a desired precision. We also classify the single-qubit unitaries that can be implemented exactly by a Fibonacci anyon braid pattern and present an efficient algorithm to produce their braid patterns. Our techniques produce braid patterns that meet the uniform asymptotic lower bound on the compiled circuit depth and thus are asymptotically depth-optimal. Our compiled circuits are significantly shorter than those output by prior state-of-the-art methods, resulting in improvements in depth by factors ranging from 20 to 1000 for precisions between 10^{−10} and 10^{−30}. -
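For orientation, the two elementary single-qubit braid generators of the Fibonacci model can be written down explicitly. The phase convention below follows one common choice in the topological-compiling literature and is an assumption on our part, not the paper's notation; the sketch only verifies the structural properties a compiler relies on.

```python
import numpy as np

tau = (np.sqrt(5) - 1) / 2     # inverse golden ratio, satisfying tau**2 + tau == 1

# Elementary braid generators on the one-qubit fusion subspace
# (phase convention assumed; other references differ by basis choice).
s1 = np.diag([np.exp(-4j * np.pi / 5), -np.exp(-2j * np.pi / 5)])
s2 = np.array([[-tau * np.exp(-1j * np.pi / 5),
                -1j * np.sqrt(tau) * np.exp(-1j * np.pi / 10)],
               [-1j * np.sqrt(tau) * np.exp(-1j * np.pi / 10),
                -tau]])

assert np.allclose(s1 @ s1.conj().T, np.eye(2))   # unitary
assert np.allclose(s2 @ s2.conj().T, np.eye(2))   # unitary
assert np.allclose(s1 @ s2 @ s1, s2 @ s1 @ s2)    # braid relation
```

A compiled braid pattern is then an ordered product of s1, s2 and their inverses (conjugate transposes); the compiling problem is to find a short such word close to a target unitary.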
Simulating Hamiltonian evolution with quantum computers
Richard Cleve Institute for Quantum Computing (IQC)
In 1982, Richard Feynman proposed the concept of a quantum computer as a means of simulating physical systems that evolve according to the Schrödinger equation. I will explain various quantum algorithms that have been proposed for this simulation problem, including my recent work (jointly with Dominic Berry and Rolando Somma) that significantly improves the running time as a function of the precision of the output data. -
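The simplest of these simulation algorithms, the first-order Lie-Trotter product formula, can be sketched classically on a toy one-qubit Hamiltonian. This is a generic illustration of the product-formula idea, not the improved algorithm of the talk.

```python
import numpy as np

def U(Hm, t):
    """exp(-i H t) for a Hermitian matrix H, via eigendecomposition."""
    w, v = np.linalg.eigh(Hm)
    return v @ np.diag(np.exp(-1j * w * t)) @ v.conj().T

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H, t = X + Z, 1.0             # toy Hamiltonian H = X + Z (illustrative)
exact = U(H, t)

def trotter(r):
    """First-order product formula: (e^{-iXt/r} e^{-iZt/r})^r ~ e^{-iHt}."""
    step = U(X, t / r) @ U(Z, t / r)
    return np.linalg.matrix_power(step, r)

errs = {r: np.linalg.norm(trotter(r) - exact) for r in (1, 10, 100)}
print(errs)   # error shrinks roughly like 1/r as the step count grows
```

The cost of driving the error down this way scales polynomially in 1/precision; the improvement mentioned in the abstract concerns precisely this dependence on precision.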
Can ‘sub-quantum’ theories based on a background field escape Bell’s no-go theorem?
Louis Vervoort Université de Montréal
In systems described by Ising-like Hamiltonians, such as spin lattices, the Bell inequality can be strongly violated. Surprisingly, these systems are both local and non-superdeterministic. They are local, because 1) they include only local, nearest-neighbor interactions, 2) they accordingly satisfy the Clauser-Horne factorability condition, and 3) they can violate the Bell inequality also in dynamic Bell experiments. Starting from this result we construct an elementary hidden-variable model, based on a generalized Ising Hamiltonian, describing the interaction of the Bell particles with a stochastic ‘background’ medium. We suggest that such a model is a simple version of a variety of recently developed ‘sub-quantum’ theories, by authors such as Nelson, Adler, De la Pena, Cetto, Groessing, and Khrennikov, all based on a background field. We investigate how the model might be turned into a realistic theory. Finally, it appears that background-based models can be tested and discriminated from quantum mechanics by a straightforward extension of existing experiments. -
Universal fault-tolerant quantum computation with only transversal gates and error correction
Transversal implementations of encoded unitary gates are highly desirable for fault-tolerant quantum computation. It is known, however, that transversal gates alone cannot be computationally universal. I will show that this limitation on universality can be circumvented using only fault-tolerant error correction, which is already required anyway. The result applies to ``triorthogonal'' stabilizer codes, which were recently introduced by Bravyi and Haah for state distillation. I will show that triorthogonal codes admit a transversal implementation of the controlled-controlled-Z gate, and then demonstrate a transversal Hadamard construction which uses error correction to preserve the codespace. I will also discuss how to adapt the distillation procedure of Bravyi and Haah to Toffoli gates, improving on existing Toffoli distillation schemes.
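The overlap conditions behind triorthogonality can be checked mechanically. A minimal sketch follows; note that the full Bravyi-Haah definition additionally constrains the row weights, which this simplified checker omits, and the two example matrices are our own toy inputs.

```python
import numpy as np
from itertools import combinations

def is_triorthogonal(G):
    """Check the overlap conditions for a binary matrix G: every pair and
    every triple of distinct rows must have even common support.
    (The full Bravyi-Haah definition also constrains row weights.)"""
    rows = [np.asarray(r) % 2 for r in G]
    pairs_ok = all((r1 & r2).sum() % 2 == 0
                   for r1, r2 in combinations(rows, 2))
    triples_ok = all((r1 & r2 & r3).sum() % 2 == 0
                     for r1, r2, r3 in combinations(rows, 3))
    return pairs_ok and triples_ok

# rows with disjoint supports trivially satisfy both conditions
G_good = np.array([[1, 1, 1, 0, 0, 0],
                   [0, 0, 0, 1, 1, 1]])
# rows overlapping in a single position violate the pairwise condition
G_bad = np.array([[1, 1, 0],
                  [0, 1, 1]])
print(is_triorthogonal(G_good), is_triorthogonal(G_bad))   # True False
```

These even-overlap conditions are what make the transversal application of diagonal third-level gates (such as controlled-controlled-Z) preserve the codespace.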