Universal topological quantum computation from a superconductor/Abelian quantum Hall heterostructure
Roger Mong University of Pittsburgh
Non-Abelian anyons promise to reveal spectacular features of quantum mechanics that could ultimately provide the foundation for a decoherence-free quantum computer. The Moore-Read quantum Hall state and a (relatively simple) two-dimensional p+ip superconductor both support Ising non-Abelian anyons, also referred to as Majorana zero modes. Here we construct a novel two-dimensional superconductor in which charge-2e Cooper pairs are built from fractionalized quasiparticles and which, like the Z3 Read-Rezayi state, harbors Fibonacci anyons that, unlike Ising anyons, allow for universal topological quantum computation solely through braiding.
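For context, a standard fact not specific to this work: the universality claim traces back to the Fibonacci fusion rule. A Fibonacci anyon \tau satisfies

    \tau \times \tau = 1 + \tau,   d_\tau = (1 + \sqrt{5})/2,

so the fusion space of n anyons grows like the Fibonacci numbers, and the resulting braid-group representations are dense in the unitary group, which is what makes braiding alone computationally universal.
-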
A histories perspective on bounding quantum correlations
Joe Henson BNP Paribas Asset Management London
There has recently been much interest in finding simple principles that explain the particular sets of experimental probabilities that are possible with quantum mechanics in Bell-type experiments. In the quantum gravity community, similar questions had been raised about whether a certain generalisation of quantum mechanics allows correlations beyond those of standard quantum mechanics. We now bring these two strands of work together to see what can be learned on both sides.
-
Seeing is Believing: Direct Observation of a General Quantum State
Jeff Lundeen University of Ottawa
Central to quantum theory, the wavefunction is a complex distribution associated with a quantum system. Despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of it through its use to calculate measurement outcome probabilities via the Born rule. Tomographic methods can reconstruct the wavefunction from measured probabilities. In contrast, I present a method to directly measure the wavefunction, so that its real and imaginary components appear right on our measurement apparatus. I will also present new work extending this concept to mixed quantum states. This extension directly measures a little-known distribution proposed by Dirac as a classical analog to a quantum operator. Furthermore, it reveals that our direct measurement is a rigorous example of a quasi-probability phase-space (i.e., x, p) distribution that is closely related to the Q, P, and Wigner functions. Our direct measurement method gives the quantum state a plain and general meaning in terms of a specific set of simple operations in the lab.
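A minimal sketch of the pure-state version of the scheme (notation mine; k is a \psi-independent constant set by the pointer coupling): weakly measure the projector \pi_x = |x\rangle\langle x| and postselect on momentum p = 0, so that the weak value read off the pointer is

    \langle \pi_x \rangle_w = \langle p{=}0 | x \rangle \langle x | \psi \rangle / \langle p{=}0 | \psi \rangle = k \, \psi(x),

with Re \psi(x) appearing as the pointer's position shift and Im \psi(x) as its momentum shift.
-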
Maximal Privacy Without Coherence
Debbie Leung Institute for Quantum Computing (IQC)
Privacy and coherence have long been considered closely related properties of a quantum state. Indeed, a coherently transmitted quantum state is inherently private. Surprisingly, coherent quantum communication is not always required for privacy: there are quantum channels that are too noisy to transmit quantum information yet can still send private classical information. Here, we ask how different the private classical and quantum capacities can be. We present a class of channels N_d with input dimension d^2, quantum capacity Q(N_d) <= 1, and private classical capacity P(N_d) = log d. These channels asymptotically saturate an interesting inequality P(N) <= (log d_A + Q(N))/2 for any channel N with input dimension d_A, and capture the essence of privacy stripped of the confounding influence of coherence.
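The asymptotic saturation follows from simple arithmetic with the quantities quoted above: for N_d, the input dimension is d_A = d^2, so

    (log d_A + Q(N_d))/2 <= (2 log d + 1)/2 = log d + 1/2,

and hence P(N_d) divided by this bound is at least (log d)/(log d + 1/2), which tends to 1 as d \to \infty.
-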
Psi-epistemic models are exponentially bad at explaining the distinguishability of quantum states
Matthew Leifer Chapman University
The status of the quantum state is perhaps the most controversial issue in the foundations of quantum theory. Is it an epistemic state (representing knowledge, information, or belief) or an ontic state (a direct reflection of reality)? In the ontological models framework, quantum states correspond to probability measures over more fundamental states of reality. The quantum state is then ontic if every pair of pure states corresponds to a pair of measures that do not overlap, and is otherwise epistemic. Recently, several authors have derived theorems that aim to show that the quantum state must be ontic in this framework. Each of these theorems involves auxiliary assumptions of varying degrees of plausibility. Without such assumptions, it has been shown that models exist in which the quantum state is epistemic. However, the definition of an epistemic quantum state used in these works is extremely permissive. Only two quantum states need correspond to overlapping measures, and furthermore the amount of overlap may be arbitrarily small. In order to provide an explanation of quantum phenomena such as no-cloning and the indistinguishability of pure states, the amount of overlap should be comparable to the inner product of the quantum states. In this talk, I show, without making auxiliary assumptions, that the ratio of overlap to inner product must go to zero exponentially in Hilbert space dimension for some families of states. This is done by connecting the overlap to Kochen-Specker noncontextuality, from which we infer that any contextuality inequality gives a bound on the ratio of overlap to inner product.
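Schematically (notation mine; the precise definitions are given in the talk): if \mu_\psi and \mu_\phi are the measures representing |\psi\rangle and |\phi\rangle, the classical overlap is

    \omega(\psi, \phi) = \int \min(d\mu_\psi, d\mu_\phi),

and the result is that, for some families of states in Hilbert space dimension d, the ratio \omega(\psi, \phi) / |\langle \psi | \phi \rangle|^2 is bounded above by a function decaying exponentially in d, in any ontological model reproducing the quantum predictions.
-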
Probabilistic protocols in quantum information
Joshua Combes University of Colorado Boulder
Probabilistic protocols in quantum information attempt to improve performance by occasionally reporting a better result than could be expected from a deterministic protocol. Here we show that probabilistic protocols can never improve performance beyond the quantum limits on the corresponding deterministic protocol. To illustrate this result we examine three common probabilistic protocols: probabilistic amplification, weak value amplification, and probabilistic metrology. In each of these protocols we show explicitly that the optimal deterministic protocol is better than the corresponding probabilistic protocol once the probabilistic nature of the latter is correctly accounted for.
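The accounting at issue can be phrased schematically (my paraphrase of the abstract's argument, not a formula from the talk): if a heralded run succeeds with probability p and delivers figure of merit F_s, while failed runs deliver F_f, the fair comparison with a deterministic protocol is the unconditioned average

    F_avg = p F_s + (1 - p) F_f,

and the claim is that F_avg never exceeds the quantum limit achievable deterministically.
-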
Applications of Information Theory in Direct Sum and Direct Product Problems
A fundamental question in complexity theory is how much resource is needed to solve k independent instances of a problem compared to the resource required to solve one instance. Suppose that solving one instance of a problem with correctness probability p requires c units of some resource in a given model of computation. A direct sum theorem states that computing k independent instances requires k times the resource needed to compute one instance. A strong direct product theorem states that, with o(k · c) units of the resource, one can only compute all k instances correctly with probability exponentially small in k. In this talk, I will present some recent progress on direct sum and direct product theorems in the models of communication complexity and two-prover one-round games, obtained via an information-theoretic approach. The talk is based on parts of my doctoral work.
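In symbols (notation mine), writing R_p(f) = c for the cost of one instance at success probability p and f^k for the k-fold problem:

    direct sum:             R_p(f^k) = \Omega(k \cdot c);
    strong direct product:  any protocol using o(k \cdot c) resources satisfies
                            Pr[all k instances correct] <= 2^{-\Omega(k)}.
-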
Bosonic particle-correlated states
Zhang Jiang University of New Mexico
Quantum many-body problems are notoriously hard. This is partly because the Hilbert space dimension grows exponentially with the particle number N. Since exact solutions are often intractable, numerous approaches using approximations have been proposed. A common trait of these approaches is to use an ansatz whose number of parameters either does not depend on N or is proportional to N, e.g., the matrix-product state for spin lattices, the BCS wave function for superconductivity, the Laughlin wave function for fractional quantum Hall effects, and the Gross-Pitaevskii theory for BECs. Among them, the product ansatz for BECs has accurately predicted many useful properties of Bose gases at ultra-low temperature. As particle-particle correlations become important, however, it begins to fail. To capture the quantum correlations, we propose a new set of states, which constitute a natural generalization of the product-state ansatz. Our state of N = d×n identical particles is derived by symmetrizing the n-fold product of a d-particle quantum state. For fixed d, the parameter space of our state does not grow with N. Numerically, we show that our ansatz gives the right description for the ground state and time evolution of the two-site Bose-Hubbard model.
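In symbols (a minimal sketch of the construction described above), with S the symmetrizer over all N particles:

    |\Psi_N\rangle \propto S( |\psi^{(d)}\rangle^{\otimes n} ),   N = d \times n,

so the parameters are just those of the single d-particle state |\psi^{(d)}\rangle, independent of N for fixed d; the case d = 1 recovers the usual product (Gross-Pitaevskii) ansatz.
-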
Hardness of correcting errors on a stabilizer code
Problems in computer science are often classified based on how the runtimes of the algorithms that solve them scale. Easy problems are efficiently solvable, but in physics we often encounter problems that take too long to solve on a classical computer. Here we look at one such problem in the context of quantum error correction and show that no efficient algorithm for it is likely to exist. We address the computational hardness of a decoding problem for quantum stabilizer codes with independent X and Z errors on each qubit. Much like for classical linear codes, errors are detected by measuring certain check operators, which yield an error syndrome, and the decoding problem consists of determining the most likely recovery given the syndrome. The corresponding classical problem is known to be NP-complete, and a similar decoding problem for quantum codes is known to be NP-complete as well. However, this decoding strategy is not optimal in the quantum setting, as it does not take into account error degeneracy: distinct errors can have the same effect on the code. Here, we show that optimal decoding of stabilizer codes is computationally much harder than optimal decoding of classical linear codes: it is #P-complete.
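The gap between the two decoding problems can be written compactly (schematic notation mine; syn(E) denotes the syndrome of a Pauli error E and s the observed syndrome):

    most likely error:      argmax_{E : syn(E) = s} P(E)                        (NP-complete)
    optimal (degenerate):   argmax_{L} \sum_{E \in L : syn(E) = s} P(E)         (#P-complete)

where L ranges over logical equivalence classes of errors; summing over the exponentially many degenerate errors within each class is the source of the extra hardness.
-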
Does the Quantum Particle know its own Energy?
Rafael Sorkin Perimeter Institute for Theoretical Physics
If a wave function does not describe microscopic reality, then what does? Reformulating quantum mechanics in path-integral terms leads to a notion of "precluded event" and thence to the proposal that quantal reality differs from classical reality in the same way as a set of worldlines differs from a single worldline. One can then ask, for example, which sets of electron trajectories correspond to a Hydrogen atom in its ground state and how they differ from those of an excited state. We address the analogous questions for a simple model that replaces the electron by a particle hopping (in discrete time) on a circular lattice.
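A generic discrete-time sketch of the path-sum in such a model (my notation; the specific dynamics in the talk may differ): a history is a trajectory \gamma = (j_0, ..., j_T) with amplitude

    A(\gamma) = \psi_0(j_0) \prod_{t=0}^{T-1} U(j_{t+1}, j_t),

where U is the one-step hopping amplitude, and the quantal measure of an event E (a set of histories) is

    \mu(E) = \sum_{\gamma, \gamma' \in E : j_T = j'_T} A(\gamma) A^*(\gamma'),

with E precluded when \mu(E) = 0.
-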
Entanglement farming: Harnessing the properties of fixed points in quantum evolution
Eduardo Martin-Martinez University of Waterloo
We show that in certain generic circumstances the state of light of an optical cavity traversed by beams of atoms is naturally driven towards a non-thermal metastable state. This state can be such that successive pairs of unentangled particles sent through the cavity will reliably emerge significantly entangled, thus providing a renewable source of quantum entanglement. Significant for possible experimental realizations is the fact that this entangling fixed-point state of the cavity can be reached largely independently of the initial state in which the cavity was prepared. Our results suggest that reliable entanglement farming on the basis of such a fixed-point state should also be possible in various other experimental settings, namely with the to-be-entangled particles replaced by arbitrary qudits and with the cavity replaced by a suitable reservoir system.
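In the language of fixed points (a schematic rendering of the mechanism described above): each atom-pair traversal acts on the cavity field as a fixed quantum channel \Phi, so over many traversals

    \rho_{n+1} = \Phi(\rho_n) \to \rho_* ,   \Phi(\rho_*) = \rho_* ,

and it is the entangling power of the fixed-point state \rho_* that makes the scheme renewable and largely independent of the initial cavity state.
-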
Noncontextuality without determinism and admissible (in)compatibility relations: revisiting Specker's parable.
Ravi Kunjwal Funds for Scientific Research - FNRS
The purpose of this talk is twofold: First, following Spekkens, to motivate noncontextuality as a natural principle one might expect to hold in nature, and to introduce operational noncontextuality inequalities motivated by a contextuality scenario first considered by Ernst Specker. These inequalities do not rely on the assumption of outcome determinism, which is implicit in the usual Kochen-Specker (KS) inequalities. We argue that they are the appropriate generalization of KS inequalities, serving as a test for the possibility of noncontextual explanations of experimental data. This is very much in the spirit of Bell inequalities, which provide theory-independent tests for local hidden variable explanations of experimental data without relying on the assumption of outcome determinism. The second purpose is to point out a curious feature of quantum theory, motivated by the connections between (in)compatibility and (non)contextuality: namely, that it admits all conceivable (in)compatibility relations between observables.
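The curious feature in the second part has a sharp simplest instance, Specker's triangle (standard formulation, stated here for orientation):

    {M_1, M_2, M_3}:  each pair {M_i, M_j} is jointly measurable,
                      yet no joint measurement of all three exists.

Projective measurements cannot realize this pattern, since pairwise commutation yields a global joint observable, so unsharp POVMs are essential; the general claim is that every conceivable set of (in)compatibility relations between observables is realizable in quantum theory.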