What would a consistent instrumentalism about quantum mechanics be? Or, why Wigner's friendly after all.
Christopher Timpson University of Oxford
Instrumentalism about the quantum state is the view that this mathematical object does not serve to represent a component of (non-directly-observable) reality, but is rather a device solely for making predictions about the results of experiments. One honest way to be such an instrumentalist is a) to take an ensemble view (= frequentism about quantum probabilities), whereby the state represents predictions for measurement results on ensembles of systems, but not individual systems, and b) to assign some specific level for the quantum/classical cut. But what happens if one drops (b), or (a), or both, as some have been inclined to? Can one achieve a consistent view then? A major worry is illustrated by the Wigner's friend scenario: it looks as if it should make a measurable difference where one puts the cut, so how can it be consistent to slide it around (as, e.g., Bohr was wont to)? I'll discuss two main cases: that of Asher Peres' book, which adopts (a) but drops (b); and that of the quantum Bayesians Caves, Fuchs and Schack, which drops both. A view of Peres' sort can, I think, be made consistent, though it may look a little strained; the quantum Bayesians' can too, though there are some subtleties (which I shall discuss) about how one should handle Wigner's friend.
Generating and detecting multi-qubit GHZ states in circuit QED
Lev Bishop University of Maryland, College Park
I will present recent work [1] on preparation by measurement of Greenberger–Horne–Zeilinger (GHZ) states in circuit quantum electrodynamics. In particular, for the 3-qubit case, when employing a nonlinear filter on the recorded homodyne signal, the selected states are found to exhibit values of the Bell–Mermin operator exceeding 2 under realistic conditions. I will discuss the potential of the dispersive readout to demonstrate a violation of the Mermin bound, and present a measurement scheme avoiding the necessity for full detector tomography. [1] L. S. Bishop et al., New J. Phys. 11, 073040 (2009).
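As a quick check of the ideal target of such a preparation scheme, the 3-qubit Bell–Mermin operator M = XXX − XYY − YXY − YYX takes the value 4 on a perfect GHZ state, while local hidden-variable models are bounded by 2. A minimal numpy sketch (my own illustration, not code from the talk):

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron(*ops):
    """Tensor product of a sequence of operators."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

# 3-qubit GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# Bell-Mermin operator M = XXX - XYY - YXY - YYX
M = kron(X, X, X) - kron(X, Y, Y) - kron(Y, X, Y) - kron(Y, Y, X)

expval = np.real(ghz.conj() @ M @ ghz)
print(expval)  # 4.0: well above the local-hidden-variable bound of 2
```

Any measured value above 2 therefore witnesses genuine multipartite correlations, which is why the abstract's threshold of 2 is the relevant figure of merit.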
Spacetime can be simultaneously discrete and continuous, in the same way that information can.
Achim Kempf University of Waterloo
TBA
A First-Principles Implementation of Scale Invariance Using Best Matching
We present a first-principles implementation of spatial scale invariance as a local gauge symmetry in geometrodynamics using the method of best matching. In addition to the 3-metric, the proposed scale-invariant theory also contains a 3-vector potential A_k as a dynamical variable. Although some of the mathematics is similar to Weyl's ingenious, but physically questionable, theory, the equations of motion of this new theory are second order in time derivatives. It is tempting to try to interpret the vector potential A_k as the electromagnetic field. We exhibit four independent reasons for not giving in to this temptation. A more likely possibility is that it can play the role of "dark matter". Indeed, as has been noted, scale invariance seems to play a role in the MOND phenomenology. Spatial boundary conditions are derived from the free-endpoint variation method, and a preliminary analysis of the constraints and their propagation in the Hamiltonian formulation is presented.
Adiabatic Gate Teleportation and Topological Quantum Computing
Dave Bacon University of Washington
TBA
Innovations in Maximum Likelihood Quantum State Tomography
Scott Glancy National Institute of Standards and Technology
At NIST we are engaged in an experiment whose goal is to create superpositions of optical coherent states (such superpositions are sometimes called "Schroedinger cat" states). We use homodyne detection to measure the light, and we apply maximum likelihood quantum state tomography to the homodyne data to estimate the state that we have created. To assist in this analysis we have made a few improvements to quantum state tomography: we have devised a new iterative method (with faster convergence than RρR iteration) to find the maximum likelihood state, we have formulated a stopping criterion that can upper-bound the actual maximum likelihood, and we have implemented a bias-corrected resampling strategy to estimate confidence intervals.
Betting on Quantum Theory
Grant Salton Amazon.com
Betting (or gambling) is a useful tool for studying decision-making in the face of [classical] uncertainty. We would like to understand how a quantum "agent" would act when faced with uncertainty about its [quantum] environment. I will present a preliminary construction of a theory of quantum gambling, motivated by roulette and quantum optics. I'll begin by reviewing classical gambling and the Kelly Criterion for optimal betting. Then I'll demonstrate a quantum optical version of roulette, and discuss some of the challenges and pitfalls in designing such analogues. Quantum agents have access to many more strategies than classical agents. Quantum strategies provide no advantage in classical roulette, but I'll show that a quantum agent can outperform a classical agent in quantum roulette.
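The classical Kelly Criterion mentioned above says: for a bet paying b-to-1 that wins with probability p, the expected log-wealth growth rate g(f) = p log(1 + b f) + (1 − p) log(1 − f) is maximized by staking the fraction f* = p − (1 − p)/b of the bankroll. A small numeric sketch checking the closed form against a grid search (the numbers are illustrative, not from the talk):

```python
import numpy as np

def kelly_fraction(p, b):
    """Optimal fraction of bankroll to stake on a b-to-1 bet won with prob p."""
    return p - (1 - p) / b

def growth_rate(f, p, b):
    """Expected log-wealth growth per bet when staking fraction f."""
    return p * np.log(1 + b * f) + (1 - p) * np.log(1 - f)

p, b = 0.6, 1.0                # 60% chance to win an even-money bet
f_star = kelly_fraction(p, b)  # 0.2: stake 20% of the bankroll
grid = np.linspace(0.0, 0.99, 10000)
f_grid = grid[np.argmax(growth_rate(grid, p, b))]
print(f_star, f_grid)          # the grid search agrees with the closed form
```

Betting more than f* lowers the long-run growth rate even though the bet is favorable, which is the sense in which Kelly betting is "optimal" under classical uncertainty.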
Quantum Cryptography
Daniel Gottesman University of Maryland, College Park
Information has always been valuable, never more so than in recent decades, and throughout history people have turned to cryptography in an attempt to keep important information secret. New technologies are now emerging based on the counterintuitive laws of quantum physics that govern the atomic scale. These technologies threaten the cryptographic methods in widespread use today, but also offer new quantum cryptographic protocols that could profoundly alter the world of cryptography.
An Introduction to Quantum Information
Sonia Markes QED Consulting
A game that illustrates that quantum theory requires non-locality; an overview of the concept and basic mathematics of entanglement; and the concept of spin, introduced via a Stern–Gerlach set-up.
Jordan algebras and spectrality as tools for axiomatic characterization
Howard Barnum University of New Mexico
The normalized-state spaces of finite-dimensional Jordan algebras constitute a relatively narrow class of convex sets that includes the finite-dimensional quantum mechanical and classical state spaces. Several beautiful mathematical characterizations of Jordan state spaces exist, notably Koecher's characterization as the bases of homogeneous self-dual cones, and Alfsen and Shultz's characterization based on the notion of spectral convex sets plus additional axioms. I will review the notion of spectral convex set and the Alfsen-Shultz characterization, and discuss how these mathematical characterizations of Jordan state spaces might be useful in developing accounts of quantum theory based on more operational principles, for example ones concerning information processing. If time permits, I will present joint work with Cozmin Ududec in which we define analogues of multiple-slit experiments in systems described by spectral convex state spaces, and obtain results on Sorkin's notion of higher-level interference in this setting. For example, we show that, like the finite-dimensional quantum systems which are a special case, Jordan state spaces exhibit only lowest-order (I_2 in Sorkin's hierarchy) interference.
Quantum Bayesianism: Pros and Cons
Christopher Timpson University of Oxford
The Quantum Bayesianism of Caves, Fuchs and Schack presents a distinctive starting point from which to attack the problem of axiomatising - or reconstructing - quantum theory. However, many have worried that this starting point is itself already too radical. In this talk I will briefly introduce the position (it will be familiar to most, no doubt) and describe what I take to be its philosophical standpoint. More importantly, I shall seek to defend it from some bad objections, before going on to level some more substantive challenges. The background paper is arXiv:0804.2047.
Candidates for Principles of Quantumness
Quantum Mechanics (QM) is a beautiful, simple mathematical structure---Hilbert spaces and operator algebras---with an unprecedented predictive power in the whole physical domain. However, more than a century after its birth, we still don't have a "principle" from which to derive the mathematical framework. The situation is similar to that of the Lorentz transformations before the advent of the relativity principle. The invariance of physical law under change of reference system and the existence of a limiting velocity are not just physical principles: they are mandatory operational principles without which one cannot do experimental Physics. And it is a very seductive idea to think that QM could be derived from some other principle of such an epistemological kind, one which is either indispensable or crucial in dramatically reducing experimental complexity. Indeed, a large part of the formal structure of QM is a set of formal tools for describing the process of gathering information in any experiment, independently of the particular physics involved. It is mainly a kind of "information theory", a theory about our knowledge of physical entities rather than about the entities themselves. If we strip this informational part off the theory, what is left should be a "principle of quantumness" from which QM can be derived. In my talk I will analyze the consequences of two possible candidates for the principle of quantumness: 1) PFAITH: the existence of a pure bipartite state by which we can calibrate all local tests and prepare all bipartite states by local tests; 2) PURIFY: the existence of a purification for all states. We will consider the two postulates within the general context of probabilistic theories---also called test-theories.
Within test-theories we will introduce the notion of a "time-cascade" of tests, which entails the identifications "events = transformations" and "evolution = conditioning", and derive the general matrix-algebra representation of such theories, with particular focus on theories that satisfy the "local discriminability principle". Some of the concepts will be illustrated in specific test-theories, including the usual cases of classical and quantum mechanics, the extended versions of the PR boxes, the so-called "spin-factors", and quantum mechanics on real (instead of complex) Hilbert spaces. After the brief tutorial on test-theories, I will analyze the consequences of the two candidate postulates. We will see how postulate PFAITH implies the "local observability principle" and the tensor-product structure for the linear spaces of states and effects, along with a remarkable list of additional features that are typically quantum, including purification for some states, the impossibility of bit commitment, and many others. We will see how the postulate is not satisfied by classical mechanics, and how a stronger version of the postulate also excludes theories in which we cannot have teleportation, e.g. PR boxes. Finally we will analyze the consequences of postulate PURIFY, and show how it is equivalent to the possibility of dilating any probabilistic transformation on a system to a deterministic invertible transformation on the system interacting with an ancilla. Therefore PURIFY is equivalent to the general principle that "every transformation can in principle be inverted, given sufficient control of the environment". Using a simple diagrammatic representation we will see how PURIFY implies general theorems such as: 1) deterministic full teleportation; 2) inverting a transformation upon an input state (i.e. error correction) is equivalent to the environment and reference remaining uncorrelated; 3) inverting some transformations by reading the environment; etc.
We will see that some non-quantum theories (e.g. QM on real Hilbert spaces) still satisfy PURIFY. Finally I will address the problem of how to prove that a test-theory is quantum. One would need to show that the "effects" of the theory---not just the transformations---also make up a matrix algebra. A way of deriving the "multiplication" of effects is to identify them with atomic events. This can be done by assuming the atomicity of evolution in conjunction with the Choi-Jamiolkowski isomorphism. Suggested readings: 1. arXiv:0807.4383, to appear in "Philosophy of Quantum Information and Entanglement", eds. A. Bokulich and G. Jaeger (Cambridge University Press, Cambridge, UK, in press); 2. G. Chiribella, G. M. D'Ariano, and P. Perinotti (in preparation); 3. G. M. D'Ariano and A. Tosini (in preparation).