Statistical Prediction of the Outcome of a Noncooperative Game
David Wolpert National Aeronautics and Space Administration
Many statistics problems involve predicting the joint strategy that will be chosen by the players in a noncooperative game. Conventional game theory predicts that the joint strategy will satisfy an "equilibrium concept". The relative probabilities of the joint strategies satisfying the equilibrium concept are not given, and all joint strategies that do not satisfy it are assigned probability zero. As an alternative, I view the prediction problem as one of statistical inference, where the "data" includes the details of the noncooperative game. This replaces conventional game theory's focus on how to specify a set of equilibrium joint strategies with a focus on how to specify a density function over joint strategies. I explore a Bayesian version of such a Predictive Game Theory (PGT) that provides a posterior density over joint strategies. It is based on the entropic prior and on a likelihood that quantifies the rationalities of the players. The Quantal Response Equilibrium (QRE) is a popular game-theoretic equilibrium concept parameterized by player rationalities. I show that for some games the local peaks of the posterior density over joint strategies approximate the associated QREs, and derive the associated correction terms. I also discuss how to estimate parameters of the likelihood from observational data, and how to sample from the posterior. I end by showing how PGT can be used to specify a unique equilibrium for any noncooperative game, thereby providing a solution to a long-standing problem of conventional game theory. -
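As a concrete companion to the QRE concept invoked in the abstract (a sketch of the standard logit QRE fixed point, not of Wolpert's PGT posterior; the game and the rationality value are my own choices):

```python
import numpy as np

def logit_qre(A, B, lam, iters=2000, tol=1e-10):
    """Fixed-point iteration for the logit Quantal Response Equilibrium
    of a two-player game.  A[i, j] is the row player's payoff and
    B[i, j] the column player's payoff when row plays i and column
    plays j; lam is the rationality parameter (lam = 0 gives uniform
    play, large lam approaches best response)."""
    p = np.full(A.shape[0], 1.0 / A.shape[0])  # row player's mixed strategy
    q = np.full(A.shape[1], 1.0 / A.shape[1])  # column player's mixed strategy
    for _ in range(iters):
        u_row = A @ q      # expected payoff of each row pure strategy
        u_col = B.T @ p    # expected payoff of each column pure strategy
        p_new = np.exp(lam * u_row); p_new /= p_new.sum()
        q_new = np.exp(lam * u_col); q_new /= q_new.sum()
        done = max(np.abs(p_new - p).max(), np.abs(q_new - q).max()) < tol
        p, q = p_new, q_new
        if done:
            break
    return p, q

# Prisoner's dilemma payoffs (strategies: cooperate, defect)
A = np.array([[3.0, 0.0], [5.0, 1.0]])
B = A.T
p, q = logit_qre(A, B, lam=2.0)
```

For this game at lam = 2, both players defect with probability well above 1/2 but strictly below 1, illustrating how a QRE smooths the all-or-nothing prediction of a Nash equilibrium into a full distribution over strategies.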
Physical Limits of Inference
David Wolpert National Aeronautics and Space Administration
I show that physical devices that perform observation, prediction, or recollection share an underlying mathematical structure. I call devices with that structure "inference devices". I present a set of existence and impossibility results concerning inference devices. These results hold independent of the precise physical laws governing our universe. In a limited sense, the impossibility results establish that Laplace was wrong to claim that, even in a classical, non-chaotic universe, the future can be unerringly predicted given sufficient knowledge of the present. Alternatively, these impossibility results can be viewed as a non-quantum-mechanical "uncertainty principle". Next I explore the close connections between the mathematics of inference devices and of Turing Machines. I end by informally discussing the philosophical implications of these results, e.g., for whether the universe "is" a computer. -
From Information Geometry to Quantum Theory
Philip Goyal State University of New York (SUNY)
The unparalleled empirical success of quantum theory strongly suggests that it accurately captures fundamental aspects of the workings of the physical world. The clear articulation of these aspects is of inestimable value --- not only for the deeper understanding of quantum theory in itself, but for its further development, particularly for the development of a theory of quantum gravity. Recently, there has been growing interest in elucidating these aspects by expressing, in a less abstract mathematical language, what we think quantum theory might be telling us about how nature works, and trying to derive, or reconstruct, quantum theory from such postulates. In this talk, I describe a simple reconstruction of the finite-dimensional quantum formalism. The derivation takes place within a classical probabilistic framework equipped with the information (or Fisher-Rao) metric, and rests upon a small number of elementary ideas (such as complementarity and global gauge invariance). The complex structure of the quantum formalism arises very naturally. The derivation provides a number of non-trivial insights into the quantum formalism, such as the extensive role of information geometry in determining the quantum formalism, the importance of global gauge invariance, and the importance (or lack thereof) of assumptions concerning separated systems. -
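The information (Fisher-Rao) metric on which the derivation rests has a closed-form geodesic distance on the finite probability simplex, obtained via the square-root embedding into a sphere (a standard fact; the sketch itself is mine, not from the talk):

```python
import numpy as np

def fisher_rao_distance(p, q):
    """Geodesic distance between two finite probability distributions
    under the Fisher-Rao (information) metric.  The map p -> sqrt(p)
    embeds the simplex isometrically (up to a factor of 2) in the unit
    sphere, so the distance is twice the great-circle angle between the
    square-root vectors."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    bc = np.sum(np.sqrt(p * q))  # Bhattacharyya coefficient, the cosine of the angle
    return 2.0 * np.arccos(np.clip(bc, -1.0, 1.0))

d_same = fisher_rao_distance([0.5, 0.5], [0.5, 0.5])  # identical distributions
d_far  = fisher_rao_distance([1.0, 0.0], [0.0, 1.0])  # distributions with disjoint support
```

Distributions with disjoint support sit at the maximal distance pi, which is one way the metric already encodes a notion of perfect distinguishability before any quantum structure is added.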
-
Hawking Boxes and Invariant Sets - A New Look at the Foundations of Quantum Theory and the Associated Role of Gravity
Tim Palmer University of Oxford
We start by studying the non-computational geometry of fractionally-dimensioned, measure-zero, dynamically-invariant subsets of phase space associated with certain deterministic nonlinear dissipative dynamical systems. Then, by studying the asymptotic states of the Hawking Box, the existence of such invariant subsets is conjectured for gravitationally-bound systems. The argument hinges on the phase-space properties of black holes. Following Penrose, it is assumed that phase-space volumes shrink when the contents of the Hawking Box contain black holes. However, unlike Penrose, we do not argue for any corresponding phase-space divergence when the Box does not contain black holes. We then make the hypothesis that these invariant phase-space subsets play a primitive role in fundamental physics; specifically, that the state of the universe (“reality”) lies on such an invariant subset, now and hence forever. Attention is focussed on the implications of this hypothesis for the foundations of quantum theory. For example, what are referred to as “measurements” of the quantum state are defined in terms of symbolic dynamics on the invariant set, relative to some partition of the invariant set. This immediately leads to the notion that any theory which treats these invariant sets as primitive must be contextual (since counterfactual perturbations almost certainly take states off the measure-zero invariant set and hence into “unreal” regions of phase space where the symbolic partition is undefined). This in turn leads to a new perspective, both on the foundations of quantum theory and on the role of gravity in formulating these foundations. In particular, a measurement-free Neo-Copenhagen Interpretation of quantum theory, based on the Invariant Set Hypothesis, will be presented. -
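The abstract's starting objects, measure-zero fractal invariant sets of dissipative dynamical systems, can be illustrated with the Henon map (my example, at the classic parameter values; nothing here is specific to the talk):

```python
def henon(x, y, a=1.4, b=0.3):
    """One step of the Henon map, a standard dissipative system whose
    attractor is a fractal invariant subset of the plane with zero
    Lebesgue measure.  The Jacobian determinant is -b everywhere, so
    each step contracts phase-space area by the factor |b| = 0.3."""
    return 1.0 - a * x * x + y, b * x

# From a generic initial point in the basin of attraction, the orbit
# falls onto the attractor after a transient and remains in a bounded
# region of the plane forever after.
x, y = 0.1, 0.1
for _ in range(10_000):
    x, y = henon(x, y)
```

A counterfactual perturbation of (x, y) generically lands off this measure-zero set, which is the dynamical fact behind the contextuality argument in the abstract.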
Dirac's penumbra: constraints and gauge transformations in reparametrization invariant theories
Brendan Foster Utrecht University
A simple theorem of Dirac identifies primary first-class constraints as generators of transformations 'that do not affect the physical state'. This result has profound implications for the definition of physical states and observables in the quantization of constrained systems, and leads to one aspect of the infamous 'problem of time' in quantum gravity. As I will discuss, a close look at the theorem reveals that it depends crucially on the assumption of an absolute time. This assumption does not hold for reparametrization invariant theories, such as parametrized particle mechanics, and in these theories the primary Hamiltonian constraint does generate physical change. I will also look at just what Dirac did and did not say about this case, and what has been said by reviewers since.
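A standard worked example of the reparametrization-invariant case at issue (my illustration, not taken from the talk) is the parametrized nonrelativistic particle, in which Newtonian time t is promoted to a configuration variable t(tau) with conjugate momentum p_t:

```latex
S = \int d\tau \left[\, p\,\dot q + p_t\,\dot t - N\,\mathcal{C} \,\right],
\qquad
\mathcal{C} = p_t + H(q,p) \approx 0,
\qquad
\delta q = \varepsilon \,\{ q, \mathcal{C} \} = \varepsilon\,\frac{\partial H}{\partial p}.
```

Since delta q is the physical velocity, the primary Hamiltonian constraint generates genuine evolution here rather than a mere gauge redundancy, which is exactly the behaviour the abstract argues Dirac's theorem fails to cover.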
-
Wiggling Hilbert Space
Cozmin Ududec Government of the United Kingdom
After using the complex Hilbert space formalism for quantum theory for so long, it is very easy to take for granted features like projection operators and the projection postulate, the algebra of observables, symmetric transition probabilities, and linear evolution. Over the past 50 years there have been many attempts to gain a better understanding of this formalism by reconstructing it from different kinds of (sometimes) physically motivated assumptions. By looking at how the above features are motivated and used in different reconstructions, it becomes clear just how special and restrictive many of them are. The question is then what a theory which lacks some of these features looks like. Another interesting question is whether there are any reasons to be suspicious of postulating them in reconstructions, or when trying to generalize or apply the quantum formalism to untested situations. -
Local quantum physics versus (relativistic) quantum mechanics: thermal versus information-theoretic entanglement and the origin of the area law for "localization entropy".
Bert Schroer Freie Universität Berlin
The fundamentally different localization concepts of QT, i.e. the Born (Newton-Wigner) localization of (relativistic) QM as compared with the causal localization (modular localization) of QFT, lead to significant differences in the nature of local observables and affiliated states. This in turn results in a rather sharp distinction between tensor-factorization, information-theoretic entanglement in QM on the one hand, and a more radical "thermal entanglement" responsible for an area law for localization entropy on the other. These surprising differences can be traced back to the very different nature of the localized operator algebras in QFT: they are all isomorphic (independent of the localization region) to one abstract "monad" (borrowing terminology from Leibniz), and the full reality of QFT (including its symmetries) is contained in the relative positioning of a rather small finite number of such monads (2 for chiral theories, 6 for d=1+3, ...) within a joint Hilbert space. It is an important open question to what extent such positional characterizations (where the individual monads are void of any physical properties, which reside fully in their relative placements) can be generalized to CST or QG. -
The intersection of general relativity and quantum mechanics
Keye Martin United States Naval Research Laboratory
Domains were introduced in computer science in the late 1960s by Dana Scott to provide a semantics for the lambda calculus (the lambda calculus is the basic prototype for a functional programming language, e.g. ML). The study of domains with measurements was initiated in the speaker's thesis: a domain provides a qualitative view of information expressed in part by an 'information order', and a measurement on a domain expresses a quantitative view of information with respect to the underlying qualitative aspect. The theory of domains and measurements was initially introduced to provide a first-order model of computation, one in which a computation is viewed as a process that evolves in a space of informatic objects, where processes have informatic rates of change determined by the manner in which they manipulate information. There is a domain of binary channels with capacity as a measurement. There is a domain of finite probability distributions with entropy as a measurement. There is a domain of quantum mixed states with entropy as a measurement. There is a domain of spacetime intervals with global time as a measurement. In this setting, similarities between QM and GR emerge, but also some important differences. In a domain, if we write x <= y, then it means that x carries information about y, while x << y is a stronger relation that means x carries *essential* information about y. In GR, the domain-theoretic relation << can be proven to be timelike causality. It possesses stronger mathematical properties than << does in QM. However, by an application of the maximum entropy principle, we can restrict the mixed states in consideration and this difference is removed: the domains of events and mixed states are both globally hyperbolic -- where globally hyperbolic is a purely order-theoretic idea that just happens to coincide with the usual notion in the case of GR. 
Along the way, we will see domain-theoretic ways of distinguishing between the Newtonian and relativistic notions of time, see how to reconstruct the topology and geometry of spacetime in a purely order-theoretic manner beginning from only a countable set, see that the Holevo capacity of a unital qubit channel is determined by the largest value of its informatic derivative, and have reason to wonder whether distance can be defined as the amount of information (capacity) that can be transmitted between two points. -
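The first of the listed examples, the domain of binary channels with capacity as a measurement, can be made concrete with the binary symmetric channel (a standard Shannon formula; the domain order itself is not implemented in this sketch):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Shannon capacity of the binary symmetric channel with crossover
    probability p: C = 1 - H(p).  Read as a measurement in Martin's
    sense, capacity grades channels quantitatively: the noiseless
    channel (p = 0) measures 1 and the useless channel (p = 1/2)
    measures 0."""
    return 1.0 - h2(p)

c_noiseless = bsc_capacity(0.0)
c_useless   = bsc_capacity(0.5)
```

Noisier channels carry less information than cleaner ones, and capacity quantifies that qualitative ordering, which is the qualitative/quantitative pairing the abstract describes.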
A candidate of a psi-epistemic theory
In deBroglie-Bohm theory the quantum state plays the role of a guiding agent. In this seminar we will explore whether this is a universal feature shared by all hidden variable theories, or merely a peculiar feature of deBroglie-Bohm theory. We present the bare bones of a model in which the quantum state represents a probability distribution and does not act as a guiding agent. The theory is also psi-epistemic according to Spekkens and Harrigan's definition. For simplicity we develop the model for a 1D discrete lattice, but the generalization to higher dimensions is straightforward. The ontic state consists of a definite particle position and, in addition, possible non-local links between spatially separated lattice points. These non-local links come in two types: directed links and non-directed links. Entanglement manifests itself through these links. Interestingly, this ontology seems to be the simplest possible and immediately suggested by the structure of quantum theory itself. For N lattice points there are N*3^(N(N-1)) ontic states, growing exponentially with the Hilbert space dimension N, as expected. We further require that the evolution of the probability distribution on the ontic state space is dictated by a master equation with non-negative transition rates. It is then easy to show that one can reproduce the Schroedinger equation if and only if there are positive solutions to a gigantic system of linear equations. This is a highly non-trivial problem, and whether such positive solutions exist is still not clear to me. Alternatively, one can view this set of linear equations as constraints on the possible types of Hamiltonians. We end by speculating how one might incorporate gravity into this theory by requiring permutation invariance of the dynamical evolution law. -
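The stated count of ontic states can be checked by brute force for small N. Reading the abstract as assigning 3 link states to each ordered pair of distinct lattice points (absent, directed, or non-directed) is my interpretation, chosen because it reproduces the formula N*3^(N(N-1)):

```python
from itertools import product

def count_ontic_states(n):
    """Brute-force count of the model's ontic states on n lattice
    points: a definite particle position together with one of 3 link
    states for every ordered pair of distinct points (assumption:
    absent / directed / non-directed), which should total
    n * 3**(n*(n-1))."""
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    configs = product(range(n), product(range(3), repeat=len(pairs)))
    return sum(1 for _ in configs)

c2 = count_ontic_states(2)   # expect 2 * 3**2
c3 = count_ontic_states(3)   # expect 3 * 3**6
```

The exponent N(N-1) grows quadratically, so the ontic state space explodes far faster than the N-dimensional Hilbert space it is meant to underpin, exactly the exponential growth the abstract flags.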
On the epistemic view: Strengths and weaknesses of Spekkens’ toy theory
Michael Skotiniotis University of Calgary
We investigate the strengths and weaknesses of the Spekkens toy model for quantum states. We axiomatize the Spekkens toy model into a set of five axioms, regarding valid states, transformations, measurements and composition of systems. We present two relaxations of the Spekkens toy model, giving rise to two variant toy theories. By relaxing the axiom regarding valid transformations a group of toy operations is obtained that is equivalent to the projective extended Clifford Group for one and two qubits. However, the physical state of affairs resulting from this relaxation is undesirable, violating the desideratum that single toy bit operations must compose under the tensor product. The second variant toy theory is obtained by relaxing the axioms regarding valid states and measurements, resulting in a toy model that exhibits the Kochen-Specker property. Like the previous toy model, the relaxation renders the toy model physically undesirable. Therefore, we claim that the Spekkens toy model is optimal; altering its axioms does not yield a better epistemic description of quantum theory. This work is a collaboration with Gilad Gour, Aidan Roy and Barry C. Sanders. -
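For readers unfamiliar with the model being axiomatized, the elementary system of Spekkens' theory (the "toy bit") is small enough to enumerate directly. This sketch shows the original construction, not the authors' five-axiom reformulation:

```python
from itertools import combinations

# One toy bit has 4 ontic states.  Spekkens' knowledge-balance
# principle says an observer may know at most half the answer, so a
# valid epistemic state of maximal knowledge identifies the ontic
# state only up to a 2-element subset.
ontic = range(1, 5)
epistemic = list(combinations(ontic, 2))
n_epistemic = len(epistemic)
```

The 6 resulting epistemic states are the toy analogues of the 6 qubit states |0>, |1>, |+>, |->, |+i>, |-i>, which is the correspondence the toy theory's axioms on states and measurements are built around.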
Extending Standard Quantum Interpretation by Quantum Set Theory
Masanao Ozawa Nagoya University
Set theory provides foundations of mathematics in the sense that all mathematical notions, like numbers, functions, relations, and structures, are defined in the axiomatic set theory called ZFC. Quantum set theory naturally extends ZFC to quantum logic. Hence, we can expect that quantum set theory provides mathematics based on quantum logic. In this talk, I will show a useful application of quantum set theory to quantum mechanics, based on the fact that the real numbers constructed in quantum set theory correspond exactly to the quantum observables. The standard formulation of quantum mechanics answers the question as to in what state an observable A has its value in an interval I. However, it does not answer the question as to in what state two observables A and B have the same value. The notion of equality between the values of observables plays many important roles in the foundations of quantum mechanics. The notion of measurement of an observable relies on the condition that the observable to be measured and the meter after the measurement should have the same value. We can define the notion of quantum disturbance through the condition of whether the values of the given observable before and after the process are the same. It is shown that all observational propositions on a quantum system correspond to propositions in quantum set theory, and the equality relation naturally provides the proposition that two observables have the same value. It has been broadly accepted that we cannot speak of the values of quantum observables without assuming a hidden variable theory. However, quantum set theory enables us to do so without assuming hidden variables, but instead under the consistent use of quantum logic, which is more or less considered to be the logic of the superposition principle. [1] M. Ozawa, Transfer principle in quantum set theory, J. Symbolic Logic 72, 625-648 (2007), online preprint: http://arxiv.org/abs/math.LO/0604349. [2] M. Ozawa, Quantum perfect correlations, Ann. Phys. (N.Y.) 321, 744-769 (2006), online preprint: LANL quant-ph/0501081.
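The central question of the abstract, in what state two observables have the same value, has a concrete spectral answer for commuting observables, in the spirit of Ozawa's quantum perfect correlations [2]. A minimal numerical sketch (my illustration, restricted for simplicity to diagonal matrices):

```python
import numpy as np

def same_value_prob(A, B, psi):
    """For commuting *diagonal* observables A and B (a simplifying
    assumption of this sketch), return sum over x of
    <psi| P_A(x) P_B(x) |psi>, where P_A(x) and P_B(x) are spectral
    projections.  The equality 'A = B in state psi' holds when this
    probability equals 1."""
    a, b = np.diag(A), np.diag(B)
    total = 0.0
    for x in np.unique(np.concatenate([a, b])):
        Pa = np.diag(np.isclose(a, x).astype(float))
        Pb = np.diag(np.isclose(b, x).astype(float))
        total += float(np.real(psi.conj() @ (Pa @ Pb @ psi)))
    return total

I2, Z = np.eye(2), np.diag([1.0, -1.0])
A = np.kron(Z, I2)   # Z measured on the first qubit
B = np.kron(I2, Z)   # Z measured on the second qubit
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
p_equal = same_value_prob(A, B, bell)
p_mixed = same_value_prob(A, B, np.array([0.0, 1.0, 0.0, 0.0]))  # |01>
```

The Bell state yields probability 1 (A and B certainly agree) while the product state |01> yields 0; extending a meaningful equality proposition to non-commuting observables is what the quantum-set-theoretic treatment described in the talk is designed to address.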