Towards the identification of Quantum Theory: Operational Approach
Sutapa Saha Indian Statistical Institute
In spite of its immense importance in present-day information technology, the foundational aspects of quantum theory (QT) remain elusive. In particular, there is no set of physically motivated axioms that explains why the Hilbert space formalism is the only natural choice for describing the microscopic world. To shed light on the unique formalism of QT, two different operational frameworks will be described, taking various convex operational theories as the primitive setting. The first is a kinematical symmetry principle, proposed from the perspective of single-copy state discrimination, and it will be shown that this symmetry holds for both classical theory and QT, the two successful descriptions of the physical world. On the other hand, studying a wide range of convex operational theories, namely the General Probabilistic Theories (GPTs) with polygonal state spaces, we observe the absence of such symmetry. The principle therefore serves to mark a sharp distinction between physical and unphysical theories. Thereafter, a distributed computing scenario will be introduced for which all convex theories other than QT turn out to be equivalent to the classical one, even though these theories possess more exotic state and effect spaces. We have coined this particular operational framework 'Distributed computation with limited communication' (DCLC). Furthermore, the distributed computational strength of quantum communication will be justified in terms of a stronger version of this task, namely 'Delayed-choice distributed computation with limited communication' (DC2LC). The proposed task thus provides a new approach to operationally single out quantum theory in the theory space and hence promises a novel perspective towards an axiomatic derivation of Hilbert space quantum mechanics.
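For context, a standard parametrization of the polygonal state spaces mentioned above (the usual n-gon models; my notation, and the normalization convention may differ from the one used in the talk) takes the pure states to be

\omega_i = \big( r_n \cos(2\pi i/n),\; r_n \sin(2\pi i/n),\; 1 \big)^T, \qquad r_n = \sqrt{\sec(\pi/n)}, \qquad i = 1, \dots, n,

with the unit effect u = (0, 0, 1)^T; n = 3 reproduces the classical three-outcome simplex, while n → ∞ gives a disc, an equatorial slice of the qubit Bloch ball.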
References:
Phys. Rev. A 100, 060101(R) (2019)
Ann. Phys. (Berlin) 532, 2000334 (2020)
arXiv:2012.05781 [quant-ph] (2020)
Zoom link: https://pitp.zoom.us/j/92924188227?pwd=ODJYQXVoaUtzZmZIdFlmcUNIV3Rmdz09
Cihan Okay Bilkent University
Classical simulation algorithms provide a rigorous ground for investigating the quantum resources responsible for quantum speedup. In my talk, I will consider one such algorithm provided by Lambda polytopes. These polytopes are defined as the polar duals of the stabilizer polytopes and can be used to provide a hidden-variable model for finite-dimensional quantum theory. This hidden-variable model can be turned into a classical algorithm that can simulate any quantum computation. The efficiency of this algorithm depends on the combinatorial structure of the polytope. In general, which subset of the vertices gives rise to efficient simulation is an open problem. I will describe some of the known classes of vertices and available methods for studying this polytope.
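For context, the Lambda polytope for a system of n qubits can be written (my paraphrase; conventions may differ slightly from the talk) as

\Lambda_n = \{ X = X^\dagger \;:\; \mathrm{Tr}(X) = 1,\ \mathrm{Tr}(X\sigma) \ge 0 \ \text{for all pure stabilizer states } \sigma \},

so every n-qubit density matrix \rho lies in \Lambda_n and can be expanded as a convex mixture \rho = \sum_\alpha p_\alpha A_\alpha of its vertices A_\alpha; the probabilities p_\alpha supply the hidden-variable model on which the classical simulation algorithm is built.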
Zoom link: https://pitp.zoom.us/j/95216680309?pwd=aGlIN2NtZVRtczdHcXl5RzgzQTlOdz09
In the context of irreversible dynamics, the meaning of the reverse of a physical evolution can be quite ambiguous. It is a standard choice to define the reverse process using Bayes' theorem, but, in general, this is not optimal with respect to the relative entropy of recovery. In this work we explore whether it is possible to characterise an optimal reverse map, building on the concept of state retrieval maps. In doing so, we propose a set of principles that state retrieval maps should satisfy. We find that the Bayes-inspired reverse is just one case in a whole class of possible choices, which can be optimised to give a map retrieving the initial state more precisely than the Bayes rule. Our analysis has the advantage of extending naturally to the quantum regime. In fact, we find a class of reverse transformations containing the Petz recovery map as a particular case, corroborating its interpretation as a quantum analogue of Bayesian retrieval.
Finally, we present numerical evidence showing that by adding a single extra axiom one can single out, for classical dynamics, the usual reverse process derived from Bayes' theorem.
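For reference (standard definitions, not specific to this work): classically, the Bayes-inspired reverse of a stochastic channel p(y|x) with prior p(x) is

\hat{p}(x|y) = \frac{p(y|x)\, p(x)}{\sum_{x'} p(y|x')\, p(x')},

and its most common quantum counterpart, the Petz recovery map of a channel \mathcal{N} with respect to a reference state \sigma, is

\mathcal{P}_{\sigma,\mathcal{N}}(\rho) = \sigma^{1/2}\, \mathcal{N}^\dagger\!\left( \mathcal{N}(\sigma)^{-1/2}\, \rho\, \mathcal{N}(\sigma)^{-1/2} \right) \sigma^{1/2},

which the talk situates as one member of a larger class of state retrieval maps.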
Zoom link: https://pitp.zoom.us/j/93589286500?pwd=dkZuRzR0SlhVd1lPdGNOZWFYQWtRZz09
Zhu-Xi Luo Harvard University
Experimental realizations of long-range entangled states such as quantum spin liquids are challenging due to numerous complications in solid-state materials. Digital quantum simulators, on the other hand, have recently emerged as a promising platform for controllably simulating exotic phases. I will talk about a constructive design of long-range entangled states in this setting, which exploits competing measurements as a new source of frustration to generate a spin liquid. Specifically, we consider random projective measurements of the anisotropic interactions in the Kitaev honeycomb model. The monitored trajectories can produce analogues of the two phases of the original Kitaev model: (i) a topologically ordered phase with area-law entanglement and two protected logical qubits, and (ii) a "critical" phase with a logarithmic violation of area-law entanglement and long-range tripartite entanglement. A Majorana parton description permits an analytic understanding of these two phases through a classical loop model. Extensive numerical simulations of the monitored dynamics confirm our analytic predictions. This talk is based on https://arxiv.org/abs/2207.02877.
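For orientation, the interactions being measured are the bond terms of the Kitaev honeycomb Hamiltonian (standard form, included here only for context; the measurement protocol itself is described in the paper above):

H_K = -J_x \sum_{\langle ij\rangle_x} \sigma_i^x \sigma_j^x \;-\; J_y \sum_{\langle ij\rangle_y} \sigma_i^y \sigma_j^y \;-\; J_z \sum_{\langle ij\rangle_z} \sigma_i^z \sigma_j^z,

where \langle ij\rangle_\alpha denotes the \alpha-type links of the honeycomb lattice; in the monitored setting the two-body operators \sigma_i^\alpha \sigma_j^\alpha are measured projectively at random rather than summed into a Hamiltonian.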
Zoom link: https://pitp.zoom.us/j/99600719755?pwd=a0pOWlliU0swVDdGYnhxaGFGNkJSdz09
Indrajit Sen Chapman University
Non-normalizable quantum states are usually discarded as mathematical artefacts in quantum mechanics. However, such states naturally occur in quantum gravity as solutions to physical constraints. This suggests reconsidering the interpretation of such states. Some of the existing approaches to this question seek to redefine the inner product, but this arguably leads to further challenges.
In this talk, I will propose an alternative interpretation of non-normalizable states using pilot-wave theory. First, I will argue that the basic conceptual structure of the theory contains a straightforward interpretation of these states. Second, to better understand such states, I will discuss non-normalizable states of the quantum harmonic oscillator from a pilot-wave perspective. I will show that, contrary to intuitions from orthodox quantum mechanics, the non-normalizable eigenstates and their superpositions are bound states in the sense that the pilot-wave velocity field v_y → 0 as y → ±∞. Third, I will introduce a new notion of equilibrium, named pilot-wave equilibrium, and use it to define physically meaningful equilibrium densities for such states. I will show, via an H-theorem, that an arbitrary initial density with compact support relaxes to pilot-wave equilibrium at a coarse-grained level, under assumptions similar to those for relaxation to quantum equilibrium. I will conclude by discussing the implications for pilot-wave theory, quantum gravity and quantum foundations in general.
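For reference, the velocity field referred to above is the standard de Broglie-Bohm guidance law, written here for a single configuration coordinate y (notation mine, not the speaker's):

v_y = \frac{\hbar}{m}\,\mathrm{Im}\!\left(\frac{\partial_y \psi}{\psi}\right) = \frac{1}{m}\,\partial_y S, \qquad \psi = |\psi|\, e^{iS/\hbar},

so the bound-state claim is that this velocity vanishes at large |y| even though \psi itself is not normalizable.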
Based on:
I. Sen. "Physical interpretation of non-normalizable harmonic oscillator states and relaxation to pilot-wave equilibrium" arXiv:2208.08945 (2022)
Zoom link: https://pitp.zoom.us/j/93736627504?pwd=VGtxZE5rTFdnT1dqZlFRWTFvWlFQUT09
Quantum correlations in general and quantum entanglement in particular embody both our continued struggle towards a foundational understanding of quantum theory as well as the latter’s advantage over classical physics in various information processing tasks. Consequently, the problems of classifying (i) quantum states from more general (non-signalling) correlations, and (ii) entangled states within the set of all quantum states, are at the heart of the subject of quantum information theory.
In this talk I will present two recent results (from https://journals.aps.org/pra/abstract/10.1103/PhysRevA.106.062420 and https://arxiv.org/abs/2207.00024) that shed new light on these problems, by exploiting a surprising connection with time in quantum theory:
First, I will sketch a solution to problem (i) for the bipartite case, which identifies a key physical principle obeyed by quantum theory: quantum states preserve local time orientations—roughly, the unitary evolution in local subsystems.
Second, I will show that time orientations are intimately connected with quantum entanglement: a bipartite quantum state is separable if and only if it preserves arbitrary local time orientations. As a variant of Peres's well-known entanglement criterion, this provides a solution to problem (ii).
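For comparison, the standard statement of Peres's criterion (included only as background) is

\rho_{AB} \ \text{separable} \;\Longrightarrow\; \rho_{AB}^{T_B} \ge 0,

with the converse holding for 2×2 and 2×3 systems; the result above recasts this characterization in terms of the preservation of arbitrary local time orientations.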
Zoom link: https://pitp.zoom.us/j/97607837999?pwd=cXBYUmFVaDRpeFJSZ0JzVmhSajdwQT09
Indrajit Sen Chapman University
Superdeterminism has received a recent surge of attention in the foundations community. A particular superdeterministic proposal, named Invariant-set theory, appears to bring ideas from several diverse fields (e.g., number theory and chaos theory) to quantum foundations and provides a novel justification for the choice of initial conditions in terms of state-space geometry. However, the lack of a concrete hidden-variable model makes it difficult to evaluate the proposal from a foundational perspective.
In this talk, I will critically analyse this superdeterministic proposal in three steps. First, I will show how to build a hidden-variable model based on the proposal's ideas. Second, I will analyse the properties of the model and show that several arguments that appear to work in the proposal (on counterfactual measurements, non-commutativity, etc.) fail when considered in the model. Further, the model is not only superdeterministic but also nonlocal and $\psi$-ontic, and it contains redundant information in its bit-string. Third, I will discuss the accuracy of the model in representing the proposal. I will consider the arguments put forward to claim inaccuracy and show that they are incorrect. My results lend further support to the view that superdeterminism is unlikely to solve the puzzle posed by the Bell correlations.
Based on the papers:
1. I. Sen. "Analysis of the superdeterministic Invariant-set theory in a hidden-variable setting." Proc. R. Soc. A 478.2259 (2022): 20210667.
2. I. Sen. "Reply to superdeterminists on the hidden-variable formulation of Invariant-set theory." arXiv:2109.11109 (2021).
Zoom link: https://pitp.zoom.us/j/99415427245?pwd=T3NOWUxKTENnMThRVEd3ZTRzU3ZKZz09
Selman Ipek Bilkent University
Central to many of the paradoxes arising in quantum theory is that the act of measurement cannot be understood as merely revealing the pre-existing values of some hidden variables, a phenomenon known as contextuality. In the past few years quantum contextuality has been formalized in a variety of ways: operation-theoretic, sheaf-theoretic, (hyper)graph-theoretic, and cohomological. In this seminar we will discuss the simplicial approach to contextuality introduced in arXiv:2204.06648, which builds on the earlier sheaf-theoretic approach of Abramsky-Brandenburger (arXiv:1102.0264) and the cohomological approach of Okay et al. (arXiv:1701.01888). In the simplicial approach, measurement scenarios and their statistics can be modeled topologically as simplices using the theory of simplicial sets. The connection to topology provides an additional analytical handle, allowing for a rigorous study of both state-dependent and state-independent contextuality. Using this formalism we present a novel topological proof of Fine's theorem characterizing noncontextuality in Bell scenarios.
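As background, Fine's theorem in the simplest (2,2,2) Bell scenario states that a local hidden-variable (noncontextual) model exists, equivalently a joint distribution over all four observables reproducing the observed marginals, if and only if the eight CHSH inequalities are satisfied, e.g.

\left| \langle A_0 B_0 \rangle + \langle A_0 B_1 \rangle + \langle A_1 B_0 \rangle - \langle A_1 B_1 \rangle \right| \le 2,

together with the inequalities obtained by moving the minus sign; the talk recasts this characterization in topological terms.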
Zoom link: https://pitp.zoom.us/j/93748699892?pwd=SVhVaTdoRmlwaGdCZVdIWVlKTktjQT09
Chris Waddell Perimeter Institute for Theoretical Physics
A large class of Λ < 0 cosmologies have big-bang / big crunch spacetimes with time-symmetric backgrounds and asymptotically AdS Euclidean continuations suggesting a possible holographic realization. We argue that these models generically have time-dependent scalar fields, and these can lead to realistic cosmologies at the level of the homogeneous background geometry, with an accelerating phase prior to the turnaround and crunch. We first demonstrate via explicit effective field theory examples that models with an asymptotically AdS Euclidean continuation can also exhibit a period of accelerated expansion without fine tuning. We then show that certain significantly more tuned examples can give predictions arbitrarily close to a ΛCDM model. Finally, we demonstrate via an explicit construction that the potentials of interest can arise from a superpotential, thus suggesting that these solutions may be compatible with an underlying supersymmetric theory.
This talk is based on arXiv:2212.00050.
Zoom link: https://pitp.zoom.us/j/93499736007?pwd=Qmw5cmZERUN3UmtwTzdKcEdXejJ5UT09
Zixia Wei Kyoto University
Typicality, the feature that almost any microstate living in the microcanonical subspace cannot be locally distinguished from a thermal ensemble, lies at the foundation of statistical physics. However, one may wonder whether there exists a sufficient number of orthogonal atypical states to account for the whole entropy.
In this talk, we show that in some physical systems there exist enough orthogonal atypical states of a certain kind to account for the leading order of the entropy, in the following two scenarios. In each case this is done by finding suitable upper bounds on the entanglement of formation (EoF) and applying other techniques from quantum information theory.
In the first scenario, the physical systems under consideration are AdS black holes in the semiclassical limit G_N → 0. In this case, the microcanonical subspace is the subspace formed by the black hole microstates, and typical states are usually considered to have a smooth horizon as well as a black hole interior. We consider a class of atypical states called disentangled states, whose entanglement deficits relative to typical states are so large that they cannot have smooth horizons. In this scenario, we use a geometric quantity called the entanglement wedge cross section to give upper bounds on the EoF.
In the second scenario, we consider generic quantum many-body systems with short-ranged interactions in the standard thermodynamic limit V → ∞. In this case, it is known that typical microstates have volume-law entanglement. We consider area-law entangled microstates as atypical states. We use the reflected entropy to give upper bounds on the EoF.
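Both bounds refer to the entanglement of formation; for completeness, the standard definition (not specific to this talk) is

E_F(\rho_{AB}) = \min_{\{p_i, |\psi_i\rangle\}} \sum_i p_i\, S\!\left(\mathrm{Tr}_B |\psi_i\rangle\langle\psi_i|\right), \qquad \rho_{AB} = \sum_i p_i |\psi_i\rangle\langle\psi_i|,

where S is the von Neumann entropy and the minimization runs over all pure-state decompositions of \rho_{AB}; the entanglement wedge cross section and the reflected entropy serve as upper bounds on this quantity in the two scenarios respectively.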
We will also discuss the relations of our results with the additivity conjectures and atypical black hole microstate counting.
This talk is based on arXiv:2211.11787.
Zoom link: https://pitp.zoom.us/j/94823687961?pwd=V3QrSjUrTklheG1iU0RsckwzQmRTUT09
Isaac Friend University of Oxford
In the new wave of quantum foundations activity with its indirect approach to problems of fundamental ontology, individual explicit positions of informational immaterialism are replaced by a shared "soft informatic realism" that governs research practice, encouraging conflation of theories of information processes and theories of physical processes, and disregard for the microphysical dynamics effecting a given information process. This kind of abstraction, indispensable in the formulation of enlightening no-go theorems, can become problematic when imported to certain other projects, including recently popular investigations of quantum causal structure. I shall provide examples, describe ramifications for the efficiency of knowledge production in quantum foundations, and consider when features of quantum information processing can legitimately be called informatic features of quantum physics.
Zoom link: https://pitp.zoom.us/j/93415836509?pwd=MXJLZVVzMnZjcWFQSWM0dmg5czE3dz09
Ningping Cao University of Waterloo
Error-correcting codes were invented to correct errors on noisy communication channels. Quantum error correction (QEC), however, has a wider range of uses, including information transmission, quantum simulation/computation, and fault tolerance. These invite us to rethink QEC, in particular the role that quantum physics plays in encoding and decoding. The fact that many quantum algorithms, especially near-term hybrid quantum-classical algorithms, only use limited types of local measurements on quantum states has led to various new techniques called Quantum Error Mitigation (QEM). We examine the task of QEM from several perspectives. Using intuitions built upon classical and quantum communication scenarios, we clarify some fundamental distinctions between QEC and QEM. We then discuss the implications of noise invertibility for QEM and give an explicit construction, called the Drazin inverse, for non-invertible noise, which is trace-preserving while the commonly used Moore-Penrose pseudoinverse may not be. Finally, we study the consequences of having imperfect knowledge about system noise and derive conditions under which noise can be reduced using QEM.
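To make the distinction above concrete, here is a minimal sketch (my own illustration, not the speakers' code) that computes the Drazin inverse of a non-invertible noise matrix via the identity A^D = A^k (A^(2k+1))^+ A^k, valid for any k at least the index of A, and compares it with the Moore-Penrose pseudoinverse:

import numpy as np

def drazin_inverse(A, rcond=1e-12):
    # Drazin inverse via A^D = A^k (A^(2k+1))^+ A^k with k >= index(A).
    # Taking k = n is always sufficient, since the index of an n x n
    # matrix never exceeds n.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    Ak = np.linalg.matrix_power(A, n)
    core = np.linalg.pinv(np.linalg.matrix_power(A, 2 * n + 1), rcond=rcond)
    return Ak @ core @ Ak

# Toy non-invertible "noise": a singular column-stochastic matrix.
N = np.array([[1.0, 1.0],
              [0.0, 0.0]])

D = drazin_inverse(N)   # [[1, 1], [0, 0]]   (columns still sum to 1)
P = np.linalg.pinv(N)   # [[0.5, 0], [0.5, 0]] (columns sum to 1 and 0)

print("Drazin inverse:\n", D)
print("Moore-Penrose pseudoinverse:\n", P)
# In this toy example the Drazin inverse keeps probability vectors
# normalized (every column sums to 1), whereas the pseudoinverse does
# not, illustrating the trace-preservation point made in the abstract.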
Zoom link: https://pitp.zoom.us/j/91543402893?pwd=b09IS3VWNk5KZi8ya3gzSmRKRFJidz09