Entropy modulo p and quantum information
Maris Ozols University of Amsterdam
George Moreno Instituto Federal do Rio Grande do Norte (IFRN)
Sutapa Saha Indian Statistical Institute
Vojtěch Havlíček IBM (United States)
Hsin-Yuan Huang California Institute of Technology (Caltech)
Nicetu Tibau Vidal University of Oxford
Lorenzo Catani Chapman University
Jacopo Surace Perimeter Institute for Theoretical Physics
Elie Wolfe Perimeter Institute for Theoretical Physics
Robert Spekkens Perimeter Institute for Theoretical Physics
Steffen Lauritzen University of Copenhagen
Robin Evans University of Oxford
Mirjam Weilenmann Institute for Quantum Optics and Quantum Information (IQOQI) - Vienna
Marina Maciel Ansanelli Perimeter Institute for Theoretical Physics
Maris Ozols University of Amsterdam
Tom Leinster recently introduced a curious notion of entropy modulo p (https://arxiv.org/abs/1903.06961). While entropy has a certain meaning in information theory and physics, mathematically it is simply a function with certain properties. When these properties are stated as axioms, the function is unique. Surprisingly, Leinster shows that a function obeying the same axioms can also be found for "probability distributions" over a finite field, and this function is unique too.
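To make this concrete, here is a minimal Python sketch (an illustration added here, not part of the abstract) that computes the mod-p entropy of a tuple of residues summing to 1 from integer lifts and checks the chain rule that characterizes it; the lift formula is my reading of arXiv:1903.06961 and its sign convention may differ from the paper, but the chain-rule check is insensitive to that.

def entropy_mod_p(pi, p):
    """Entropy mod p of a tuple of residues mod p summing to 1 (mod p)."""
    assert sum(pi) % p == 1
    a = list(pi)
    a[0] -= sum(a) - 1            # choose integer lifts whose sum is exactly 1
    total = sum(x ** p for x in a)
    # Fermat's little theorem guarantees that (1 - total) is divisible by p.
    return ((1 - total) // p) % p

def chain_rule_holds(w, blocks, p):
    """Check H(w composed with blocks) == H(w) + sum_i w_i * H(blocks[i])  (mod p)."""
    composite = [(wi * x) % p for wi, block in zip(w, blocks) for x in block]
    lhs = entropy_mod_p(composite, p)
    rhs = (entropy_mod_p(w, p)
           + sum(wi * entropy_mod_p(b, p) for wi, b in zip(w, blocks))) % p
    return lhs == rhs

p = 5
print(entropy_mod_p([3, 3], p))                          # prints 3, independent of the choice of lifts
print(chain_rule_holds([2, 4], [[3, 3], [1, 2, 3]], p))  # True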
In quantum information, mutually unbiased bases are an important set of measurements and an example of a quantum design. While in odd prime power dimensions their construction is based on a finite field, in dimension 2^n it relies on an unpleasant Galois ring. I will replace this ring by length-2 Witt vectors, whose arithmetic involves only finite field operations and Leinster's entropy mod 2. This expresses qubit mutually unbiased bases entirely in terms of a finite field and allows deriving an explicit unitary correspondence between them and the affine plane over this field.
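As a toy illustration of the arithmetic involved (mine, not the construction from the talk, which works over F_{2^n} rather than F_2): length-2 Witt vectors over F_2 reproduce Z/4, with an addition law whose carry is the polynomial (x^p + y^p - (x+y)^p)/p reduced mod p, the same shape of expression as the entropy formula above, which suggests how entropy mod 2 can enter the arithmetic.

def witt_add(a, b, p=2):
    """Addition of length-2 Witt vectors; the carry polynomial is computed over the integers, then reduced mod p."""
    a0, a1 = a
    b0, b1 = b
    carry = ((a0 ** p + b0 ** p - (a0 + b0) ** p) // p) % p
    return ((a0 + b0) % p, (a1 + b1 + carry) % p)

def witt_mul(a, b, p=2):
    """Multiplication of length-2 Witt vectors (the p*a1*b1 term vanishes mod p)."""
    a0, a1 = a
    b0, b1 = b
    return ((a0 * b0) % p, (a0 ** p * b1 + a1 * b0 ** p) % p)

def to_z4(w):
    """Identify the Witt vector (a0, a1) over F_2 with a0 + 2*a1 in Z/4."""
    a0, a1 = w
    return (a0 + 2 * a1) % 4

# Check that W_2(F_2) reproduces the ring Z/4 on all pairs of elements.
elems = [(x, y) for x in range(2) for y in range(2)]
assert all(to_z4(witt_add(a, b)) == (to_z4(a) + to_z4(b)) % 4 for a in elems for b in elems)
assert all(to_z4(witt_mul(a, b)) == (to_z4(a) * to_z4(b)) % 4 for a in elems for b in elems)
print("W_2(F_2) reproduces Z/4 arithmetic")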
Zoom link: https://pitp.zoom.us/j/94032116379?pwd=TTI1RnByQnFuVHp1MytFUlJxckM4Zz09
George Moreno Instituto Federal do Rio Grande do Norte (IFRN)
Nonclassicality, as witnessed by the incapacity of Classical Causal Theory (CCT) to explain a system's behavior given its causal structure, has become one of the hottest topics in Quantum Foundations over the last decades, a movement motivated both by its vast range of practical applications and by the powerful insights it provides about the rules of the quantum world. Among the many attempts at understanding and quantifying this phenomenon, we highlight the idea of asking how far the causal structure associated with a given system would have to be relaxed in order for CCT to explain its nonclassical behavior. More recently, we showed that the relaxation demanded to explain the behavior of a subset of variables in a given experiment may not be allowed by the embedding causal structure once the behavior of the remaining variables is taken into account, which led to a new way of witnessing nonclassicality. In this seminar, we discuss a new way of quantifying this incompatibility and possible generalizations of this approach to different scenarios.
Zoom link: https://pitp.zoom.us/j/96051313203?pwd=bFNhOEhkSVhXWk8yd1hVZWVVa0U4UT09
Sutapa Saha Indian Statistical Institute
In spite of its immense importance in present-day information technology, the foundational aspects of quantum theory (QT) remain elusive. In particular, there is no set of physically motivated axioms that can explain why the Hilbert space formalism is the only natural choice for describing the microscopic world. Hence, to shed light on the unique formalism of QT, two different operational frameworks will be described within the setting of various convex operational theories. The first refers to a kinematical symmetry principle, proposed from the perspective of single-copy state discrimination; it will be shown that this symmetry holds for both classical theory and QT – two successful descriptions of the physical world. On the other hand, studying a wide range of convex operational theories, namely the General Probabilistic Theories (GPTs) with polygonal state spaces, we observe the absence of such symmetry. Thus, the principle deserves its own importance in marking a sharp distinction between physical and unphysical theories. Thereafter, a distributed computing scenario will be introduced for which all convex theories other than QT turn out to be equivalent to the classical one, even though these theories possess more exotic state and effect spaces. We have coined this particular operational framework 'Distributed computation with limited communication' (DCLC). Furthermore, it will be shown that the distributed computational strength of quantum communication is justified in terms of a stronger version of this task, namely 'Delayed choice distributed computation with limited communication' (DC2LC). The proposed task thus provides a new approach to operationally single out quantum theory in the theory-space and hence promises a novel perspective towards the axiomatic derivation of Hilbert space quantum mechanics.
References:
Phys. Rev. A 100, 060101(R) (2019)
Ann. Phys. (Berlin) 532, 2000334 (2020)
arXiv:2012.05781 [quant-ph] (2020)
Zoom link: https://pitp.zoom.us/j/92924188227?pwd=ODJYQXVoaUtzZmZIdFlmcUNIV3Rmdz09
Vojtěch Havlíček IBM (United States)
Kronecker coefficients appear in the representation theory of the symmetric group, in the decomposition of tensor products of irreducible representations. They are notoriously difficult to compute, and it is a long-standing open problem to find a combinatorial expression for them.
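For reference, the standard character formula for these coefficients (textbook background, not specific to this work): $g_{\lambda\mu\nu}$ is the multiplicity of the irreducible $S_n$-representation $V_\nu$ in $V_\lambda \otimes V_\mu$, and equals $g_{\lambda\mu\nu} = \frac{1}{n!} \sum_{\sigma \in S_n} \chi^\lambda(\sigma)\,\chi^\mu(\sigma)\,\chi^\nu(\sigma)$; the terms are easy to write down, but no positive combinatorial interpretation of the sum is known.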
We study the problem of computing Kronecker coefficients from a quantum computational perspective. First, we show that the coefficients can be expressed as the dimension of a subspace given by the intersection of two commuting, efficiently implementable projectors, and we relate their computation to the recently introduced quantum approximate counting class (QAPC). Using a similar construction, we show that deciding positivity of Kronecker coefficients is contained in QMA. We give similar results for the related problem of approximating row sums in the character table of the symmetric group and show that its decision variant is in QMA. We then discuss two quantum algorithms: one that samples a distribution over squared characters, and another that approximates normalized Kronecker coefficients to inverse-polynomial additive error. We show that, under a conjecture about the average-case hardness of computing Kronecker coefficients, the resulting distribution is hard to sample from classically.
Our work explores new structures for quantum algorithms and an improved characterization of quantum approximate counting.
Joint work with David Gosset, Sergey Bravyi, Anirban Chowdhury and Guanyu Zhu
Zoom link: https://pitp.zoom.us/j/95976938016?pwd=eDV3TXZReHo5UHdvZ0hIbkhXOFcxQT09
Cihan Okay Bilkent University
Classical simulation algorithms provide a rigorous ground for investigating the quantum resources responsible for quantum speedup. In my talk, I will consider one such algorithm provided by Lambda polytopes. These polytopes are defined to be the polar duals of the stabilizer polytopes and can be used to provide a hidden variable model for finite-dimensional quantum theory. This hidden variable model can be turned into a classical algorithm that can simulate any quantum computation. The efficiency of this algorithm depends on the combinatorial structure of the polytope. In general, which subset of the vertices gives rise to efficient simulation is an open problem. I will describe some of the known classes of vertices and available methods for studying this polytope.
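As rough orientation on the duality (my paraphrase of the polar-dual construction; the talk may use a different normalization): the Lambda polytope can be described as the set of Hermitian operators $X$ with $\mathrm{Tr}(X) = 1$ and $\mathrm{Tr}(X\,|\sigma\rangle\langle\sigma|) \ge 0$ for every pure stabilizer state $|\sigma\rangle$. Every density matrix then lies inside it and can be expanded as $\rho = \sum_\alpha p_\alpha A_\alpha$ over the vertices $A_\alpha$, and the classical simulation samples $\alpha$ and propagates it through the circuit; the cost is governed by how many vertices are involved and how hard the update rules are, which is why the combinatorial structure of the polytope matters.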
Zoom link: https://pitp.zoom.us/j/95216680309?pwd=aGlIN2NtZVRtczdHcXl5RzgzQTlOdz09
Hsin-Yuan Huang California Institute of Technology (Caltech)
We present an efficient machine learning (ML) algorithm for predicting any unknown quantum process over n qubits. For a wide range of distributions D on arbitrary n-qubit states, we show that this ML algorithm can learn to predict any local property of the output from the unknown process, with a small average error over input states drawn from D. The ML algorithm is computationally efficient even when the unknown process is a quantum circuit with exponentially many gates. Our algorithm combines efficient procedures for learning properties of an unknown state and for learning a low-degree approximation to an unknown observable. The analysis hinges on proving new norm inequalities, including a quantum analogue of the classical Bohnenblust-Hille inequality, which we derive by giving an improved algorithm for optimizing local Hamiltonians. Overall, our results highlight the potential for ML models to predict the output of complex quantum dynamics much faster than the time needed to run the process itself.
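For context, the classical inequality being generalized (stated here as background from the Boolean-function literature, not quoted from the paper): for a degree-$d$ polynomial $f:\{-1,1\}^n \to [-1,1]$ with Fourier coefficients $\hat{f}(S)$, the Bohnenblust-Hille inequality gives $\big(\sum_S |\hat{f}(S)|^{2d/(d+1)}\big)^{(d+1)/(2d)} \le C_d$ for a constant depending only on $d$, so a bounded low-degree function has few large Fourier coefficients and can be learned from relatively few samples; the quantum analogue plays the same role for the Pauli coefficients of a bounded low-degree observable.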
Zoom link: https://pitp.zoom.us/j/93857777354?pwd=c044blZuQVhLS200ME4vN25uaGJudz09
Nicetu Tibau Vidal University of Oxford
In this talk, I present the latest work on anyonic information theory and how it is linked to aspects of quantum foundations. First, the theory of 2+1D non-abelian anyons will be introduced. The newly discovered notion of anyonic creation operators will be presented, as well as their use as local elements of reality within the Deutsch-Hayden interpretation of quantum mechanics. Lastly, I will show strange properties of anyonic entanglement that appear due to the lack of a tensor product structure, such as the different spectra of the marginals of a bipartite system. This property makes the von Neumann entropy a bad entanglement measure. I will explain the challenges of defining entanglement measures for anyonic systems and current approaches.
Zoom link: https://pitp.zoom.us/j/99863263804?pwd=MUhkYTBzcUlwTmJ0Z3F4aFo3Rkt6QT09
Lorenzo Catani Chapman University
No-go theorems (Bell, Kochen-Specker, …) formally show the departure of quantum theory from classical theory. These are formulated in the framework of ontological models and, if one accepts such a framework, entail that quantum theory involves problematic (“fine-tuned”) properties. I will argue that the lesson to take from the no-go theorems is to abandon the framework of ontological models as the way to model reality. I will analyze what I believe to be the unnatural assumptions of this framework and I will propose a way to change it. The basic principle of the new notion of reality I propose is that for something to exist is for something to be recorded. I will motivate the principle and explore its consequences. In order to implement this proposal in a precise, theory-independent mathematical framework I will make use of point-free topological spaces (locales). I will discuss why this new proposal is promising for understanding quantum theory and I will present several open questions.
Zoom link: https://pitp.zoom.us/j/91292006884?pwd=V2EzaEw5Z3NRUGd4cVdSRnlOOWFVZz09
Jacopo Surace Perimeter Institute for Theoretical Physics
In the context of irreversible dynamics, the meaning of the reverse of a physical evolution can be quite ambiguous. It is a standard choice to define the reverse process using Bayes' theorem, but, in general, this is not optimal with respect to the relative entropy of recovery. In this work we explore whether it is possible to characterise an optimal reverse map, building on the concept of state retrieval maps. In doing so, we propose a set of principles that state retrieval maps should satisfy. We find that the Bayes-inspired reverse is just one case in a whole class of possible choices, which can be optimised to give a map retrieving the initial state more precisely than the Bayes rule. Our analysis has the advantage of extending naturally to the quantum regime. In fact, we find a class of reverse transformations containing the Petz recovery map as a particular case, corroborating its interpretation as a quantum analogue of Bayesian retrieval.
Finally, we present numerical evidence showing that by adding a single extra axiom one can single out, for classical dynamics, the usual reverse process derived from Bayes' theorem.
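For orientation, the two reversal notions mentioned above have standard forms (quoted as background, not as the paper's definitions): classically, the Bayes reverse of a channel $P(y|x)$ with prior $q(x)$ is $\hat{P}(x|y) = P(y|x)\,q(x) \,/\, \sum_{x'} P(y|x')\,q(x')$, while in the quantum case the Petz recovery map of a channel $\mathcal{N}$ with respect to a prior state $\sigma$ is $\mathcal{P}_{\sigma,\mathcal{N}}(X) = \sigma^{1/2}\,\mathcal{N}^{\dagger}\big(\mathcal{N}(\sigma)^{-1/2}\, X\, \mathcal{N}(\sigma)^{-1/2}\big)\,\sigma^{1/2}$, which reduces to the Bayes reverse on classical (diagonal) inputs.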
Zoom link: https://pitp.zoom.us/j/93589286500?pwd=dkZuRzR0SlhVd1lPdGNOZWFYQWtRZz09
Indrajit Sen Chapman University
Non-normalizable quantum states are usually discarded as mathematical artefacts in quantum mechanics. However, such states naturally occur in quantum gravity as solutions to physical constraints. This suggests reconsidering the interpretation of such states. Some of the existing approaches to this question seek to redefine the inner product, but this arguably leads to further challenges.
In this talk, I will propose an alternative interpretation of non-normalizable states using pilot-wave theory. First, I will argue that the basic conceptual structure of the theory contains a straightforward interpretation of these states. Second, to better understand such states, I will discuss non-normalizable states of the quantum harmonic oscillator from a pilot-wave perspective. I will show that, contrary to intuitions from orthodox quantum mechanics, the non-normalizable eigenstates and their superpositions are bound states in the sense that the pilot-wave velocity field $v_y \to 0$ at large $\pm y$. Third, I will introduce a new notion of equilibrium, named pilot-wave equilibrium, and use it to define physically meaningful equilibrium densities for such states. I will show, via an H-theorem, that an arbitrary initial density with compact support relaxes to pilot-wave equilibrium at a coarse-grained level, under assumptions similar to those for relaxation to quantum equilibrium. I will conclude by discussing the implications for pilot-wave theory, quantum gravity and quantum foundations in general.
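For readers unfamiliar with the framework (standard pilot-wave background, not specific to this talk): a configuration $y$ guided by a wavefunction $\psi(y,t)$ moves with velocity $v_y = (\hbar/m)\,\mathrm{Im}(\partial_y \psi / \psi)$, and quantum equilibrium refers to an ensemble density equal to $|\psi|^2$, which is unavailable when $\psi$ is non-normalizable; the pilot-wave equilibrium introduced in the talk is intended to play that role in the non-normalizable regime.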
Based on:
I. Sen. "Physical interpretation of non-normalizable harmonic oscillator states and relaxation to pilot-wave equilibrium" arXiv:2208.08945 (2022)
Zoom link: https://pitp.zoom.us/j/93736627504?pwd=VGtxZE5rTFdnT1dqZlFRWTFvWlFQUT09
Recently we have seen exciting results at the intersection of quantum foundations and the statistical analysis of causal hypotheses by virtue of the centrality of latent variable models to both fields.
In this workshop we will explore how academics from both sides can move the shared frontiers forward. Towards that end, we are including extensive breakout collaboration opportunities in addition to formal presentations. In order to make concrete progress on problems pertinent to both communities, we have selected the topic of causal models with restricted cardinality of the latent variables as a special focus for this workshop.
Territorial Land Acknowledgement
Perimeter Institute acknowledges that it is situated on the traditional territory of the Anishinaabe, Haudenosaunee, and Neutral peoples.
Perimeter Institute is located on the Haldimand Tract. After the American Revolution, the tract was granted by the British to the Six Nations of the Grand River and the Mississaugas of the Credit First Nation as compensation for their role in the war and for the loss of their traditional lands in upstate New York. Of the 950,000 acres granted to the Haudenosaunee, less than 5 percent remains Six Nations land. Only 6,100 acres remain Mississaugas of the Credit land.
We thank the Anishinaabe, Haudenosaunee, and Neutral peoples for hosting us on their land.
Indrajit Sen Chapman University
Superdeterminism has received a recent surge of attention in the foundations community. A particular superdeterministic proposal, named Invariant-set theory, appears to bring ideas from several diverse fields (e.g., number theory and chaos theory) to quantum foundations and provides a novel justification for the choice of initial conditions in terms of state-space geometry. However, the lack of a concrete hidden-variable model makes it difficult to evaluate the proposal from a foundational perspective.
In this talk, I will critically analyse this superdeterministic proposal in three steps. First, I will show how to build a hidden-variable model based on the proposal's ideas. Second, I will analyse the properties of the model and show that several arguments that appear to work in the proposal (on counterfactual measurements, non-commutativity, etc.) fail when considered in the model. Further, the model is not only superdeterministic but also nonlocal and $\psi$-ontic, and it contains redundant information in its bit-string. Third, I will discuss the accuracy of the model in representing the proposal. I will consider the arguments put forward to claim inaccuracy and show that they are incorrect. My results lend further support to the view that superdeterminism is unlikely to solve the puzzle posed by the Bell correlations.
Based on the papers:
1. I. Sen. "Analysis of the superdeterministic Invariant-set theory in a hidden-variable setting." Proc. R. Soc. A 478.2259 (2022): 20210667.
2. I. Sen. "Reply to superdeterminists on the hidden-variable formulation of Invariant-set theory." arXiv:2109.11109 (2021).
Zoom link: https://pitp.zoom.us/j/99415427245?pwd=T3NOWUxKTENnMThRVEd3ZTRzU3ZKZz09