Does ignorance of the whole imply ignorance of the parts?
Stephanie Wehner Delft University of Technology
Communication cost vs Bell inequality violation
Marc Kaplan Université de Montréal
Guess your neighbor input
Antonio Acin Institute of Photonic Sciences (ICFO)
Uncertainty, nonlocality & complementarity
Jonathan Oppenheim University College London
Part 2: An introduction to the pure-spinor formalism for the superstring
Giuseppe Policastro École Normale Supérieure - PSL
Non-contextual correlations in probabilistic models
Andreas Winter University of Bristol
An introduction to the pure-spinor formalism for the superstring - Part 1
Giuseppe Policastro École Normale Supérieure - PSL
A Quantum-Digital Universe
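Before the abstract: the talk describes the Dirac equation as the free flow of information on a computational network. As a purely illustrative, hedged sketch (a textbook-style two-component quantum walk, not the speaker's construction; all parameter names are assumptions), the following toy code evolves a 1+1-dimensional walk of the kind whose small-mass, long-wavelength limit reproduces Dirac dynamics, and checks the unitarity that underlies the "free flow of information" reading:

```python
import numpy as np

# Toy 1+1D "Dirac quantum walk": a two-component field on N sites.
# One step = a local unitary coin (mass mixing) followed by opposite
# shifts of the two components (free flow of information).
N = 64            # number of lattice sites (periodic)
m = 0.1           # dimensionless mass parameter, |m| <= 1

n = np.sqrt(1 - m**2)
coin = np.array([[n, 1j * m],
                 [1j * m, n]])   # unitary local mixing (mass term)

def step(psi):
    """One walk step: coin on each site, then chiral shifts."""
    psi = psi @ coin.T                    # mix the two components locally
    out = np.empty_like(psi)
    out[:, 0] = np.roll(psi[:, 0], +1)    # right-moving component
    out[:, 1] = np.roll(psi[:, 1], -1)    # left-moving component
    return out

# Localized initial condition, then evolve.
psi = np.zeros((N, 2), dtype=complex)
psi[N // 2, 0] = 1.0
for _ in range(50):
    psi = step(psi)

# Unitarity: total probability is preserved at every step.
print(np.sum(np.abs(psi)**2))   # stays 1.0 up to float rounding
```

The coin matrix is unitary for any |m| <= 1, so the walk conserves probability exactly; the massless case m = 0 reduces to pure left/right transport.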
David Deutsch reformulated the Church-Turing thesis as a physical principle, asserting that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means". Such a principle can be regarded as a new theoretical paradigm, whereby the whole of physics emerges from a quantum computation. But for a theory to be a good one, it must explain a large class of phenomena on the basis of a few general principles. Taking as a general principle the topological homogeneity of the computational network, with graph dimension equal to the space-time dimension, corresponds to replacing quantum field theory (QFT) with a numerable set of quantum systems in local interaction. This means regarding QFT as a kind of Fermi-scale "thermodynamic" limit of a deeper Planck-scale theory, with the quantum field replaced by a giant quantum computer. In the talk, I will illustrate mechanisms of emergence of physics from the quantum computation in 1+1 dimensions. We will see that the Dirac equation is simply the equation describing the free flow of information, leading to an informational definition of inertial mass and of the Planck constant. I will then illustrate the mechanism by which Minkowskian space-time emerges from the computation, how the field Hamiltonian comes out, and how quantum fields are actually eliminated in favor of qubits. We will see that the digital nature of the field leads to an in-principle observable consequence in the form of a mass-dependent refraction index of the vacuum, with the information becoming stationary at the Planck mass. Such a refraction index of the vacuum is a general phenomenon due to unitarity in the discrete, and can also help in solving the speed-of-light isotropy conundrum posed by digitalization of the field in more than one space dimension. We will also see how the quantum nature of the processed information plays a crucial role in other practical informational issues, e.g. the possibility of driving the information in different directions without the need of increasing the complexity of the circuit. Finally, I will briefly comment on gravity as emergent from the quantum computation, and on the connection with the Verlinde-Jacobson approach.

Does ignorance of the whole imply ignorance of the parts?
Stephanie Wehner Delft University of Technology
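A minimal classical companion to the abstract below (the distribution and numbers here are illustrative, not the paper's construction): for classical random variables, Shannon entropy is subadditive, H(XY) <= H(X) + H(Y), so large ignorance of the whole forces nontrivial ignorance of at least one part — exactly the intuition the talk shows fails in quantum theory:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# An illustrative joint distribution over two bits X, Y.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

px = pxy.sum(axis=1)   # marginal of X
py = pxy.sum(axis=0)   # marginal of Y

# Subadditivity H(XY) <= H(X) + H(Y) implies that for two parts
# max(H(X), H(Y)) >= H(XY) / 2: the whole cannot be much more
# uncertain than its most uncertain part, classically.
joint = H(pxy.ravel())
print(joint, H(px), H(py))
```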
A central question in our understanding of the physical world is how our knowledge of the whole relates to our knowledge of the individual parts. One aspect of this question is the following: to what extent does ignorance about a whole preclude knowledge of at least one of its parts? Relying purely on classical intuition, one would certainly be inclined to conjecture that a strong ignorance of the whole cannot come without significant ignorance of at least one of its parts. Indeed, we show that this reasoning holds in any non-contextual hidden variable model (NC-HV). Curiously, however, such a conjecture is false in quantum theory: we provide an explicit example where a large ignorance about the whole can coexist with an almost perfect knowledge of each of its parts. More specifically, we provide a simple information-theoretic inequality satisfied in any NC-HV, but which can be arbitrarily violated by quantum mechanics. Our inequality has interesting implications for quantum cryptography.

Generalised entropies, information causality, and non-local games
We will explore generalisations of the Shannon and von Neumann entropies to other probabilistic theories, and their connection to the principle of information causality. We will also investigate the link between information causality and non-local games, leading to a new quantum bound on computing the inner product non-locally.

Communication cost vs Bell inequality violation
Marc Kaplan Université de Montréal
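A hedged numerical companion to the abstract below (an illustration of the quantity being bounded, not the talk's proof technique): the simplest Bell inequality, CHSH, has local bound 2, and the two-qubit singlet with the textbook optimal measurement angles reaches 2*sqrt(2). The maximal violation can be checked directly:

```python
import numpy as np

# CHSH value of the two-qubit singlet with the standard optimal angles.
# Observables A(a) = cos(a) Z + sin(a) X on each side; the angle
# choices below are the usual textbook optimal settings.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def obs(angle):
    return np.cos(angle) * Z + np.sin(angle) * X

# Singlet state (|01> - |10>)/sqrt(2).
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlator E(a, b) = <psi| A(a) (x) B(b) |psi>."""
    M = np.kron(obs(a), obs(b))
    return float(np.real(psi.conj() @ M @ psi))

a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4
S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
print(abs(S))   # 2*sqrt(2) ≈ 2.828, above the local bound 2
```

In the talk's terms, log2 of this violation ratio lower-bounds nothing by itself; it is the quantity that the theorem relates to the communication cost of simulating the distribution.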
In 1964, John Bell proved that independent measurements on entangled quantum states lead to correlations that cannot be reproduced using local hidden variables. The core of his proof is that such distributions violate some logical constraints known as Bell inequalities. This remarkable result establishes the non-locality of quantum physics. Bell's approach is purely qualitative. This naturally leads to the question of quantifying quantum physics' non-locality. We will specifically consider two quantities introduced for this purpose. The first one is the maximum amount of Bell inequality violation, and the second one is the communication cost of simulating quantum distributions. In this talk, we prove that these two quantities are strongly related: the logarithm of the first is upper bounded by the second. We prove this theorem in the more general context of non-signalling distributions. This generalization gives us two clear benefits. First, the rich structure of the underlying affine space provides us with a very strong intuition. Second, non-signalling distributions capture the traditional communication complexity of Boolean functions. In that case, our theorem is equivalent to the factorization norm lower bound of Linial and Shraibman, for which we give an elementary proof.

Guess your neighbor input
Antonio Acin Institute of Photonic Sciences (ICFO)
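The GYNI game of the abstract below admits a small brute-force check. Under illustrative assumptions matching the standard three-player construction (inputs uniform over bit strings of even parity, each player guessing the next player's bit), no deterministic classical strategy wins with probability above 1/4 — and the talk's point is that quantum correlations do no better:

```python
from itertools import product

# Three players; player i holds bit x[i] and must output a guess of
# x[(i+1) % 3]. Inputs are uniform over bit strings with even parity
# (the promise distribution used in the standard GYNI construction).
inputs = [x for x in product((0, 1), repeat=3) if x[0] ^ x[1] ^ x[2] == 0]

# A deterministic local strategy maps a player's own bit to an output:
# each player's strategy is a pair (output on input 0, output on input 1).
strategies = list(product((0, 1), repeat=2))

best = 0.0
for s0, s1, s2 in product(strategies, repeat=3):
    strat = (s0, s1, s2)
    wins = sum(
        all(strat[i][x[i]] == x[(i + 1) % 3] for i in range(3))
        for x in inputs
    )
    best = max(best, wins / len(inputs))

print(best)   # maximum classical winning probability: 0.25
```

Because shared randomness is a mixture of deterministic strategies, 1/4 is the full classical value; general no-signalling boxes can exceed it, which is what singles this game out.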
We present "guess your neighbor input" (GYNI), a multipartite nonlocal task in which each player must guess the input received by his neighbor. We show that quantum correlations do not perform better than classical ones at this task, for any prior distribution of the inputs. There exist, however, input distributions for which general no-signalling correlations can outperform classical and quantum correlations. Some of the Bell inequalities associated to our construction correspond to facets of the local polytope. We then discuss implications of this game in connection with recent attempts to derive quantum correlations from information-based principles, such as non-trivial communication complexity, information causality and Gleason's theorem. Our results show that truly multipartite concepts are necessary to obtain the set of quantum correlations for an arbitrary number of parties.

Uncertainty, nonlocality & complementarity
Jonathan Oppenheim University College London
Part 2: An introduction to the pure-spinor formalism for the superstring
Giuseppe Policastro École Normale Supérieure - PSL
Higher loop amplitudes and non-minimal formalism

How Fundamental is the Uncertainty Principle?
Renato Renner ETH Zurich
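A numerical companion to the abstract below (an illustration, not the talk's argument): for a qubit, the Maassen-Uffink entropic uncertainty relation H(Z) + H(X) >= 1 bit makes the impossibility quantitative — no preparation makes the outcomes of both complementary measurements certain. A random-state check:

```python
import numpy as np

rng = np.random.default_rng(0)

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-15]
    return float(-np.sum(p * np.log2(p)))

# Sample random pure qubit states and verify H(Z) + H(X) >= 1 bit,
# the Maassen-Uffink bound for the complementary Z and X measurements.
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

worst = np.inf
for _ in range(2000):
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)
    pz = np.abs(v) ** 2                                  # Z-basis probabilities
    px = np.array([abs(plus @ v) ** 2, abs(minus @ v) ** 2])  # X-basis
    worst = min(worst, shannon(pz) + shannon(px))

print(worst)   # never drops below 1.0 bit
```

The bound is saturated by basis states: a Z eigenstate has H(Z) = 0 but H(X) = 1, so certainty in one measurement forces maximal uncertainty in its complement.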
According to quantum theory, it is impossible to prepare the state of a system such that the outcome of every projective measurement on the system can be predicted with certainty. This limitation of predictive power, which is known as the uncertainty principle, is one of the main distinguishing properties of quantum theory when compared to classical theories. In this talk, I will discuss the implications of this principle for foundational questions. In particular, I will consider the hypothesis that the uncertainty principle, rather than (only) telling us something about reality, may be seen as a manifestation of the limitations of our (classical) methods used to describe reality.

Quantum information, the ambiguity of the past, and the complexity of the present
Charles Bennett IBM (United States)
Entanglement provides a coherent view of the physical origin of randomness and the growth and decay of correlations, even in macroscopic systems exhibiting few traditional quantum hallmarks. It helps explain why the future is more uncertain than the past, and how correlations can become macroscopic and classical by being redundantly replicated throughout a system's environment. The most private information, exemplified by a quantum eraser experiment, exists only transiently: after the experiment is over, no record remains anywhere in the universe of what "happened". At the other extreme is information that has been so widely replicated as to be infeasible to conceal and unlikely to be forgotten. But such conspicuous information is exceptional: a comparison of entropy flows into and out of the Earth with estimates of the planet's storage capacity leads to the conclusion that most macroscopic classical information, for example the pattern of drops in last week's rainfall, is impermanent, eventually becoming nearly as ambiguous, from a terrestrial perspective, as the transient result of a quantum eraser experiment. Finally, we discuss prerequisites for a system to accumulate and maintain in its present state, as our world does, a complex and redundant record of at least some features of its past. Not all dynamics and initial conditions lead to this behavior, and in those that do, the behavior itself tends to be temporary, with the system losing its memory, and even its classical character, as it relaxes to thermal equilibrium.

Is the universe exponentially complicated? A no-go theorem for hidden variable interpretations of quantum theory
Jonathan Barrett University of Oxford
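A back-of-envelope sketch of the scaling the abstract below turns on (illustrative arithmetic only): both an n-qubit state vector and a classical probability distribution over n bits take 2^n numbers to write down; the talk's question is whether the underlying reality must be that large:

```python
# Amplitudes needed for an n-qubit pure state vs. probabilities needed
# for a distribution over n classical bits: both grow as 2**n.
# Illustrative storage estimate: 16 bytes per complex double amplitude.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    print(n, amplitudes, f"{amplitudes * 16 / 2**30:.3f} GiB")
```

Already at 40 qubits the state vector outgrows commodity memory, which is why a realist reading of the state vector as a one-to-one description carries the exponential cost the abstract discusses.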
The quantum mechanical state vector is a complicated object. In particular, the amount of data that must be given in order to specify the state vector (even approximately) increases exponentially with the number of quantum systems. Does this mean that the universe is, in some sense, exponentially complicated? I argue that the answer is yes, if the state vector is a one-to-one description of some part of physical reality. This is the case according to both the Everett and Bohm interpretations. But another possibility is that the state vector merely represents information about an underlying reality. In this case, the exponential complexity of the state vector is no more disturbing than that of a classical probability distribution: specifying a probability distribution over N variables also requires an amount of data that is exponential in N. This leaves the following question: does there exist an interpretation of quantum theory such that (i) the state vector merely represents information and (ii) the underlying reality is simple to describe (i.e., not exponential)? Adapting recent results in communication complexity, I will show that the answer is no. Just as any realist interpretation of quantum theory must be non-locally-causal (by Bell's theorem), any realist interpretation must describe an exponentially complicated reality.

Non-contextual correlations in probabilistic models
Andreas Winter University of Bristol
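The graph parameters the abstract below invokes can be made concrete on the smallest interesting example, the 5-cycle (the KCBS scenario); as a hedged sketch: the independence number (the classical, non-contextual value) is 2, while the Lovász number (the quantum value) is sqrt(5), here evaluated from Lovász's closed form for odd cycles:

```python
from itertools import combinations
import math

# 5-cycle C5: vertices 0..4, edges between consecutive vertices.
n = 5
edges = {frozenset((i, (i + 1) % n)) for i in range(n)}

def independent(S):
    """True if no two vertices of S are adjacent."""
    return all(frozenset(e) not in edges for e in combinations(S, 2))

# Independence number alpha(C5): largest pairwise non-adjacent vertex
# set; it plays the role of the classical (non-contextual) bound.
alpha = max(len(S) for r in range(n + 1)
            for S in combinations(range(n), r) if independent(S))

# Lovasz number of an odd cycle: theta(C_n) = n cos(pi/n) / (1 + cos(pi/n));
# for n = 5 this equals sqrt(5), the quantum (KCBS) value.
theta = n * math.cos(math.pi / n) / (1 + math.cos(math.pi / n))

print(alpha, theta)   # 2 and ~2.236
```

The gap alpha = 2 < theta = sqrt(5) is the graph-theoretic face of a non-contextual inequality violated by quantum mechanics.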
Non-contextuality is presented as an abstraction and at the same time a generalisation of locality. Rather than in correlations, the underlying physical model leaves its signature in collections of expectation values, which are constrained by inequalities much like Bell's or Tsirelson's inequalities. These non-contextual inequalities reveal a deep connection to classic topics in graph theory, such as independence numbers, Lovász numbers and other graph parameters. By considering the special case of bi-local experiments, we arrive at a semidefinite relaxation (and indeed a whole hierarchy of such relaxations) for the problem of determining the maximum quantum violation of a given Bell inequality.

An introduction to the pure-spinor formalism for the superstring - Part 1
Giuseppe Policastro École Normale Supérieure - PSL
Pure spinors, BRST cohomology and tree-level amplitudes
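As a hedged aide-mémoire for the lecture topics above (standard formulas of Berkovits' pure-spinor formalism, stated from memory rather than from the lecture notes): the bosonic ghost \lambda^\alpha satisfies the pure-spinor constraint, and physical states are the cohomology of the BRST operator built from the Green-Schwarz constraint d_\alpha:

```latex
% Pure-spinor constraint on the bosonic Weyl-spinor ghost:
\lambda^\alpha \gamma^m_{\alpha\beta} \lambda^\beta = 0 ,
\qquad m = 0, \dots, 9 .
% BRST operator; nilpotency Q^2 = 0 holds because
% \{ d_\alpha , d_\beta \} \propto \gamma^m_{\alpha\beta} \Pi_m
% and the pure-spinor constraint kills the resulting term:
Q = \oint dz \, \lambda^\alpha(z)\, d_\alpha(z) .
% Tree-level amplitudes use the zero-mode measure, normalized as
\langle (\lambda\gamma^m\theta)(\lambda\gamma^n\theta)
        (\lambda\gamma^p\theta)(\theta\gamma_{mnp}\theta) \rangle = 1 .
```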