Convex Programming and Machine Learning in Quantum Information: Complementary Methods for Discovery and Verification
Convex optimization, linear and semidefinite programming in particular, has been a standard tool in quantum information theory, giving certificates of local and quantum correlations, contextuality, and more. Increasingly, similar methods are making headway in quantum many-body physics, giving lower bounds -- and thus certificates -- on the ground state energy. The disadvantage of such methods is that they do not scale well to large system sizes, whether those systems are multiparty Bell scenarios or lattice models with numerous sites. Machine learning is entering the field as the latest buzzword. While it provides a more scalable alternative to convex programming and enables forming new conjectures, the outcome of learning methods remains uncertified. In this talk, I introduce the most important paradigms in machine learning for quantum information theory, give an overview of some earlier work in the field, argue for the importance of certifiable predictions of learning algorithms, and present some of our preliminary results.
Expressiveness in Deep Learning via Tensor Networks and Quantum Entanglement
Nadav Cohen
Three fundamental factors determine the quality of a statistical learning algorithm: expressiveness, generalization and optimization. The classic strategy for handling these factors is relatively well understood. In contrast, the radically different approach of deep learning, which in the last few years has revolutionized the world of artificial intelligence, is shrouded in mystery. This talk will describe a series of works aimed at unraveling some of the mysteries surrounding expressiveness, arguably the most prominent factor behind the success of deep learning. I will begin by showing that state-of-the-art deep learning architectures, such as convolutional networks, can be represented as tensor networks -- a computational model commonly employed in quantum physics. This connection will inspire the use of quantum entanglement for defining measures of data correlations modeled by deep networks. Next, I will turn to a quantum max-flow / min-cut theorem characterizing the entanglement captured by tensor networks. This theorem will give rise to new results that shed light on expressiveness in deep learning, and in addition, provide new tools for deep network design.
Works covered in the talk were in collaboration with Yoav Levine, Or Sharir, David Yakira and Amnon Shashua.
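The entanglement-based correlation measures mentioned above reduce, for a pure bipartite state, to the von Neumann entropy of the Schmidt spectrum. A minimal NumPy sketch (not taken from the talk; the function name and toy states are illustrative):

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entanglement entropy (in nats) of a pure bipartite state.

    psi is a normalized vector of length dim_a * dim_b; the entropy is
    computed from the singular values of its dim_a x dim_b matricization.
    """
    m = psi.reshape(dim_a, dim_b)
    s = np.linalg.svd(m, compute_uv=False)
    p = s**2                 # Schmidt coefficients, a probability vector
    p = p[p > 1e-12]         # drop numerical zeros before taking logs
    return float(-np.sum(p * np.log(p)))

# Product state |00>: no entanglement.
product = np.array([1.0, 0.0, 0.0, 0.0])
# Bell state (|00> + |11>)/sqrt(2): maximal two-qubit entanglement, ln 2 nats.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

print(entanglement_entropy(product, 2, 2))  # ~0.0
print(entanglement_entropy(bell, 2, 2))     # ~0.693 (= ln 2)
```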
Learning the quantum algorithm for state overlap
Lukasz Cincio Los Alamos National Laboratory
Short-depth algorithms are crucial for reducing computational error on near-term quantum computers, for which decoherence and gate infidelity remain important issues. Here we present a machine-learning inspired approach for discovering such algorithms. We apply our method to a ubiquitous primitive: computing the overlap Tr(rho*sigma) between two quantum states rho and sigma. The standard algorithm for this task, known as the Swap Test, is used in many applications such as quantum support vector machines, and, when specialized to rho=sigma, quantifies the Renyi entanglement. Here, we find algorithms that have shorter depths than the Swap Test, including one that has constant depth (independent of problem size). Furthermore, we apply our approach to the hardware-specific connectivity and gate alphabets used by Rigetti's and IBM's quantum computers and demonstrate that the shorter algorithms that we derive significantly reduce the error - compared to the Swap Test - on these computers.
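The target quantity Tr(rho*sigma) is easy to sanity-check classically, which is how candidate circuits can be scored during training. A minimal NumPy sketch of the classical reference computation (not the quantum circuit itself; the states chosen are illustrative):

```python
import numpy as np

def overlap(rho, sigma):
    """Tr(rho @ sigma): the state overlap that the Swap Test estimates on hardware."""
    return float(np.trace(rho @ sigma).real)

# Density matrices for |0> and |+> = (|0> + |1>)/sqrt(2).
ket0 = np.array([[1.0], [0.0]])
ketp = np.array([[1.0], [1.0]]) / np.sqrt(2)
rho = ket0 @ ket0.conj().T
sigma = ketp @ ketp.conj().T

print(overlap(rho, rho))    # 1.0: Tr(rho^2), the purity (second-order Renyi)
print(overlap(rho, sigma))  # 0.5: |<0|+>|^2 for these pure states
```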
Observers as Primitives
Nuriya Nurgalieva ETH Zurich
Let us suppose that we are trying to build a physical theory of the universe; in order to do so, we have to introduce some primitive notions on which the theory will be based. We explore possible candidates for such "primitives": for example, the structure of spacetime, or quantum states. However, examples can be given which show that these notions are not as objective as we would want them to be. The concept of objectivity, on the other hand, is closely linked to that of "an observer"; thus, we can at least assign the observer as a primitive of the theory. Now, agents are themselves physical systems, and we should take this into account when we specify the ground rules of what they can do. On the one hand, we take agents and their communication as a primitive of the theory and then see which concepts can be derived from there. On the other hand, we treat agents as quantum systems themselves and investigate what kind of logic applies to their interpersonal reasoning; for that, we use the Frauchiger-Renner thought experiment [1,2] as a guiding example.
Counterfactual communication protocols
Lev Vaidman Tel Aviv University
The possibility of communicating between spatially separated regions, without even a single photon passing between the two parties, is an amazing quantum phenomenon. The possibility of transmitting one value of a bit in such a way, the interaction-free measurement, has been known for a quarter of a century. Protocols for full communication, including the transmission of unknown quantum states, were proposed only a few years ago, but it was shown that in all these protocols the particle leaves a weak trace in the transmission channel, a trace larger than that left by a single particle passing through the channel. However, a simple modification of these recent protocols eliminates the trace in the transmission channel and makes all of them truly counterfactual.
Time-delocalized quantum subsystems and operations: on the existence of processes with indefinite causal structure in quantum mechanics
Ognyan Oreshkov Université Libre de Bruxelles
It was recently found that it is theoretically possible for there to exist higher-order quantum processes in which the operations performed by separate parties cannot be ascribed a definite causal order. Some of these processes are believed to have a physical realization in standard quantum mechanics via coherent control of the times of the operations. A prominent example is the quantum SWITCH, which was recently demonstrated experimentally. However, up until now, there has been no rigorous justification for the interpretation of such an experiment as a genuine realization of a process with indefinite causal structure as opposed to a simulation of such a process. Where exactly are the local operations of the parties in such an experiment? On what spaces do they act given that their times are indefinite? Can we probe them directly rather than assume what they ought to be based on heuristic considerations? How can we reconcile the claim that these operations really take place, each once as required, with the fact that the structure of the presumed process implies that they cannot be part of any acyclic circuit? Here, I offer a precise answer to these questions: the input and output systems of the operations in such a process are generally nontrivial subsystems of Hilbert spaces that are tensor products of Hilbert spaces associated with different times—a fact that is directly experimentally verifiable. With respect to these time-delocalized subsystems, the structure of the process is one of a circuit with a cycle, which cannot be reduced to a (possibly dynamical) probabilistic mixture of acyclic circuits. This provides, for the first time, a rigorous proof of the existence of processes with indefinite causal structure in quantum mechanics. 
I further show that all bipartite processes that obey a recently proposed unitary extension postulate, together with their unitary extensions, have a physical realization on such time-delocalized subsystems, and provide evidence that even more general processes may be physically admissible. These results unveil a novel structure within quantum mechanics, which may have important implications for physics and information processing.
Relativistic temperature gradients
Jessica Santiago Victoria University of Wellington
Despite being broadly accepted nowadays, temperature gradients in thermal equilibrium states continue to cause confusion, since they naively seem to contradict the laws of classical thermodynamics. In this talk, we will explore the physical meaning behind this concept, specifically discussing the role played by the universality of free fall. We will show that temperature, just like time, is an observer-dependent quantity and discuss why gravity is the only force capable of causing equilibrium thermal gradients without violating any of the laws of thermodynamics. We will also demonstrate that significant care and delicacy are necessary when extending Tolman's results to distinct classes of heat baths in stationary spacetimes.
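The equilibrium gradient at issue is captured by the Tolman-Ehrenfest relation: in a static spacetime, the locally measured temperature satisfies T(x) * sqrt(-g_tt(x)) = const. A hedged numerical sketch for a Schwarzschild exterior, in G = c = 1 units (the mass and temperature values are purely illustrative):

```python
import math

def tolman_local_temperature(T_infinity, r, M):
    """Locally measured equilibrium temperature at Schwarzschild radius r.

    Tolman-Ehrenfest relation: T(r) * sqrt(-g_tt(r)) is constant, with
    -g_tt(r) = 1 - 2M/r in G = c = 1 units. Deeper in the potential well,
    the locally measured temperature is higher.
    """
    return T_infinity / math.sqrt(1.0 - 2.0 * M / r)

M = 1.0          # illustrative mass
T_inf = 100.0    # temperature measured at infinity (arbitrary units)

for r in (3.0, 10.0, 1000.0):
    T = tolman_local_temperature(T_inf, r, M)
    # The redshifted product T * sqrt(-g_tt) is the same at every radius:
    print(r, T, T * math.sqrt(1.0 - 2.0 * M / r))
```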
Models and Tests of Quantum Theory and Gravity
Adrian Kent University of Cambridge
Models that have some but not all features of standard quantum theory can be valuable in several ways, as Bell, Ghirardi-Rimini-Weber-Pearle, Hardy, Spekkens and many others have shown. One is to illuminate quantum theory and shed light on possible reaxiomatisations or reformulations. Another is to suggest experiments that might confirm some untested aspect of quantum theory or point the way to a new theory. I discuss here some models that combine quantum theory and gravity, and experimental tests of them.
From quantum to cognition in pictures
Bob Coecke Quantinuum
For well over a decade, we have developed an entirely pictorial (and formally rigorous!) presentation of quantum theory [*]. At present, experiments are being set up aimed at establishing the age at which children could effectively learn quantum theory in this manner. Meanwhile, the pictorial language has also been successful in the study of natural language, and very recently we have started to apply it to model cognition, where we employ GPT-like models. We present the key ingredients of the pictorial language as well as their interpretation across disciplines. [*] B. Coecke & A. Kissinger (2017) Picturing Quantum Processes: A First Course on Quantum Theory and Diagrammatic Reasoning. Cambridge University Press.
A compositional approach to quantum functions, and the Morita theory of quantum graph isomorphisms
Dominic Verdon University of Oxford
Certain nonlocal games exhibiting quantum advantage, such as the quantum graph homomorphism and isomorphism games, have composable quantum strategies which are naturally interpreted as structure-preserving functions between finite sets. We propose a natural compositional framework for noncommutative finite set theory in which these quantum strategies appear naturally, and which connects nonlocal games with recent work on compact quantum groups. We apply Morita-theoretical machinery within this framework to characterise, classify, and construct quantum strategies for the graph isomorphism game. This is joint work with Benjamin Musto and David Reutter, based on the papers arXiv:1711.07945 and arXiv:1801.09705.
Measures of Preparation Contextuality
Matthew Leifer Chapman University
In a large medical trial, if one obtained a ridiculously small p-value like 10^-12, one would typically move from a plain hypothesis test to trying to estimate the parameters of the effect. For example, one might try to estimate the optimal dosage of a drug or the optimal length of a course of treatment. Tests of Bell and noncontextuality inequalities are hypothesis tests, and typical p-values are much lower than this, e.g. 12-sigma effects are not unheard of and a 7-sigma violation already corresponds to a p-value of about 10^-12. Why then, in quantum foundations, are we still obsessed with proposing and testing new inequalities rather than trying to estimate the parameters of the effect from the experimental data? Here, we will try to do this for preparation contextuality, but will also make some related comments on recent loophole-free Bell inequality tests. We introduce two measures of preparation contextuality: the maximal overlap and the preparation contextuality fraction. The latter is linearly related to the degree of violation of a preparation noncontextuality inequality, so it can be estimated from experimental data. Although the measures are different in general, they can be equal for proofs of preparation contextuality that have sufficient symmetry, such as the timelike analogue of the CHSH scenario. We give the value of these measures for this scenario. Using our result, we can consider parity-epsilon multiplexing, in which Alice must try to communicate two bits to Bob so that he can choose to determine either of them with high probability, but where Alice must ensure that Bob cannot guess the parity of the bits with probability greater than 1/2 + epsilon, and determine the range of epsilon for which there is still an advantage in preparation contextual theories. If time permits, I will make some brief comments on how to robustify experimental tests of this result. Joint work with Eric Freda and David Schmid.
Observables and (no) time in quantum gravity
Bianca Dittrich Perimeter Institute for Theoretical Physics
I will explain the special requirements that observables have to satisfy in quantum gravity and how deeply this affects the notion of time. I will furthermore explore how the search for observables in classical gravity can inform the construction of a quantum theory of gravity.