Are we Living in the Matrix?
David Tong University of Cambridge
No. Obviously not. It's a daft question. But, buried underneath this daft question is an extremely interesting one: is it possible to simulate the known laws of physics on a computer? Remarkably, there is a mathematical theorem, due to Nielsen and Ninomiya, that says the answer is no. I'll explain this theorem, the underlying reasons for it, and some recent work attempting to circumvent it.
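As a rough illustration of where the obstruction comes from (a textbook example added here, not wording from the abstract): naively discretizing a single massless fermion on a lattice of spacing a turns the linear dispersion into
\[
E(k) \;=\; \frac{\sin(k a)}{a}, \qquad k \in \left(-\frac{\pi}{a}, \frac{\pi}{a}\right],
\]
which vanishes at k = \pi/a as well as at k = 0; the extra zero is a "doubler" of opposite chirality, and the Nielsen-Ninomiya theorem promotes this to a no-go result for putting chiral fermions, such as those of the Standard Model, on a lattice with locality and exact chiral symmetry.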
Proof Complexity
Sam Buss (UC San Diego)
These videos provide an introduction to proof complexity, especially from the point of view of satisfiability algorithms. There are four videos. Part A introduces proof complexity, and discusses Frege proofs, abstract proof systems, resolution, and extended Frege proofs and extended resolution. Part B discusses the propositional pigeonhole principle, and upper and lower bounds on the complexity of proofs of the pigeonhole principle in the extended Frege proof system, the Frege proof systems, and resolution. Part C discusses the CDCL satisfiability algorithms from the point of view of proof complexity, including discussion of clause learning, trivial resolution, unit propagation, restarts, and RUP and (D)RAT proof traces. Part D discusses cutting planes, the Nullstellensatz and Polynomial Calculus proof systems, and concludes with a short discussion of automatizability. Parts B and C are independent of each other. Part D has a modest dependency on Part B, but can also be watched independently.
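As a concrete companion to Part B (a sketch added here, not part of the videos), the propositional pigeonhole principle with n+1 pigeons and n holes is typically encoded as the following unsatisfiable CNF, with a variable x_{i,j} meaning "pigeon i sits in hole j":

# Sketch: generate the (unsatisfiable) CNF for the pigeonhole principle PHP^{n+1}_n.
# Variables are numbered 1..(n+1)*n in DIMACS style: var(i, j) <-> "pigeon i in hole j".
def php_clauses(n):
    def var(i, j):            # pigeon i in {0..n}, hole j in {0..n-1}
        return i * n + j + 1
    clauses = []
    # Every pigeon is in at least one hole.
    for i in range(n + 1):
        clauses.append([var(i, j) for j in range(n)])
    # No two pigeons share a hole.
    for j in range(n):
        for i1 in range(n + 1):
            for i2 in range(i1 + 1, n + 1):
                clauses.append([-var(i1, j), -var(i2, j)])
    return clauses

if __name__ == "__main__":
    cls = php_clauses(3)        # PHP^4_3: 4 pigeons, 3 holes
    print(len(cls), "clauses")  # 4 "at least one hole" clauses + 3*6 conflict clauses = 22

Resolution refutations of these formulas require exponential size (Haken's theorem), while extended Frege admits polynomial-size proofs, which is the kind of separation Part B is about.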
Correlators in integrable models with Separation of Variables
Nikolay Gromov King's College London
I will review recent progress in the application of the separation of variables (SoV) method, in particular the construction for integrable spin chains with gl(N) symmetry. By finding, for the first time, the matrix elements of the SoV measure explicitly, I will show how to compute various correlation functions and wave-function overlaps in a simple determinant form. I will also discuss the general philosophy of applying these methods to problems related to AdS/CFT, N=4 SYM, etc.
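Only as a schematic illustration of the kind of formula meant by "a simple determinant form" (the precise expressions are in the talk and the underlying papers, not reproduced here): once the SoV measure \mu is known, overlaps of two states characterized by Q-functions Q_A and Q_B take the shape
\[
\langle \Psi_A \mid \Psi_B \rangle \;\propto\; \det_{1\le j,k\le N}\Big[\int dx\,\mu(x)\, x^{\,j+k-2}\, Q_A(x)\, Q_B(x)\Big],
\]
with correlation functions obtained from similar determinants built on the explicit matrix elements of the measure.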
A 2020s Vision of CMB Lensing
Marius Millea University of California, Berkeley
With much of the cosmological information in the primary CMB having already been mined, the next decade of CMB observations will revolve around the secondary CMB lensing effect, which will touch nearly all aspects of observation in some way. At the same time, the increasingly low noise levels of these future observations will render existing "quadratic estimator" methods for analyzing CMB lensing obsolete. This leaves us in an exciting place where new methods need to be developed to fully take advantage of the upcoming generation of CMB data just on our doorstep. I will describe my work developing such new lensing analysis tools, made possible by Bayesian methods, modern statistical techniques, and ideas borrowed from machine learning. I will present the recent first-ever application of such methods to data (from the South Pole Telescope; https://arxiv.org/abs/2012.01709) and discuss prospects for this analysis in the future with regard to not just lensing but also primordial B modes, reionization, and extragalactic foreground fields.
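Schematically, and in notation chosen here rather than fixed by the abstract, the Bayesian alternative amounts to working with the joint posterior of the unlensed CMB fields f and the lensing potential \phi given data d,
\[
\mathcal{P}(f,\phi \mid d) \;\propto\; \exp\!\Big[-\tfrac{1}{2}\big(d-\mathbb{L}(\phi)f\big)^{\dagger} N^{-1}\big(d-\mathbb{L}(\phi)f\big) \;-\; \tfrac{1}{2}\, f^{\dagger} C_f^{-1} f \;-\; \tfrac{1}{2}\, \phi^{\dagger} C_\phi^{-1}\phi\Big],
\]
where \mathbb{L}(\phi) lenses (and applies beam and transfer functions to) the fields and N, C_f, C_\phi are the noise and prior covariances; the quadratic estimator corresponds to a leading-order treatment of this posterior, which is what stops being adequate at the low noise levels discussed above.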
SAT-Solving
Armin Biere (Johannes Kepler University)
This tutorial focuses on explaining the most important aspects of the search loop in modern SAT solvers. It is an online BBC talk, i.e., blackboard and code, switching in an interleaved manner between explaining details on a virtual blackboard and reviewing and using code. The code part features the new SAT solver Satch, developed from scratch for this particular occasion. It is now available at https://github.com/arminbiere/satch. We start with an introduction to encoding problems into conjunctive normal form, the input format of SAT solvers, and then delve into search-based complete algorithms for SAT solving, from DPLL to CDCL and all its modern concepts, including the implication graph, decision heuristics (VSIDS and VMTF), restarts, as well as clause database reduction, and then end with a closer look at clause and watch data structures and how they are updated during Boolean constraint propagation.
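To make the DPLL-to-CDCL part concrete, here is a minimal, illustrative DPLL solver with unit propagation (a toy sketch in the spirit of the tutorial, not code from Satch; it omits clause learning, VSIDS/VMTF, restarts, and watched literals):

# Toy DPLL with unit propagation on CNF given as lists of DIMACS-style integer literals.
def unit_propagate(clauses, assignment):
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue                      # clause already satisfied
            unassigned = [l for l in clause
                          if l not in assignment and -l not in assignment]
            if not unassigned:
                return None                   # conflict: clause falsified
            if len(unassigned) == 1:          # unit clause forces its literal
                assignment.add(unassigned[0])
                changed = True
    return assignment

def dpll(clauses, assignment=frozenset()):
    assignment = unit_propagate(clauses, set(assignment))
    if assignment is None:
        return None
    free = {abs(l) for c in clauses for l in c} - {abs(l) for l in assignment}
    if not free:
        return assignment                     # every clause satisfied
    v = min(free)                             # naive decision heuristic
    return dpll(clauses, assignment | {v}) or dpll(clauses, assignment | {-v})

if __name__ == "__main__":
    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    print(dpll([[1, 2], [-1, 3], [-2, -3]]))

CDCL extends this basic loop with clause learning from conflicts, activity-based decision heuristics, restarts, and the two-watched-literal scheme for fast propagation, which is exactly the material the later parts of the tutorial cover.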
Electric Multipole Insulators
Taylor Hughes University of Illinois Urbana-Champaign
In this talk I will present a general framework to distinguish different classes of charge insulators based on whether they insulate or conduct higher multipole moments (dipole, quadrupole, etc.). This formalism applies to generic many-body systems that support multipolar conservation laws. Applications of this work provide a key link between recently discovered higher-order topological phases and fracton phases of matter.
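For orientation (a standard way of phrasing multipolar conservation, in notation chosen here rather than the speaker's): if a system conserves both its total charge and its total dipole moment, the charge current must itself be a divergence, J_i = \partial_j J_{ij}, so the continuity equation becomes
\[
\partial_t \rho + \partial_i \partial_j J_{ij} = 0,
\]
and a dipole insulator can then be characterized by the absence of bulk dipole currents J_{ij}; iterating the construction for quadrupole and higher moments gives the hierarchy of multipolar conservation laws the framework refers to.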
SAT-Centered Complexity Theory
Valentine Kabanets (Simon Fraser University)
From the early 1970s until now, SAT has been the central problem in Complexity Theory, inspiring many research directions. In the tutorial, I hope to show why SAT is such a favorite with complexity theorists by talking about classical and modern results that involve SAT or its close relatives. We'll talk about NP-completeness, the polynomial-time hierarchy, interactive proofs, PCPs, as well as (circuit) lower bounds, the Exponential-Time Hypothesis, and learning.
Resource theories of communication
Hlér Kristjánsson Université de Montréal
A series of recent works has shown that placing communication channels in a coherent superposition of alternative configurations can boost their ability to transmit information. Instances of this phenomenon are the advantages arising from the use of communication devices in a superposition of alternative causal orders, and those arising from the transmission of information along a superposition of alternative trajectories. The relation among these advantages has been the subject of recent debate, with some authors claiming that the advantages of the superposition of orders could be reproduced, and even surpassed, by other forms of superpositions. To shed light on this debate, we develop a general framework of resource theories of communication. In this framework, the resources are communication devices, and the allowed operations are (a) the placement of communication devices between the communicating parties, and (b) the connection of communication devices with local devices in the parties' laboratories. The allowed operations are required to satisfy the minimal condition that they do not enable communication independently of the devices representing the initial resources. The resource-theoretic analysis reveals that the aforementioned criticisms of the superposition of causal orders were based on an uneven comparison between different types of quantum superpositions, exhibiting different operational features.
Ref. https://iopscience.iop.org/article/10.1088/1367-2630/ab8ef7
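As a concrete illustration of the kind of advantage under debate (a sketch added here, not code from the paper): composing two completely depolarizing qubit channels inside a quantum SWITCH, with the ordering controlled coherently by a qubit, leaves residual correlations between control and target, whereas either fixed ordering destroys all information. A minimal numerical check, assuming the standard SWITCH Kraus operators W_ij = (A_i B_j) (x) |0><0|_c + (B_j A_i) (x) |1><1|_c:

# Sketch: two fully depolarizing qubit channels in a quantum SWITCH (control in |+>).
# Known effect (Ebler-Salek-Chiribella): the output is NOT the maximally mixed state,
# so some information survives, unlike for any fixed ordering of the two channels.
import numpy as np

paulis = [np.eye(2), np.array([[0, 1], [1, 0]]),
          np.array([[0, -1j], [1j, 0]]), np.array([[1, 0], [0, -1]])]
A = [p / 2 for p in paulis]          # Kraus operators of the depolarizing channel
B = [p / 2 for p in paulis]

P0 = np.array([[1, 0], [0, 0]])      # |0><0| on the control
P1 = np.array([[0, 0], [0, 1]])      # |1><1| on the control
plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # control state |+><+|

def switch_output(rho_target):
    rho = np.kron(rho_target, plus)  # ordering: target (x) control
    out = np.zeros((4, 4), dtype=complex)
    for Ai in A:
        for Bj in B:
            W = np.kron(Ai @ Bj, P0) + np.kron(Bj @ Ai, P1)
            out += W @ rho @ W.conj().T
    return out

rho_in = np.array([[1, 0], [0, 0]])  # input |0><0|
out = switch_output(rho_in)
print(np.round(out.real, 3))         # deviates from I/4: control-target correlations remain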
Specification, Verification and Synthesis in Cyberphysical Systems
Ufuk Topcu (University of Texas at Austin)
Cyberphysical systems are roughly characterized as systems enabled by coordination between computational and physical components and resources. They appear in a vast range of applications. Most applications of cyberphysical systems are subject to strict requirements for safety, security, and privacy, to name a few. Formal methods for specification, verification, and synthesis have the potential to provide the languages, tools, and discipline necessary to meet these strict requirements. On the other hand, this potential can be realized only through proper connections between formal methods and several other fields. This tutorial will provide an overview of the complications in the context of cyberphysical systems that may benefit, and have benefited, from formal methods. It will provide examples of problems whose solution heavily relies on formal methods: correct-by-construction synthesis of hierarchical control protocols; synthesis of strategies under limitations on information availability; and verifiability of learning-enabled components.
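As a toy illustration of what "correct-by-construction synthesis" means in the simplest setting (an example added here, not taken from the tutorial), one can compute, on a finite transition system with controllable inputs, the largest set of states from which a controller can keep the system inside a safe set forever, via a standard fixed-point iteration:

# Toy safety synthesis: compute the maximal controlled-invariant subset of `safe`
# for a finite transition system trans[state][input] -> set of successor states.
# A state is winning if some input keeps ALL its successors inside the current winning set.
def controlled_invariant(states, inputs, trans, safe):
    win = set(safe)
    while True:
        new_win = {s for s in win
                   if any(trans[s][u] and trans[s][u] <= win for u in inputs)}
        if new_win == win:
            return win              # fixed point: a safe controller exists from these states
        win = new_win

if __name__ == "__main__":
    states, inputs = {0, 1, 2, 3}, {"a", "b"}
    trans = {
        0: {"a": {0}, "b": {1}},
        1: {"a": {0, 2}, "b": {2, 3}},   # from state 1 every input may slip into state 2
        2: {"a": {2}, "b": {2}},
        3: {"a": {1}, "b": {0}},
    }
    safe = {0, 1, 3}                     # state 2 is unsafe
    # {0, 3}: state 1 is dropped because no input can guarantee avoiding state 2
    print(controlled_invariant(states, inputs, trans, safe))

Realistic synthesis problems replace this explicit enumeration with symbolic methods and richer temporal-logic specifications, but the underlying fixed-point structure is the same.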
Unreasonable effectiveness of methods from theoretical computer science in quantum many-body physics
Anurag Anshu Harvard University
A central challenge in quantum many-body physics is the characterization of properties of 'natural' quantum states, such as the ground states and Gibbs states of a local Hamiltonian. The area-law conjecture, which postulates a remarkably simple structure of entanglement in gapped ground states, has resisted a resolution based on information-theoretic methods. We discuss how the right set of insights may come, quite unexpectedly, from polynomial approximations to Boolean functions. Towards this, we describe a 2D sub-volume law for frustration-free locally-gapped ground states and highlight a pathway that could lead to an area law. Similar polynomial approximations have consequences for entanglement in Gibbs states and lead to the first provably linear-time algorithm to simulate Gibbs states in 1D. Next, we consider the task of learning a Hamiltonian from a Gibbs state, where many-body entanglement obstructs rigorous algorithms. Here, we find that the effects of entanglement can again be controlled using tools from computer science, namely, strong convexity and sufficient statistics.
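The flavor of "polynomial approximations to Boolean functions" can be illustrated with a standard construction (a sketch added here, not from the talk): a degree-O(sqrt(n)) polynomial that approximates AND on n bits, built from a Chebyshev polynomial evaluated on the Hamming weight. Low-degree approximations of this kind are the combinatorial input behind the entanglement bounds mentioned above.

# Sketch: Chebyshev-based low-degree approximation of AND_n, viewed as a function of
# the Hamming weight k (AND_n = 1 iff k = n). The rescaled Chebyshev polynomial
#   p(k) = T_d(2k/(n-1) - 1) / T_d(2n/(n-1) - 1)
# satisfies p(n) = 1 and |p(k)| <= 1/T_d(1 + 2/(n-1)) for 0 <= k <= n-1,
# which is already small for degree d on the order of sqrt(n).
import numpy as np
from numpy.polynomial.chebyshev import chebval

n = 100
d = int(np.ceil(3 * np.sqrt(n)))             # degree O(sqrt(n))
coeffs = np.zeros(d + 1)
coeffs[d] = 1.0                              # coefficients of T_d in the Chebyshev basis

def p(k):
    return chebval(2 * k / (n - 1) - 1, coeffs) / chebval(2 * n / (n - 1) - 1, coeffs)

ks = np.arange(n + 1)
vals = p(ks)
print("p(n) =", vals[-1])                                 # equals 1 by construction
print("max |p(k)| for k < n:", np.abs(vals[:-1]).max())   # small: a valid approximation of AND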
Towards Lorentzian quantum gravity via effective spin foams
Bianca Dittrich Perimeter Institute for Theoretical Physics
Euclidean quantum gravity approaches have a long history but suffer from a number of severe issues. This gives a strong motivation to develop Lorentzian approaches. Spin foams constitute an important such approach and incorporate a rigorously derived discrete area spectrum. I will explain how this discrete area spectrum is connected to the appearance of an anomaly, which explains the significance of the Barbero-Immirzi parameter and forces an extension of the quantum configuration space to also include torsion degrees of freedom. This can be understood as a defining characteristic of the spin foam approach, and provides a pathway to an (experimental) falsification.
All these features are captured in the recently constructed effective spin foam model, which is much more amenable to numerical calculations than previous models. I will present numerical results that a) show that spin foams do impose the correct equations of motion, b) highlight the influence of the anomaly, and c) underline the difference to Euclidean quantum gravity. I will close with an outlook on the features that can be studied with a truly Lorentzian model, e.g. topology change.
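For reference, the discrete area spectrum referred to here is the loop-quantum-gravity one (quoted in its standard form for orientation, not taken from the abstract), in which the Barbero-Immirzi parameter \gamma sets the scale:
\[
A_j \;=\; 8\pi \gamma\, \ell_{\rm P}^{2}\, \sqrt{j(j+1)}, \qquad j = 0, \tfrac{1}{2}, 1, \tfrac{3}{2}, \dots
\]
The mismatch between this discrete spectrum and the constraints one would like to impose exactly is what surfaces as the anomaly discussed in the talk.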
Minding the Gap: Lessons from LIGO-Virgo’s Biggest Black Holes
Maya Fishbach Canadian Institute for Theoretical Astrophysics (CITA)
Models for black hole formation from stellar evolution predict the existence of a pair-instability supernova mass gap in the range ~50 to ~120 solar masses. The binary black holes of LIGO-Virgo's first two observing runs supported this prediction, showing evidence for a dearth of component black hole masses above 45 solar masses. Meanwhile, among the 30+ new observations from the third observing run, there are several black holes that appear to sit above the 45 solar mass limit. I will discuss how these unexpectedly massive black holes fit into our understanding of the binary black hole population. The data are consistent with several scenarios, including a mass distribution that evolves with redshift and the possibility that the most massive binary black hole, GW190521, straddles the mass gap, containing an intermediate-mass black hole heavier than 120 solar masses.
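The population-level statements here rest on hierarchical Bayesian inference; schematically (standard form, in notation chosen here), the posterior on population parameters \Lambda given catalog events \{d_i\} is
\[
p(\Lambda \mid \{d_i\}) \;\propto\; \pi(\Lambda)\, \prod_{i=1}^{N_{\rm obs}} \frac{\int \mathcal{L}(d_i \mid \theta)\, p(\theta \mid \Lambda)\, d\theta}{\xi(\Lambda)}, \qquad \xi(\Lambda) = \int P_{\rm det}(\theta)\, p(\theta \mid \Lambda)\, d\theta,
\]
where \theta are per-event parameters (masses, redshift, spins) and \xi(\Lambda) corrects for selection effects; statements such as a dearth of component masses above 45 solar masses, or a mass distribution that evolves with redshift, are statements about the inferred p(\theta \mid \Lambda).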