Einstein's Equivalence principle for superpositions of gravitational fields
Flaminia Giacomini ETH Zurich
The Principle of Equivalence, stating that all laws of physics take their special-relativistic form in any local inertial frame, lies at the core of General Relativity. Because of its fundamental status, this principle could be a very powerful guide in formulating physical laws in regimes where both gravitational and quantum effects are relevant. However, its formulation implicitly presupposes that reference frames are abstracted from classical systems (rods and clocks) and that the spacetime background is well defined. Here, we generalise the Einstein Equivalence Principle to quantum reference frames (QRFs) and to superpositions of spacetimes. We build a unitary transformation to the QRF of a quantum system in curved spacetime, and in a superposition thereof. In both cases, a QRF can be found such that the metric looks locally flat. Hence, one cannot distinguish, with a local measurement, whether the spacetime is flat or curved, or in a superposition of such spacetimes. This transformation identifies a Quantum Local Inertial Frame. These results extend the Principle of Equivalence to QRFs in a superposition of gravitational fields. Verifying this principle may pave a fruitful path to establishing solid conceptual grounds for a future theory of quantum gravity.
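For orientation (my gloss, not part of the abstract), the classical statement being generalised here is that around any event x_0 one can choose coordinates in which the metric is locally Minkowskian,

g_{\mu\nu}(x_0) = \eta_{\mu\nu}, \qquad \partial_\lambda g_{\mu\nu}(x_0) = 0,

so that deviations from flatness enter only at second order, through the curvature. The Quantum Local Inertial Frame of the talk plays the analogous role when the metric itself is in a quantum superposition.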
Tidal heating: a hunt for the horizon
Sayak Datta IUCAA - The Inter-University Centre for Astronomy and Astrophysics
The defining feature of a classical black hole horizon is that it is a "perfect absorber". Any evidence showing otherwise would indicate a departure from the standard black-hole picture. Because of the horizon, black holes in binaries exchange energy with their orbit, and this exchange back-reacts on the orbital evolution. This effect is called tidal heating, and it can be used to test for the presence of a horizon. I will discuss the prospect of tidal heating as a discriminator between black holes and horizonless compact objects, especially supermassive ones, in LISA. I will also discuss a similar prospect for distinguishing between neutron stars and black holes in the LIGO band.
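As a schematic illustration (mine, not the speaker's), the role of the horizon can be phrased through the energy balance of the inspiral: the orbital binding energy is radiated partly to infinity and partly through the horizon,

-\frac{dE_{\rm orbit}}{dt} = \dot{E}_{\infty} + \dot{E}_{H},

and the horizon flux \dot{E}_{H} would be absent or strongly modified for a horizonless compact object. Searching for this extra dissipation channel in the phasing of the waveform is what makes tidal heating a probe of the horizon.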
Pseudo-Boolean Solving and Optimization
Jakob Nordström (University of Copenhagen & Lund University)
Pseudo-Boolean solving is the task of finding a solution to a collection of (linear) pseudo-Boolean constraints, also known as a 0-1 integer linear program, possibly optimizing some linear objective function. This problem can be approached using methods from conflict-driven clause learning (CDCL) SAT solving as in MaxSAT solvers, or "native" pseudo-Boolean reasoning based on variants of the cutting planes proof system, or (mixed) integer linear programming (MIP). The purpose of this tutorial is to provide a brief survey of pseudo-Boolean reasoning, MaxSAT solving, and integer linear programming, focusing on the potential for synergy between these different approaches and highlighting many open problems on both the applied and the theoretical side.
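To make the object of study concrete, here is a minimal, purely illustrative Python sketch (not from the tutorial) of a tiny pseudo-Boolean instance, i.e. a 0-1 integer linear program, checked by brute force; the constraint data and variable names are invented for the example.

from itertools import product

# Each constraint is (coefficients, bound) over 0/1 variables x1..x3,
# read as: sum(c_i * x_i) >= bound.  A clause such as (x1 OR NOT x2)
# becomes the pseudo-Boolean constraint x1 + (1 - x2) >= 1, i.e. x1 - x2 >= 0.
constraints = [
    ([2, 1, 1], 2),    # 2*x1 + x2 + x3 >= 2
    ([1, -1, 0], 0),   # x1 - x2 >= 0   (the clause x1 OR NOT x2)
]
objective = [1, 1, 1]  # minimize x1 + x2 + x3

def satisfies(x):
    return all(sum(c * v for c, v in zip(coeffs, x)) >= bound
               for coeffs, bound in constraints)

# Exhaustive search over all 0/1 assignments; real solvers replace this
# with CDCL-style search, cutting-planes reasoning, or MIP techniques.
best = min((x for x in product((0, 1), repeat=3) if satisfies(x)),
           key=lambda x: sum(o * v for o, v in zip(objective, x)))
print("optimal assignment:", best)   # (1, 0, 0)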
Are we Living in the Matrix?
David Tong University of Cambridge
No. Obviously not. It's a daft question. But buried underneath this daft question is an extremely interesting one: is it possible to simulate the known laws of physics on a computer? Remarkably, there is a mathematical theorem, due to Nielsen and Ninomiya, that says the answer is no. I'll explain this theorem, the underlying reasons for it, and some recent work attempting to circumvent it.
Proof Complexity
Sam Buss (UC San Diego)
These videos provide an introduction to proof complexity, especially from the point of view of satisfiability algorithms. There are four videos. Part A introduces proof complexity, and discusses Frege proofs, abstract proof systems, resolution, and extended Frege proofs and extended resolution. Part B discusses the propositional pigeonhole principle, and upper and lower bounds on the complexity of proofs of the pigeonhole principle in the extended Frege proof system, the Frege proof systems, and resolution. Part C discusses the CDCL satisfiability algorithms from the point of view of proof complexity, including discussion of clause learning, trivial resolution, unit propagation, restarts, and RUP and (D)RAT proof traces. Part D discusses cutting planes, the Nullstellensatz and Polynomial Calculus proof systems, and concludes with a short discussion of automatizability. Parts B and C are independent of each other. Part D has a modest dependency on Part B, but can also be watched independently.
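As a small illustrative aside (mine, not from the lectures), the resolution rule underlying several of the proof systems above derives, from two clauses containing a complementary pair of literals, their resolvent. A minimal Python sketch with invented clause data:

# Clauses are frozensets of integer literals; negation is sign flip (DIMACS style).
def resolve(c1, c2, pivot):
    """Resolve clauses c1 and c2 on the variable |pivot|.

    Requires pivot in c1 and -pivot in c2; returns the resolvent, which is
    satisfied by every assignment satisfying both premises."""
    assert pivot in c1 and -pivot in c2
    return (c1 - {pivot}) | (c2 - {-pivot})

# Example: (x1 OR x2) and (NOT x1 OR x3) resolve on x1 to give (x2 OR x3).
print(sorted(resolve(frozenset({1, 2}), frozenset({-1, 3}), 1)))   # [2, 3]

# Deriving the empty clause (a certificate of unsatisfiability): resolve (x1) with (NOT x1).
print(resolve(frozenset({1}), frozenset({-1}), 1))                 # frozenset()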
Correlators in integrable models with Separation of Variables
Nikolay Gromov King's College London
I will review recent progress in the application of the separation of variables (SoV) method. In particular, I will review the construction for integrable spin chains with gl(N) symmetry. By finding, for the first time, the matrix elements of the SoV measure explicitly, I will show how to compute various correlation functions and wave-function overlaps in a simple determinant form. The general philosophy of applying these methods to problems related to AdS/CFT, N=4 SYM, etc. will also be discussed.
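Schematically (my own reminder of the general SoV philosophy, not a formula from the talk), separation of variables trades a many-body wave function for a product of single-variable Q-functions evaluated at the separated variables, so that overlaps reduce to sums weighted by the SoV measure \mu:

\Psi(x_1,\dots,x_L) \;\propto\; \prod_{k=1}^{L} Q(x_k), \qquad \langle \Psi_A | \Psi_B \rangle \;=\; \sum_{\mathsf{x}} \mu(\mathsf{x}) \prod_{k} Q_A(x_k)\, Q_B(x_k).

Knowing the measure \mu explicitly is what allows such sums to be recast in determinant form.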
A 2020s Vision of CMB Lensing
Marius Millea University of California, Berkeley
With much of the cosmological information in the primary CMB having already been mined, the next decade of CMB observations will revolve around the secondary CMB lensing effect, which will touch nearly all aspects of observation in some way. At the same time, the increasingly low noise levels of these future observations will render existing "quadratic estimator" methods for analyzing CMB lensing obsolete. This leaves us in an exciting place where new methods need to be developed to fully take advantage of the upcoming generation of CMB data just on our doorstep. I will describe my work developing such new lensing analysis tools, made possible by Bayesian methods, modern statistical techniques, and ideas borrowed from machine learning. I will present the recent first-ever application of such methods to data (from the South Pole Telescope; https://arxiv.org/abs/2012.01709) and discuss prospects for this analysis in the future with regard to not just lensing but also primordial B modes, reionization, and extragalactic foreground fields.
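As a rough sketch of what "Bayesian lensing" means here (my paraphrase, not the speaker's notation), one works with a joint posterior over the lensing potential \phi and the unlensed CMB fields f given the observed data d, rather than compressing the data through a quadratic estimator:

\mathcal{P}(f, \phi \,|\, d) \;\propto\; \mathcal{P}\big(d \,|\, \mathbb{L}(\phi) f\big)\, \mathcal{P}(f)\, \mathcal{P}(\phi),

where \mathbb{L}(\phi) denotes the lensing operation. Sampling or maximizing this posterior is what becomes both necessary and advantageous at the low noise levels of upcoming surveys.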
SAT-Solving
Armin Biere (Johannes Kepler University)
This tutorial focuses on explaining the most important aspects of the search loop in modern SAT solvers. It is an online BBC talk, i.e., black board and code, switching between a virtual black board to explain details and reviewing and using code in an interleaved manner. The code part features the new SAT solver Satch, developed from scratch for this particular occasion. It is now available at https://github.com/arminbiere/satch. We start with an introduction to encoding problems into conjunctive normal form, the input format of SAT solvers, and then delve into search-based complete algorithms for SAT solving, from DPLL to CDCL and all its modern concepts, including the implication graph, decision heuristics (VSIDS and VMTF), restarts, as well as clause database reduction, and then end with a closer look at clause and watching data structures and how they are updated during Boolean constraint propagation.
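To give a flavour of the two ingredients named first, encoding into conjunctive normal form and propagation, here is a tiny illustrative Python sketch (mine, unrelated to the Satch code base) with an invented formula:

# CNF in DIMACS-style integer literals: clause [1, -2] means (x1 OR NOT x2).
clauses = [[1], [-1, 2], [-2, 3]]   # x1,  x1 -> x2,  x2 -> x3

def unit_propagate(clauses, assignment):
    """Repeatedly assign the literal of any clause that has exactly one
    unassigned literal and no satisfied literal (Boolean constraint propagation)."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(lit in assignment for lit in clause):
                continue                       # clause already satisfied
            unassigned = [lit for lit in clause if -lit not in assignment]
            if not unassigned:
                return None                    # conflict: clause falsified
            if len(unassigned) == 1:
                assignment.add(unassigned[0])  # forced (unit) literal
                changed = True
    return assignment

print(unit_propagate(clauses, set()))   # {1, 2, 3}: x1 = x2 = x3 = True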
Electric Multipole Insulators
Taylor Hughes University of Illinois Urbana-Champaign
In this talk I will present a general framework to distinguish different classes of charge insulators based on whether they insulate or conduct higher multipole moments (dipole, quadrupole, etc.). This formalism applies to generic many-body systems that support multipolar conservation laws. Applications of this work provide a key link between recently discovered higher-order topological phases and fracton phases of matter.
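For orientation (my addition, with the quadrupole form stated only schematically), the many-body dipole moment of a periodic system can be defined through Resta's formula, and frameworks of this kind rest on analogous many-body operators for higher moments:

P \;=\; \frac{e}{2\pi}\,\operatorname{Im}\ln \langle \Psi |\, e^{\,2\pi i \hat{X}/L} \,| \Psi \rangle, \qquad q_{xy} \;\sim\; \operatorname{Im}\ln \langle \Psi |\, e^{\,2\pi i \hat{X}\hat{Y}/(L_x L_y)} \,| \Psi \rangle,

where \hat{X} = \sum_j \hat{x}_j \hat{n}_j (and \hat{X}\hat{Y} \to \sum_j \hat{x}_j \hat{y}_j \hat{n}_j) weights positions by charge; whether such moments can flow or are locked is what distinguishes the insulator classes.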
SAT-Centered Complexity Theory
Valentine Kabanets (Simon Fraser University)
From the early 1970s until now, SAT has been the central problem in Complexity Theory, inspiring many research directions. In the tutorial, I hope to show why SAT is such a favorite with complexity theorists, by talking about classical and modern results that involve SAT or its close relatives. We'll talk about NP-completeness, the polynomial-time hierarchy, interactive proofs, PCPs, as well as (circuit) lower bounds, the Exponential-Time Hypothesis, and learning.
Resource theories of communication
Hlér Kristjánsson Université de Montréal
A series of recent works has shown that placing communication channels in a coherent superposition of alternative configurations can boost their ability to transmit information. Instances of this phenomenon are the advantages arising from the use of communication devices in a superposition of alternative causal orders, and those arising from the transmission of information along a superposition of alternative trajectories. The relation among these advantages has been the subject of recent debate, with some authors claiming that the advantages of the superposition of orders could be reproduced, and even surpassed, by other forms of superpositions. To shed light on this debate, we develop a general framework of resource theories of communication. In this framework, the resources are communication devices, and the allowed operations are (a) the placement of communication devices between the communicating parties, and (b) the connection of communication devices with local devices in the parties' laboratories. The allowed operations are required to satisfy the minimal condition that they do not enable communication independently of the devices representing the initial resources. The resource-theoretic analysis reveals that the aforementioned criticisms of the superposition of causal orders were based on an uneven comparison between different types of quantum superpositions, exhibiting different operational features.
Ref. https://iopscience.iop.org/article/10.1088/1367-2630/ab8ef7
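Formally (my schematic rendering of the condition stated above, not notation from the paper), the allowed operations are supermaps \Theta acting on the channel resources, constrained so that they create no communication on their own:

C\big(\Theta(\mathcal{N}_0)\big) = 0 \quad \text{whenever} \quad C(\mathcal{N}_0) = 0,

where C denotes the capacity of the resulting sender-to-receiver channel. Comparing superpositions of causal orders and of trajectories within one such resource theory is what puts the claimed advantages on an even footing.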
Specification, Verification and Synthesis in Cyberphysical Systems
Ufuk Topcu (University of Texas at Austin)
Cyberphysical systems are roughly characterized as systems enabled by coordination between computational and physical components and resources. They appear in a vast range of applications. Most applications of cyberphysical systems are subject to strict requirements for safety, security, and privacy, to name a few. Formal methods for specification, verification, and synthesis have the potential to provide the languages, tools, and discipline necessary to meet these strict requirements. On the other hand, this potential can be realized only through proper connections between formal methods and several other fields. This tutorial will provide an overview of the challenges in the context of cyberphysical systems that may benefit, and have benefited, from formal methods. It will provide examples of problems whose solution heavily relies on formal methods: correct-by-construction synthesis of hierarchical control protocols; synthesis of strategies under limitations on information availability; and verifiability of learning-enabled components.
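As a minimal illustration of what a formal specification looks like in this setting (an invented example, not from the tutorial), a requirement such as "never collide and always eventually reach the goal region" can be written in linear temporal logic and handed to a synthesis tool:

\varphi \;=\; \Box\, \neg \mathrm{collision} \;\wedge\; \Box \Diamond\, \mathrm{goal},

where \Box reads "always" and \Diamond "eventually"; correct-by-construction synthesis then produces a controller guaranteed to satisfy \varphi against every admissible environment behaviour.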