Wavefunction branches demand a definition!
Jess Riedel NTT Research
Under unitary evolution, a typical macroscopic quantum system is thought to develop wavefunction branches: a time-dependent decomposition into orthogonal components that (1) form a tree structure forward in time, (2) are approximate eigenstates of quasiclassical macroscopic observables, and (3) exhibit effective collapse of feasibly measurable observables. If they could be defined precisely, wavefunction branches would extend the theory of decoherence beyond the system-environment paradigm and could supplant anthropocentric measurement in the quantum axioms. Furthermore, when such branches have bounded entanglement and can be effectively identified numerically, sampling them would allow asymptotically efficient classical simulation of quantum systems. I consider a promising recent approach to formalizing branches on the lattice by Taylor & McCulloch [Quantum 9, 1670 (2025), arXiv:2308.04494], and compare it to prior work from Weingarten [Found. Phys. 52, 45 (2022), arXiv:2105.04545]. Both proposals are based on quantum complexity and argue that, once created, branches persist for long times due to the generic linear growth of state complexity. Taylor & McCulloch characterize branches by a large difference in the unitary complexity necessary to interfere vs. distinguish them. Weingarten takes branches as the components of the decomposition that minimizes a weighted sum of expected squared complexity and the Shannon entropy of squared norms. I discuss strengths and weaknesses of these approaches, and identify tractable open questions.
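To make the two criteria just summarized concrete, here is a minimal sketch in my own notation (an illustration only, not the exact definitions of the cited papers). Write a candidate branch decomposition as $|\Psi\rangle = \sum_i |\Psi_i\rangle$ with mutually orthogonal components and squared norms $p_i = \langle\Psi_i|\Psi_i\rangle$. Weingarten's proposal selects the decomposition minimizing a weighted objective of the schematic form
\[ Q(\{\Psi_i\}) = \sum_i p_i\, \mathcal{C}\!\big(\Psi_i/\sqrt{p_i}\big)^2 \;+\; b\Big(-\sum_i p_i \log p_i\Big), \]
where $\mathcal{C}$ is a state-complexity measure and $b$ a fixed weight, while Taylor & McCulloch instead require that the unitary complexity needed to interfere two components be much larger than that needed to distinguish them, schematically $\mathcal{C}_{\mathrm{interfere}}(\Psi_i,\Psi_j) \gg \mathcal{C}_{\mathrm{distinguish}}(\Psi_i,\Psi_j)$.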
Decoherence: Out with States, In with Causation
Nicholas Ormrod Perimeter Institute for Theoretical Physics
I introduce a modern perspective on decoherence informed by quantum causal modelling, situate it in its historical context, and show how it resolves two long-standing problems in traditional approaches. Decoherence is often told as a story about states, focused on the suppression of off-diagonal terms in a density matrix through correlation with an environment. Yet this sits uneasily with a key observation already in Zurek (1981): the unitary dynamics alone determine the preferred basis. Recent advances in quantum causal modelling enable a genuinely dynamics-first account: decoherence is defined directly in terms of causal influence, formalized as noncommutation relations that specify which generators can affect which observables. On this view, diagonal density matrices, correlations, and other state-level features are mere symptoms of decoherence, while decoherence itself is a property of the unitary dynamics. A major payoff of this causal account is that, rather than designating one piece of the universe as the “system” and the rest as the “environment,” one can treat decoherence more democratically, allowing different systems to serve as environments for each other. In turn, this democratic perspective yields a unique consistent set of histories for any subset of unitarily interacting subsystems, thus addressing the critique of consistent histories by Dowker and Kent (1994) by separating physically meaningful histories from uninformative ones.
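As a hedged illustration of the noncommutation criterion mentioned above (my notation, borrowed from standard quantum causal modelling; the talk's precise definitions may differ): for a unitary $U$ acting jointly on subsystems $A$ and $B$, $A$ exerts no causal influence on $B$ exactly when every Heisenberg-evolved observable of $B$ commutes with every operator localized on $A$,
\[ \big[\, U^\dagger (\mathbb{1}_A \otimes Y_B)\, U,\; X_A \otimes \mathbb{1}_B \,\big] = 0 \quad \text{for all } X_A,\ Y_B. \]
Causal influence, and with it decoherence on this account, then appears as the failure of such commutation relations, a property of $U$ itself rather than of any particular state.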
The Quality/Cosmology Tension for a Post-Inflation QCD Axion
The QCD axion is not only a leading contender as a solution to the Strong CP problem; it is also a natural dark matter candidate. The post-inflationary axion is particularly attractive because it offers the possibility of a unique prediction of the axion mass if the axion makes up all of dark matter. On the other hand, QCD axion models often suffer from the so-called axion quality problem, where the axion shift symmetry is unprotected from quantum gravity effects. In this talk I will discuss the tension between solutions to the quality problem and viable cosmology for post-inflationary QCD axions. I will start with a simple Z_N solution as an illustrative example of the close connection between the quality problem and the domain wall problem. I will then analyze proposals in the literature that involve more complex symmetry structure and show that they share a set of cosmological issues. Our study suggests that a viable post-inflationary QCD axion model is likely to have a non-standard cosmological history, which could significantly impact the prediction of the "correct" axion mass.
Trinity: the Dark Matter Halo—Galaxy—Supermassive Black Hole (SMBH) Connection from z=0-10
Haowen Zhang
Supermassive black holes (SMBHs) exist in many galaxies. Their growth is accompanied by strong energy output, which is capable of regulating host galaxy evolution. Understanding SMBH growth is thus critical for studying galaxy formation and evolution. However, it has been difficult to quantify SMBH growth in different galaxies and cosmic epochs. In this talk, I will present Trinity, an empirical technique to determine the typical SMBH mass and growth rate in different galaxies and dark matter halos from z=0-10. I will discuss how the galaxy—SMBH connection from Trinity will help observational astronomers extract more information from data, as well as theoretical astronomers create better simulations of galaxy evolution. In addition, I will give an overview of the Trinity predictions that match the latest JWST data, as well as those that do not match observations. Finally, I will talk about how observations with next-generation telescopes will enable a better understanding of the galaxy—SMBH connection, and how Trinity will be helpful for quantifying the constraining power of future observations.
Lecture - Combinatorial QFT, CO 739-002
Michael Borinsky Perimeter Institute for Theoretical Physics
50 years of Black Hole evaporation
Bill Unruh"In 1974, Stephen Hawking predicted that a black hole, formed by the collapse of matter like a star, should not be black, as seemed to be the prediction since the surface is an outward going null ""shell"" from which nothing can escape, but rather should emit a thermal bath of radiation with a temperature inversely proportional to the mass of the black hole. But where does this radiation come from. It cannot be from the inside of the black hole, since it would have to travel faster than light to do so? At almost the same time I predicted that in the vacuum in flat spacetime, an accelerated ""detector"" (atom,photon counter, geiger counter,...) should respond as if surrounded by a thermal bath whose temperature was proportional to its acceleration. This turned out to be closely related to Hawking's result. I will present a very personal history of the past 50 years as we have tried to understant this quantum phenomenon. What is the orgin of Black hole themodynamics? 50 years later this is still one of the big quetions in the overlap between quantum field theory and gravity."Lecture - Quantum Field Theory I (Core), PHYS 601
Gang Xu Perimeter Institute for Theoretical Physics
Varieties of Rigour: Charting the Landscape of Reformulation Programmes in 1950s Quantum Field Theory
James Fraser
Historians and philosophers of quantum field theory (QFT) face a distinctive challenge: the existence of multiple, mathematically and conceptually divergent formulations of the theory. Discussions of this issue in the extant literature often contrast the mainstream perturbative formalism (empirically powerful but mathematically dubious) with axiomatic QFT (mathematically rigorous but largely detached from empirical predictions). This paper complicates this dichotomy by drawing attention to a parallel tradition, beginning in the 1950s, that sought to render perturbative QFT itself more rigorous. One way to think about this is that there is considerable diversity even within the “mathematical QFT” camp. In order to account for this, I suggest that the axiomatic QFT and causal perturbation theory traditions can be understood as adopting distinct ideals of mathematical rigour: a global and a local conception, respectively.
Wigner's Friends and Relations
Matt Leifer"Wigner's Friend is a variant of the Schrödinger's cat experiment in which the cat is replaced by a Friend, the difference being that the Friend is unambiguously a conscious agent who experiences definite measurement outcomes. It has received renewed attention in recent years due to the development of several Extended Wigner's Friend (EWF) arguments, aimed at ruling out certain kinds of Copenhagenish interpretations. In this talk, I will discuss the origins of Wigner's Friend in von Neumann and Wigner's discussions of what we now call the Orthodox interpretation of quantum mechanics. I will then discuss how the argument was taken up by Everett and his followers, generalized by Deutsch, and then more recently brought into the Copenhagenish context. Along the way, I emphasize how each variant of the argument has different implications for the Orthodox, Everett and Copenhagenish interpretations, concluding that all three interpretations require a theory of quantum mechanical agents in order to be complete. I outline some first steps towards such a theory."The Conceptual Development of Early Quantum Theory
A. Douglas Stone
I will review conceptual advances which paved the way for the emergence of the mature form of quantum theory (quantum mechanics) in 1925-27, focusing on the contributions of Albert Einstein [1,2]. I argue that Einstein’s 1905 paper on light quanta was motivated by his firm belief that equipartition of energy was inescapable within classical statistical mechanics. Moreover, his rejection of the ether in his work on Special Relativity freed physicists to accept the possibility of wavelike phenomena not supported by a medium. Einstein, in his 1907 paper on the specific heat of solids, became the first important physicist to embrace clearly the quantization of energy, strengthening his conclusion that a radical revolution in all of physics (not just electromagnetism) was upon us. In 1909 Einstein was able to derive the first rigorous result in quantum statistical mechanics, his energy/momentum fluctuation formula, which strongly supported the necessity of wave/particle duality in the new physics. Bohr in 1913 was able to explain the hydrogen spectrum in terms of quantization of electron orbits, but at the expense of accepting the possibility of accelerating charges which do not radiate, a result that Einstein found puzzling. In 1916-17 Einstein introduced fundamental randomness into quantum theory via the hypothesis of spontaneous emission. Shortly thereafter he generalized the Bohr-Sommerfeld quantization rules by putting them in a topological form, but noticed that they didn’t seem to work for what we now call chaotic systems [3]. When Heisenberg eventually replaced the semiclassical quantization rules with a discrete generalization in matrix mechanics in 1925, he unwittingly escaped this limitation, something which does not appear to have been appreciated in the historical literature. Independently of these developments, De Broglie and Bose inspired Einstein in 1925 to develop a quantum statistical mechanics of indistinguishable atoms with wave-particle duality. This work directly motivated Schrödinger to investigate a wave equation describing electrons and eventually to discover his equation for the complex wave function of the electron at the beginning of 1926, which resolved the puzzle of non-radiating charges in hydrogen.
Hertha Sponer, Maven of Quantum Spectroscopy
Elise Crull
As is well known to historians, in the early days of quantum theory James Franck frequently reported fresh experimental results to Bohr and others, on the basis of which major theoretical advances were made. These data – and indeed, the design and execution of the ground-breaking experimental work whence they came – were not, contrary to the usual assumptions, due to Franck himself. This work was done by Hertha Sponer; she was the one running the spectroscopy labs in Göttingen in these years, as well as teaching the main physics seminars in these areas. It was her experimental prowess that enabled significant insights into this new theory, and her expert instruction that guided and inspired a new generation of quantum physicists in Göttingen and beyond. Yet Sponer’s name has been nearly completely erased from this history. If she is mentioned at all it is usually as Franck’s “student” or “assistant” (of which she was neither – she was his academic equal) or as Franck’s second wife (as she was, but which entirely disregards her international standing as a scientist in her own right). Extant accounts of Sponer’s life and work are few and exclusively concern her post-WWII years as a professor of physics at Duke. But Sponer was no longer working at the cutting edge of quantum theory in these decades, and so her role in that field’s development is left largely ignored. This talk reintroduces Sponer to the early history of quantum physics in her (arguably) rightful role: maven of quantum spectroscopy. I shall present two cases where I believe she earns this title: her early understanding and experimental confirmation of electron waves, and, together with Franck, the first use of quantum tunneling to interpret hitherto unexplained molecular phenomena.
Lecture - Statistical Physics (Core), PHYS 602
Naren Manjunath