An information-based approach to quantum mechanics: John A. Wheeler at the Center for Theoretical Physics, UT Austin (1976-1986)
Silvia Castillo Vergara IHPST, University of Toronto
At the 1981 Physics of Computation Conference, held at MIT’s Endicott House, **John A. Wheeler** presented a paper entitled **“The Computer and the Universe”** where he put forward the idea of one day understanding ‘physics as information’ (Wheeler 1982). By the early 1980s, it was already well established that information is inseparably tied to physical degrees of freedom and therefore subject to physical law. Yet, several conference presentations advanced the counterpart idea: that physical laws themselves can be viewed as algorithms for information processing, and that their ultimate form must respect the limits of physically executable computation (see Landauer 1982; Fredkin 1982; Kantor 1982; Zuse 1982).

After a distinguished career at Princeton, Wheeler joined the University of Texas at Austin in 1976, where he founded the Center for Theoretical Physics. His time there offers a perspective on how the informational turn, highlighted at the 1981 conference, influenced the trajectory of quantum mechanics research in the latter half of the twentieth century. During his time at Texas, Wheeler became intensely focused on the question, **“How come the quantum?”** He sought a deeper principle from which the quantum formalism could be derived. In this search, quantum theory became increasingly intertwined with computing and information theory. As director, he assembled a diverse group of graduate students, postdoctoral fellows, and visiting researchers, creating a dynamic research environment to help tackle this question. The group engaged deeply with emerging developments in computer science—reading widely on parallel computing, Turing machines, cellular automata, and cybernetics—and often held informal seminars where these topics converged with foundational issues in physics, particularly debates on the many-worlds interpretation (Everett, 1957).

For Wheeler, the informational turn culminated in his proposal of **“it from bit,”** the idea that every physical entity—every “it”—derives its meaning from fundamental units of information, or “bits” (Wheeler, 1989). For many of the researchers who passed through Austin, this perspective translated into concrete efforts that laid the conceptual and technical groundwork for what would later be recognized as quantum information theory, quantum computation, and novel approaches to the foundations of quantum mechanics. This influence is particularly evident in the work of **David Deutsch** (quantum Turing machines), **William Wootters** (quantum distinguishability and teleportation), **Wojciech Zurek** (decoherence), and **Benjamin Schumacher** (quantum coding), with Wootters and Schumacher together introducing the concept of the qubit. The decade Wheeler spent at Austin provides a fresh perspective on the expanding role of computation and information theory in shaping modern physics in the late twentieth century. Computers were not merely pragmatic tools; they became conceptual models and heuristic devices that helped steer the direction of research in quantum physics.
The Planimeter and Contact Transformation: A Perfect Embodiment of the Weyl-Heisenberg Group and Canonical Transformation's Lost Twin Sister
Christopher Jackson Perimeter Institute for Theoretical Physics
Once Heisenberg unlocked the Bohr frequency condition and the canonical commutation relation came out, Quantum Mechanics hit the ground running. In the rush of it all, it’s not clear to me who knew then (and who knows now) that the non-commutativity of phase space displacement is in fact an idea that has been around for at least as long as Jacobi, the father of canonical transformation theory. First conceived of in 1818 and then patented in 1854, the Amsler planimeter is a measuring instrument, known even to Maxwell, that in fact operates on exactly the same commutation relation, hiding in plain sight. Meanwhile, Lie's 1880 theory of transformation groups was also founded on exactly the same structure, what he called the contact element. Who knew? Why aren’t more physicists aware of this? Join me as we explore Poincaré’s sudden death, the origins of the Gruppenpest, and Hilbert’s declining health when physicists like Wigner needed him more than ever.
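For orientation, here is a minimal sketch (standard Weyl–Heisenberg conventions assumed, not taken from the talk itself) of the commutation structure in question. With $[\hat q,\hat p]=i\hbar$, the phase-space displacement $D(a,b)=\exp[i(b\hat q-a\hat p)/\hbar]$ composes as

$$D(a_1,b_1)\,D(a_2,b_2)=e^{\,i(a_2 b_1-a_1 b_2)/2\hbar}\;D(a_1+a_2,\,b_1+b_2),$$

so two displacements commute only up to a phase fixed by the symplectic area they span, and an enclosed area is precisely the quantity a planimeter integrates as its arm traces a closed curve.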
Heisenberg’s Engagement with Plato and Malebranche on the path to 1925 Matrix Mechanics
Christoph Gallus THM (Technische Hochschule Mittelhessen), JLU (Justus-Liebig Universität)
Heisenberg’s 1925 Matrizenmechanik (matrix mechanics) can be seen as a radical paradigm shift. While established scientists such as Bohr and Sommerfeld worked within the paradigm that electrons must be in a definite position at a definite time, the young Heisenberg abandoned these notions for electrons inside the atom. He wrote:

> *“… in this situation it seems more advisable to give up completely any hope of an observation of hitherto unobservable quantities (such as position, orbital time of the electron) …”*
>
> *“… it was not possible to assign a point in space (as a function of time) to an electron by means of observable quantities …”*

(W. Heisenberg, Über quantentheoretische Umdeutung kinematischer und mechanischer Beziehungen, Zeitschrift für Physik, 33, pp. 880–881; received 29 July 1925.)

Even radical ideas rarely occur in isolation. It is argued that Heisenberg’s early exposure to Plato’s philosophy and his discussions with friends on philosophical themes shaped by the Judeo-Christian tradition created fertile ground for his bold 1925 insight. The value of classical education was deeply rooted in his family: his father, a professor of Greek Philology, and his maternal grandfather, Nikolaus Wecklein—a noted scholar of Greek Sophists—ensured that classical education was a central pillar of his upbringing. Growing up in Würzburg and Munich, he read Latin and Greek, studied Kant’s Critique of Pure Reason at 16, and read Plato’s *Phaidon* and *Apology* in the original language. At 17, Plato’s *Timaios* introduced him to Greek atomist theory from a primary source. Lesser known is his exposure to the French philosopher Malebranche, acknowledged when he later wrote:

> “Robert’s remark about Malebranche had made it clear to me that experience about atoms can only be of a rather indirect kind and that atoms are likely not things.”

(W. Heisenberg, Der Teil und das Ganze: Gespräche im Umkreis der Atomphysik, p. 21.)

These two streams of thought may have aided Heisenberg's 1925 discovery because

1. they avoid a framework presupposing the independent existence of objects with definite positions at definite times, and
2. they offer a philosophy in which mathematics and abstract ideas can provide genuine insights about the world.
Delimiting Limits: Quantum-Classical Relations, hbar, and Decoherence
Alexander Franklin King's College London
That there’s a sense in which classical physics reduces to and emerges from quantum physics is relatively uncontroversial, but this talk will introduce much-needed clarity on the role of the $\hbar$ limit for such inter-theoretic relations. The talk divides into two parts: I'll consider how classical physics emerges from quantum physics via decoherence, and I'll articulate the role for the $\hbar\rightarrow0$ limit.

First, I'll argue that decoherence provides the unique and interpretation-neutral way to understand the emergence of classical from quantum physics. I'll do this by reference to an account of emergence, and I'll argue that this account leaves interpretations with the more metaphysical task of articulating how ontology supervenes upon the decohered quantum description. Decoherence entails a form of dynamical independence between the branches of a superposition in the relevant basis to a very good degree of approximation. This means that the evolution in any such branch is screened off from the other branches and from the peculiarly quantum effects. The upshot of this is that in specific contexts classical dynamics are instantiated in fundamentally quantum systems. I'll claim that decoherence is thus responsible for the emergence of classical behaviour.

Second, I'll consider the role of the $\hbar$ limit. I’ll mention the history going back to Bohr and Dirac (see Bokulich (2008)) but focus on more recent analyses, especially Feintzeig’s work (though see also Landsman (2017)). Feintzeig’s rigorous treatise establishes a relation between the algebras of classical and quantum mechanics: he demonstrates that as $\hbar$ goes to zero, non-commuting observables commute. He argues that this is a reduction of classical by quantum mechanics, in the sense that it is “an explanation of the success of classical mechanics on the basis of quantum mechanics”. However, I’ll demonstrate that this limit is neither necessary nor sufficient for such explanations of classicality: first, there are circumstances where classicality is instantiated but $\hbar$ is not small relative to the action; second, there are circumstances where $\hbar$ is small relative to the action but classical behaviour is not instantiated. I discuss alternative accounts of what the $\hbar$ limit achieves and suggest that it may provide an empirical grounding for quantum mechanics, in the sense that it tells us how to understand the theory, rather than informing us of the relation between the theory and its predecessors.

I’ll conclude with some more general reasons why the distinction between the roles of decoherence and the $\hbar$ limit should be expected: limits are rather blunt tools in that they don’t provide the context-specificity that one finds in decoherence theory. Given that classical mechanics is only instantiated in certain contexts rather than brutely at large energy or timescales, the explanation of its success ought to depend on something more subtle than a limiting procedure.

Bokulich, Alisa (2008). Reexamining the Quantum-Classical Relation: Beyond Reductionism and Pluralism. Cambridge University Press.
Feintzeig, Benjamin H. (2022). The Classical–Quantum Correspondence. Cambridge University Press.
Landsman, Klaas (2017). Foundations of Quantum Theory: From Classical Concepts to Operator Algebras. Springer Open.
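For reference, a minimal sketch (standard deformation-quantization notation, not drawn from the talk) of the sense in which non-commuting observables commute in the limit: composing phase-space functions with the Moyal star product gives

$$f\star g = fg+\frac{i\hbar}{2}\{f,g\}_{\mathrm{PB}}+O(\hbar^{2}),\qquad \frac{1}{i\hbar}\,(f\star g-g\star f)\;\xrightarrow{\;\hbar\to 0\;}\;\{f,g\}_{\mathrm{PB}},$$

so commutators are $O(\hbar)$ and the classical Poisson algebra is recovered as $\hbar\rightarrow0$.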
Who Is Integrable? The Landscape of Integrable Spin Chains
Ana Retore Deutsches Elektronen-Synchrotron DESY
Decoherent Histories Contextuality
Thomas De Saegher University of Western Ontario
A consistent set of histories for a system is a set of histories for which the probability assigned by a given quantum state of the system to a sum of histories (history operators) is equal to the sum of the probabilities assigned to the individual histories (each operator) by that state. Abstract consistency alone admits multiple incompatible consistent sets of histories for the same initial state and Hamiltonian, where ‘incompatibility’ in this context means that the sets support conflicting probabilistic inferences. However, this predicament is also faced by any unitary quantum theory describing decoherence with respect to some system/environment split, and not just by the imposition of abstract consistency in isolation. While decoherence always occurs in an approximately unique basis, the final global state resulting from this decoherent evolution can always be rewritten *as if it had* branched along some other, incompatible, consistent set of histories. A quasi-classical realm is a set of decoherent histories where the initial state remains sharply peaked around one history in which dynamical variables approximately follow their classical equations of motion. The requirement of ‘quasi-classicality’ does not necessarily select a unique set, or a set of decoherent histories recorded in the environment with the possibility of being measured.

Why, then, do measurers of different subsystems of the universe in a final state, which could be expressed in terms of different sets of branching histories corresponding to incompatible quasi-classical realms, always agree on the realm that they are in? I will argue that they agree only because one of three scenarios always occurs: 1) they measure subregions of space on roughly the same length-scale and the set recorded at a specified length-scale is unique; 2) the records of measurement outcomes for different quasi-classical realms occur on different pure states in some global mixture of the universe; or 3) they measure subregions of an environment in a mixed state that fails to record one quasi-classical realm over another.

The decoherent histories formalism seems to describe Bell-Kochen-Specker measurement contextuality persisting into the classical limit: the formalism allows that the probability of an outcome common to two different measurements, conditional on the measurement choice and an ontic state, can differ even for measurements of *pointer observables* (such as dial position). However, we never experience this contextuality, and I explain why by arguing that there will always be agreement on the quasi-classical realm between different observers sampling regions of the environment of roughly the same size.
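For readers who want the condition spelled out, the standard Gell-Mann–Hartle formulation (assumed here; the abstract states it only in words) assigns a history $\alpha$ the class operator $C_\alpha=P^{(n)}_{\alpha_n}(t_n)\cdots P^{(1)}_{\alpha_1}(t_1)$ and the probability $p(\alpha)=\mathrm{Tr}\!\left[C_\alpha\,\rho\,C_\alpha^{\dagger}\right]$; the set is consistent when

$$\mathrm{Re}\,\mathrm{Tr}\!\left[C_\alpha\,\rho\,C_\beta^{\dagger}\right]\approx0\quad\text{for }\alpha\neq\beta,$$

which is exactly the requirement that the probability of a coarse-grained history equal the sum of the probabilities of the histories it contains.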
Wavefunction branches demand a definition!
Jess Riedel NTT Research
Under unitary evolution, a typical macroscopic quantum system is thought to develop wavefunction branches: a time-dependent decomposition into orthogonal components that (1) form a tree structure forward in time, (2) are approximate eigenstates of quasiclassical macroscopic observables, and (3) exhibit effective collapse of feasibly measurable observables. If they could be defined precisely, wavefunction branches would extend the theory of decoherence beyond the system-environment paradigm and could supplant anthropocentric measurement in the quantum axioms. Furthermore, when such branches have bounded entanglement and can be effectively identified numerically, sampling them would allow asymptotically efficient classical simulation of quantum systems. I consider a promising recent approach to formalizing branches on the lattice by Taylor & McCulloch [Quantum 9, 1670 (2025), arXiv:2308.04494], and compare it to prior work from Weingarten [Found. Phys. 52, 45 (2022), arXiv:2105.04545]. Both proposals are based on quantum complexity and argue that, once created, branches persist for long times due to the generic linear growth of state complexity. Taylor & McCulloch characterize branches by a large difference in the unitary complexity necessary to interfere vs. distinguish them. Weingarten takes branches as the components of the decomposition that minimizes a weighted sum of expected squared complexity and the Shannon entropy of squared norms. I discuss strengths and weaknesses of these approaches, and identify tractable open questions.
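As a rough schematic of the two proposals being compared (my paraphrase, with the relative weight $b>0$ an assumption of this sketch rather than either paper's notation): for a candidate decomposition $|\psi\rangle=\sum_i c_i|\phi_i\rangle$ into orthonormal components with $p_i=|c_i|^2$, Weingarten selects the decomposition minimizing

$$Q=\sum_i p_i\,\mathcal{C}(|\phi_i\rangle)^{2}\;+\;b\sum_i p_i\ln\frac{1}{p_i},$$

with $\mathcal{C}$ a state-complexity measure, whereas Taylor & McCulloch instead require that the unitary complexity needed to interfere two components greatly exceed the complexity needed to distinguish them.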
Decoherence: Out with States, In with Causation
Nicholas Ormrod Perimeter Institute for Theoretical Physics
I introduce a modern perspective on decoherence informed by quantum causal modelling, situate it in its historical context, and show how it resolves two long-standing problems in traditional approaches. Decoherence is often told as a story about states, focused on the suppression of off-diagonal terms in a density matrix through correlation with an environment. Yet this sits uneasily with a key observation already in Zurek (1981): the unitary dynamics alone determine the preferred basis. Recent advances in quantum causal modelling enable a genuinely dynamics-first account: define decoherence directly in terms of causal influence, formalized as noncommutation relations that specify which generators can affect which observables. On this view, diagonal density matrices, correlations, and other state-level features are mere symptoms of decoherence, while decoherence itself is a property of the unitary dynamics. A major payoff of this causal account is that, rather than designating one piece of the universe as the “system” and the rest as the “environment,” one can treat decoherence more democratically, allowing different systems to serve as environments for each other. In turn, this democratic perspective yields a unique consistent set of histories for any subset of unitarily interacting subsystems, thus addressing the critique of consistent histories by Dowker and Kent (1994) by separating physically meaningful histories from uninformative ones.
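As a rough illustration of what 'causal influence formalized as noncommutation' can look like (a standard quantum-causal-modelling condition, not necessarily the talk's exact formulation): under a unitary $U$ acting on subsystems $A$ and $B$, $A$ exerts no causal influence on $B$ when every Heisenberg-evolved observable of $B$ commutes with everything localized on $A$,

$$\left[\,U^{\dagger}(\mathbb{1}_A\otimes O_B)\,U,\;X_A\otimes\mathbb{1}_B\,\right]=0\quad\text{for all }O_B,\,X_A,$$

and decoherence can then be characterized by which generators satisfy, or fail, such commutation conditions, independently of any particular state.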
The Quality/Cosmology Tension for a Post-Inflation QCD Axion
The QCD axion is not only a leading contender as a solution to the Strong CP problem, it is also a natural dark matter candidate. In particular, the post-inflationary axion is attractive because of the possibility of a unique prediction of the axion mass if the axion makes up all of dark matter. On the other hand, QCD axion models often suffer from the so-called axion quality problem, where the axion shift symmetry is unprotected from quantum gravity effects. In this talk I will discuss the tension between solutions to the quality problem and viable cosmology for post-inflationary QCD axions. I will start with a simple Z_N solution as an illustrative example of the close connection between the quality problem and the domain wall problem. I will then analyze proposals in the literature that involve more complex symmetry structure and show that they share a set of cosmological issues. Our study suggests that a viable post-inflationary QCD axion model is likely to have a non-standard cosmological history, which could significantly impact the prediction of the "correct" axion mass.
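As a rough reminder of why quality is so constraining (a generic order-of-magnitude estimate, not taken from the talk): a Planck-suppressed, PQ-violating operator of dimension $n$ contributes

$$\Delta V\sim\frac{f_a^{\,n}}{M_{\mathrm{Pl}}^{\,n-4}}\cos\!\left(n\,\frac{a}{f_a}+\delta\right),$$

which tilts the QCD-generated potential $\sim\Lambda_{\mathrm{QCD}}^{4}\left[1-\cos(a/f_a)\right]$ and displaces its minimum by $\Delta\bar\theta\sim f_a^{\,n}/(M_{\mathrm{Pl}}^{\,n-4}\Lambda_{\mathrm{QCD}}^{4})$, which must stay below roughly $10^{-10}$; the extra structure invoked to protect the shift symmetry to that accuracy is what then has to be confronted with domain-wall and other cosmological constraints.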
Trinity: the Dark Matter Halo—Galaxy—Supermassive Black Hole (SMBH) Connection from z=0-10
Haowen Zhang
Supermassive black holes (SMBHs) exist in many galaxies. Their growth is accompanied by strong energy output, which is capable of regulating host galaxy evolution. Understanding SMBH growth is thus critical for studying galaxy formation and evolution. However, it has been difficult to quantify SMBH growth in different galaxies and cosmic epochs. In this talk, I will present Trinity, an empirical technique to determine the typical SMBH mass and growth rate in different galaxies and dark matter halos from z=0-10. I will discuss how the galaxy—SMBH connection from Trinity will help observational astronomers extract more information from data and help theoretical astronomers create better simulations of galaxy evolution. In addition, I will give an overview of the Trinity predictions that match the latest JWST data, as well as those that do not match observations. Finally, I will talk about how observations with next-generation telescopes will enable a better understanding of the galaxy—SMBH connection, and how Trinity will be helpful for quantifying the constraining power of future observations.
Lecture - Combinatorial QFT, CO 739-002
Michael Borinsky Perimeter Institute for Theoretical Physics
50 years of Black Hole evaporation
Bill Unruh"In 1974, Stephen Hawking predicted that a black hole, formed by the collapse of matter like a star, should not be black, as seemed to be the prediction since the surface is an outward going null ""shell"" from which nothing can escape, but rather should emit a thermal bath of radiation with a temperature inversely proportional to the mass of the black hole. But where does this radiation come from. It cannot be from the inside of the black hole, since it would have to travel faster than light to do so? At almost the same time I predicted that in the vacuum in flat spacetime, an accelerated ""detector"" (atom,photon counter, geiger counter,...) should respond as if surrounded by a thermal bath whose temperature was proportional to its acceleration. This turned out to be closely related to Hawking's result. I will present a very personal history of the past 50 years as we have tried to understant this quantum phenomenon. What is the orgin of Black hole themodynamics? 50 years later this is still one of the big quetions in the overlap between quantum field theory and gravity."