Einstein, Schrödinger, and the Birth of Wave Mechanics
Don Howard
This talk will explore an underappreciated chapter in the history of the development of wave mechanics. It focuses upon the interactions between Einstein and Schrödinger in the wake of Einstein’s promotion of Bose’s new derivation of the Planck formula for the energy distribution of black-body radiation in 1924 and Einstein’s subsequent extension of Bose’s technique to material bodies. The main claim is that what Schrödinger was really attempting to do was to find a dynamical equation that would provide a bottom-up explanation for why bosons obeyed Bose-Einstein statistics.
Lecture - Combinatorial QFT, CO 739-002
Michael Borinsky Perimeter Institute for Theoretical Physics
Building the Cathedral of Quantum Mechanics
Michel Janssen
The upheaval in quantum theory in the mid-1920s is often presented as a Kuhnian paradigm shift. The new quantum mechanics, to use a building metaphor, was erected on the ruins of the old quantum theory, brought down by an accumulation of anomalies. In our book Constructing Quantum Mechanics (2019/2023), Tony Duncan and I use a different building metaphor. As suggested by the subtitles of the two volumes of our book, The Scaffold: 1900–1923 and The Arch: 1923–1927, we see the architects of the new quantum mechanics using parts of the old quantum theory as scaffolds to build the arch of the new one. In this talk, after sketching the underlying alternative to Kuhn's account of scientific revolutions in general, I will give an overview, as non-technical as possible, of the genesis of quantum mechanics in the period 1900–1927.
Von Neumann, (re-)constructions and hidden variables
Guido Bacciagaluppi
I claim that the idea of "reconstructing" quantum mechanics along the lines of quantum probability was there from the beginning of the theory as we know it, in von Neumann's paper "Probabilistic construction of quantum mechanics". In this paper, von Neumann derives (and often introduces for the first time) a slew of now-familiar concepts in quantum mechanics, such as both pure and mixed states (respectively as wavefunctions and—generally unnormalized—density operators), the general form of the Born rule for expectation values, a rather general form of transition probabilities, and a recognisable version of the collapse postulate for the general case of maximal measurements and in a special case of non-maximal measurements. He does so axiomatically, starting from two probabilistic axioms that are neutral between classical and quantum probability and two axioms characterising quantum mechanical quantities in terms of a non-commutative algebra (that of self-adjoint operators in Hilbert space). The presentation of this material in von Neumann's 1932 book is usually understood as a "no-hidden-variables" theorem. However, the sense in which a "completion" of the theory is ruled out is merely that a theory with dispersion-free states would require a different algebraic structure. Grete Hermann pointed this out already in the 1930s (she later retracted the additional claim of "circularity"), and it is arguably also how von Neumann himself saw his result. The more fundamental disagreement between von Neumann and Hermann (and arguably between von Neumann and Bohr) turns out to be about the nature of the quantum state and the status of collapse.
Lecture - Quantum Field Theory I (Core), PHYS 601
Gang Xu Perimeter Institute for Theoretical Physics
Public Talk (Panel): 100 Years of Quantum: Perspectives on its Past, Present, and Future
PIRSA:25100159
This panel-format public talk will bring together experts in the history and philosophy of quantum theory and researchers working on various foundational issues to shed new light on the past, present, and future of the theory.
Eventbrite/PION page links to come.
Classical mechanics as the high-entropy limit of quantum mechanics
Gabriele Carcassi University of Michigan
We show that classical mechanics can be recovered as the high-entropy limit of quantum mechanics. The mathematical limit $\hbar \to 0$ can be implemented by letting the entropy of pure states decrease to minus infinity, in the same way that non-relativistic mechanics can be recovered mathematically by letting the speed of light $c$ increase to plus infinity. Physically, these limits are more appropriately understood as a high-entropy limit and a low-speed limit respectively, representing approximations that are independent of the underlying mechanism. With this approach, the classical limit is both formally and conceptually similar to the non-relativistic limit, and is independent of interpretation. It also gives an intuitive understanding of the Dirac correspondence principle: it amounts to looking for a theory with a lower entropy bound that, at high entropy, recovers classical mechanics. Given that the Moyal bracket is the unique one-parameter Lie-algebraic deformation of the Poisson bracket, quantum mechanics is the only theory that can provide such a lower bound on the entropy.
The contextual Heisenberg Microscope
Jan-Åke Larsson Linköping University
The Heisenberg microscope provides a powerful mental image of the measurement process of quantum mechanics (QM), attempting to explain the uncertainty relation through an uncontrollable back-action from the measurement device. However, Heisenberg's proposed back-action uses features that are not present in the QM description of the world, and according to Bohr not present in the world. Therefore, Bohr argues, the mental image proposed by Heisenberg should be avoided. Later developments by Bell and Kochen-Specker show that a model containing the features used for the Heisenberg microscope is in principle possible but must necessarily be nonlocal and contextual. In this paper we will re-examine the measurement process within a restriction of QM known as Stabilizer QM, which still exhibits, for example, Greenberger-Horne-Zeilinger nonlocality and Peres-Mermin contextuality. The re-examination will use a recent extension of stabilizer QM, the Contextual Ontological Model (COM), in which the system state gives a complete description of future measurement outcomes reproducing the quantum predictions, including the mentioned phenomena. We will see that the resulting contextual Heisenberg microscope back-action can be completely described within COM, and that the associated randomness originates in the initial state of the pointer system, exactly as in the original description of the Heisenberg microscope. The presence of contextuality, usually seen as prohibiting ontological models, suggests that the contextual Heisenberg microscope picture could perhaps be enabled in general QM.
Bohr and Heisenberg: Debate on the Gamma-Ray Microscope
Noemi Bolzonetti Utrecht University
On March 27th, 1927, Heisenberg published the paper in which he introduced the uncertainty relations through his well-known $\gamma$-ray microscope thought experiment. Since then, the debates and commentaries over the origin of the uncertainty relations have continued for a century (see, for instance, Bacciagaluppi and Valentini 2009; Beller 1999; Brown and Redhead 1981; Busch 1990; Hilgevoord and Uffink 1985, 2024; Jammer 1974; Uffink 1990). Going back to 1927 through a selection of letters -- including an unpublished letter from Jordan to Rosenfeld -- this talk aims to add a crucial piece to this century-old puzzle by shedding new light on the differences between Heisenberg's and Bohr's conceptual bases for the uncertainty relations. According to the received view, Bohr derives the uncertainty relations from the \textit{wave-particle duality} of light. By contrast, it is commonly stated that Heisenberg based them on the \textit{discontinuity} in the interaction between the electron and the light quantum, i.e., the uncontrollable change resulting from the Compton effect (cf. Camilleri 2009). I will challenge the received view by arguing that: (i) Bohr’s derivation of the uncertainty relations fundamentally relies on the use of \textit{classical concepts}, consistent with their central role in his broader interpretation of quantum mechanics. (ii) The supposed contrast rooted in the wave-particle duality of light may stem from Heisenberg’s 1927 misrepresentation (or misinterpretation) of Bohr's ideas on the $\gamma$-ray microscope. This misrepresentation may have influenced the reception of Bohr’s interpretation, particularly regarding his ideas on the conceptual justification for the uncertainty relations, and contributed to obscuring the deeper significance of Bohr's complementarity principle, fully comprehensible only in light of (i).
Classical Logic and Quantum Properties
Michael Miller
We articulate a collection of desiderata for an account of the dynamical quantities of a physical theory, and we present a theory that meets these desiderata in the case of quantum mechanics. Our theory retains a distinction between the values of dynamical quantities and the truth values of sentences asserting that a system has a particular value of a particular quantity. This allows our theory to incorporate the phenomenon of quantum indeterminacy as a pattern in the properties instantiated by a system in a particular quantum state, while also retaining the semantics of classical logic. We contrast our theory with quantum logic, which flattens the distinction between quantity values and truth values. We discuss the historical origin of the assumptions leading to this feature of quantum logic, and we address a series of objections that have been raised to previous attempts to reconcile quantum indeterminacy with classical logic.
Compact Object Astrophysics in the Multi-messenger Era
Claire Ye
We are now in the era of multi-messenger astronomy, where neutron stars and black holes—the most extreme objects in the Universe—can be studied through both electromagnetic signals and gravitational waves. These compact remnants of massive stars provide unique windows into the short lives and deaths of their progenitors and the history of star formation across cosmic time. In this talk, I will show how we can apply cutting-edge computational models to uncover the origins of black holes and neutron stars and to address outstanding puzzles, including how and where the gravitational wave mergers detected by LIGO/Virgo/KAGRA are formed.