Direct Detection of Classically Undetectable Dark Matter through Quantum Decoherence
Jess Riedel NTT Research
Although various pieces of indirect evidence about the nature of dark matter have been collected, its direct detection has eluded experimental searches despite extensive effort. If the mass of dark matter is below 1 MeV, it is essentially imperceptible to conventional detection methods because negligible energy is transferred to nuclei during collisions. Here I propose directly detecting dark matter through the quantum decoherence it causes rather than its classical effects such as recoil or ionization. I show that quantum spatial superpositions are sensitive to low-mass dark matter that is inaccessible to classical techniques. This provides new independent motivation for matter interferometry with large masses, especially on spaceborne platforms. The apparent dark matter wind we experience as the Sun travels through the Milky Way ensures interferometers and related devices are directional detectors, and so are able to provide unmistakable evidence that decoherence has galactic origins. -
The Galactic Real Estate Market: The Physics and Chemistry of Habitability
Penelope Boston National Cave and Karst Research Institute
PIRSA:13120044
Exoplanets, planets circling distant stars, are proving to be an extraordinary source of new thinking about the potential for life beyond Earth. Until recently, we have assumed that our Solar System and its planets were probably representative of such systems elsewhere. But the amazing array of very odd exoplanets being uncovered has stimulated a renaissance of thought on the subject of potential homes for life in the universe. Combined with work on extreme lifeforms here on Earth and intensive study of Mars and several other planets and moons in our system, new paradigms for life-search missions are emerging. Science fiction has long drawn on and extrapolated from science, but the cross-fertilization has gone both ways. Some of the more outrageous planets incorporated into fiction in the past may not be so outrageous after all. I will discuss what we think we know about exoplanets so far, how they are detected, how we are beginning to characterize their environments, and what this means for our search for living neighbors in our galaxy, whether they be microbes or folks we can actually chat with some day. -
Jets Without Jets
Daniele Bertolini Massachusetts Institute of Technology (MIT)
Jets are key tools for physics at the LHC. Usually, jets are identified through a jet algorithm. In this talk, I will present an alternative way of thinking about jets, by showing how a broad class of inclusive jet-based observables can be replaced by event shapes. These event shapes do not require any jet clustering, but they still implement a jet-like pT cut on "jets" with an R-like radius. I will discuss various applications, including event selection at the trigger level, event-wide trimming, and alternative definitions of boosted-object identifiers. -
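One way such a clustering-free event shape can be realized is a jet-multiplicity observable in which each particle is weighted by its share of the pT deposited within a radius R around it, and contributes only if that local pT exceeds a cut. The sketch below is an illustration of this idea, not necessarily the exact definition used in the talk; the function name and default parameters are my own.

```python
import numpy as np

def njet_event_shape(pt, eta, phi, R=0.4, pt_cut=30.0):
    """Jet-multiplicity event shape: counts jet-like energy deposits
    without running a clustering algorithm.  Each particle i is
    weighted by pt_i / pt_{i,R}, where pt_{i,R} is the summed pT
    within a cone of radius R around i, and contributes only if
    pt_{i,R} > pt_cut -- implementing a jet-like pT cut with an
    R-like radius, as described above."""
    pt, eta, phi = map(np.asarray, (pt, eta, phi))
    n = 0.0
    for i in range(len(pt)):
        dphi = np.abs(phi - phi[i])
        dphi = np.minimum(dphi, 2 * np.pi - dphi)   # wrap azimuthal angle
        dR2 = (eta - eta[i]) ** 2 + dphi ** 2
        pt_iR = pt[dR2 <= R ** 2].sum()             # local pT within radius R
        if pt_iR > pt_cut:
            n += pt[i] / pt_iR                      # weights in one cluster sum to 1
    return n
```

For an event consisting of two well-separated collimated clusters, each above the pT cut, the observable returns approximately 2, mimicking a two-jet count without ever clustering the event.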
Quantum Mechanics as Classical Physics
Charles Sebens University of Michigan–Ann Arbor
On the face of it, quantum physics is nothing like classical physics. Despite its oddity, work in the foundations of quantum theory has provided some palatable ways of understanding this strange quantum realm. Most of our best theories take that story to include the existence of a very non-classical entity: the wave function. Here I offer an alternative which combines elements of Bohmian mechanics and the many-worlds interpretation to form a theory in which there is no wave function. According to this theory, all there is at the fundamental level are particles interacting via Newtonian forces. In this sense, the theory is classical. However, it is still undeniably strange as it posits the existence of many worlds. Unlike the many worlds of the many-worlds interpretation, these worlds are fundamental, not emergent, and are interacting, not causally isolated. The theory will be presented as a fusion of the many-worlds interpretation and Bohmian mechanics, but can also be seen as a foundationally clear version of quantum hydrodynamics. A key strength of this theory is that it provides a simple and compelling story about the connection between the amplitude-squared of the wave function and probability. The theory also gives a natural explanation of the way the wave function transforms under time reversal and Galilean boosts. -
Fully exploring exotic production of the 125 GeV Higgs
Felix Yu Fermi National Accelerator Laboratory (Fermilab)
I consider the effects of exotic production modes of the 125 GeV Higgs and their impact on Higgs searches and the Higgs discovery. I emphasize that new production modes have been largely overlooked in contemporary tests of the Standard Model nature of the Higgs boson, but experimental tests of exotic production modes are viable now or will be soon. I present a couple of explicit examples of exotic production arising from chargino-neutralino associated production in the MSSM. As a corollary of this work, I point out that current Higgs coupling fits do not adequately explore the complete space of new physics deviations possible in Higgs measurements. -
Acceleration, Then and Now
Cliff Burgess McMaster University
There is good evidence that the universe underwent an epoch of accelerated expansion sometime in its very early history, and that it is entering a similar phase now. This talk is in two parts. The first part describes what I believe to be the take-home message about inflationary models, coming both from the recent Planck results and from attempts to embed inflation within a UV completion (string theory). I will argue that both point to a particularly interesting class of inflationary models that also evade many of the tuning problems of inflation. These models also turn out to make the tantalizing prediction that the tensor-to-scalar ratio, r, could be just out of reach, being predicted to be proportional to (n_s - 1)^2, where n_s ~ 0.96 is the spectral tilt of the scalar spectrum. The second part provides an update on an approach to solving the "cosmological constant problem", which asks why the vacuum energy seems to gravitate so little. This is the main theoretical obstruction to understanding the origins of the present epoch of acceleration. In the approach described - Supersymmetric Large Extra Dimensions - observations can be reconciled with a large vacuum energy because the vacuum energy curves the extra dimensions and not the ones measured in cosmology. It leads to a picture of a very supersymmetric gravity sector coupled to a completely non-supersymmetric particle-physics sector (which predicts, in particular, that no superpartners will be found at the LHC). The update presented here summarizes the underlying mechanism whereby supersymmetry in the extra dimensions acts to suppress the gravitational effects of quantum fluctuations. Because the large quantum contributions are under control, it becomes possible to estimate the size to be expected for the observed dark energy.
For the simplest configuration the result is of order C (m Mg/4 pi Mp)^4, where m is the heaviest particle on the branes (and so no smaller than the top-quark mass), Mg is the extra-dimensional gravity scale (no smaller than 10 TeV due to astrophysical constraints, implying two extra dimensions of order a micron in size) and Mp is the 4D Planck mass. C is a constant unsuppressed by symmetry-breaking effects, and C = 6 x 10^6 gives the observed dark energy density, using the smallest values given above for m and Mg. If there is time I will sketch arguments as to why there must be other light degrees of freedom in the theory as well, whose implications might ultimately be used to test the picture. -
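The quoted estimate is easy to check numerically. The sketch below evaluates C (m Mg / 4 pi Mp)^4 with the smallest values quoted in the abstract for m and Mg; the abstract does not specify which Planck mass convention is used, so the reduced 4D Planck mass (~2.4 x 10^18 GeV) is an assumption here, chosen because it reproduces the stated match to the observed dark energy density.

```python
import math

# Back-of-envelope evaluation of rho ~ C * (m * Mg / (4 pi Mp))^4, in GeV^4.
m  = 173.0    # GeV: top-quark mass, the smallest allowed value of m quoted above
Mg = 1.0e4    # GeV: 10 TeV, the smallest allowed extra-dimensional gravity scale
Mp = 2.4e18   # GeV: *assumed* to be the reduced 4D Planck mass
C  = 6.0e6    # the unsuppressed constant quoted in the abstract

rho = C * (m * Mg / (4 * math.pi * Mp)) ** 4
print(f"rho ~ {rho:.1e} GeV^4")
# The observed dark energy density is of order 2.5e-47 GeV^4, so the
# estimate lands within a factor of a few of the observed value.
```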
Cornering Gluinos at the LHC
Jared Evans Rutgers University
Gluinos are expected to be light for a natural electroweak scale, but the LHC has not seen them yet. Many possibilities have been proposed to hide natural gluinos in the LHC data, but are these methods really effective? In this talk, I will discuss the current status of kinematically accessible gluinos. By noting the most common features - MET, tops, and high multiplicity - which pervade natural gluino decays, I will argue that there are few places left to hide. I will briefly discuss the remaining weaknesses in LHC coverage and how to bolster them. -
Divergences in Spinfoam Quantum Gravity
Aldo Riello Perimeter Institute for Theoretical Physics
The most relevant evidence in favour of the Lorentzian EPRL-FK spinfoam model comes from its capability of reproducing the expected semiclassical limit in the large-spin regime. The main examples of this are the large-spin limit of the vertex amplitude, later extended to arbitrary triangulations, and that of the spinfoam graviton propagator, which was calculated on the simplest possible two-complex. These results are very promising. Nonetheless, their relevance may be endangered by the effects associated with radiative corrections. In this seminar, I will focus on the role played by the simplest diverging graph, the so-called 'melon graph', which is known to play a fundamental role in tensorial group field theories. In particular, I will discuss its most divergent part and its geometrical interpretation. I will finally comment on the result, with particular attention to its physical consequences, especially in relation to the semiclassical limit of the spinfoam graviton propagator. -
Collisions in AdS and the thermalisation of heavy-ion collisions
Wilke van der Schee European Organization for Nuclear Research (CERN)
The motivation of this seminar is to understand the thermalisation of heavy-ion collisions using AdS/CFT. These collisions can be modelled as colliding planar gravitational shock waves. This gives rise to rich and interesting dynamics: wide shocks come to a full stop and expand hydrodynamically, as was previously found by Chesler and Yaffe. High-energy collisions (corresponding to thin shocks) pass through each other, after which a plasma forms in the middle, within a proper time 1/T, with T the local temperature at that time. After this I will discuss recent results in which we studied the influence of microscopic structure in the longitudinal direction of the shock waves, and thereby found a coherent regime. This has implications both for fluctuations in nucleus-nucleus collisions and for recent proton-lead collisions at the LHC. The final part will cover a radially expanding calculation, where some simplifications allowed us to solve the model all the way to the final particle spectra, with an interesting comparison with experimental data. -
Homological Product Codes
Sergey Bravyi IBM (United States)
Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from a poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. Here we propose the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most sqrt(n) qubits, where n is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. Our proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight n^a for any a > 0. Finally, we apply the homological product to construct new small codes with low-weight stabilizers. This is joint work with Matthew Hastings. -
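The algebraic fact underlying the homological product is simple to demonstrate: given two boundary operators with ∂₁² = ∂₂² = 0 over GF(2), the product operator ∂ = ∂₁⊗I + I⊗∂₂ again squares to zero, because the cross terms appear twice and cancel mod 2, so the product defines a new chain complex (and hence a new CSS-type quantum code). The sketch below verifies this on toy matrices of my own choosing; it illustrates the construction only, not the randomized codes of the talk.

```python
import numpy as np

def homological_product(d1, d2):
    """Boundary operator of the product complex over GF(2):
    d = d1 (x) I + I (x) d2.  If d1^2 = d2^2 = 0 (mod 2), then
    d^2 = d1^2 (x) I + 2*(d1 (x) d2) + I (x) d2^2 = 0 (mod 2),
    since the cross term appears twice and vanishes mod 2."""
    I1 = np.eye(d1.shape[0], dtype=int)
    I2 = np.eye(d2.shape[0], dtype=int)
    return (np.kron(d1, I2) + np.kron(I1, d2)) % 2

# Toy nilpotent boundary operators over GF(2) (illustrative only).
d1 = np.array([[0, 1],
               [0, 0]])
d2 = np.array([[0, 1, 1],
               [0, 0, 0],
               [0, 0, 0]])
assert not (d1 @ d1 % 2).any() and not (d2 @ d2 % 2).any()

d = homological_product(d1, d2)
assert not (d @ d % 2).any()   # the product is again a complex: d^2 = 0 mod 2
```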
Insightful supersymmetry
Erich Poppitz University of Toronto
It has recently been realized that some studies of supersymmetric gauge theories, when properly interpreted, lead to insights whose importance transcends supersymmetry. I will illustrate the insightful nature of supersymmetry with two examples concerning the microscopic description of the thermal deconfinement transition, in non-supersymmetric pure Yang-Mills theory and in QCD with adjoint fermions. A host of strange "topological" molecules will be seen to be the major players in the confinement-deconfinement dynamics. Interesting connections between topology, "condensed-matter" gases of electric and magnetic charges, and attempts to interpret the divergent perturbation series will emerge. Much of the presentation will be aimed at non-experts. -
Bounding the Elliptope of Quantum Correlations & Proving Separability in Mixed States
Elie Wolfe Perimeter Institute for Theoretical Physics
We present a method for determining the maximum possible violation of any linear Bell inequality permitted by quantum mechanics. Essentially this amounts to a constrained optimization problem for an observable's eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door for an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting QM to superficially similar general probabilistic theories. We use such marginal-involving quantum bounds to estimate the volume of all possible quantum statistics in the complete 8-dimensional probability space of the Bell-CHSH scenario, measured relative to both local hidden variable models and general no-signaling theories. See arXiv:1106.2169. Time permitting, we'll also discuss how one might go about trying to prove that a given mixed state is, in fact, not entangled. (The converse problem of certifying non-zero entanglement has received extensive treatment already.) Instead of directly asking whether any separable representation exists for the state, we suggest simply checking to see if it "fits" some particular known-separable form. We demonstrate how a surprisingly valuable sufficient separability criterion follows merely from considering a highly generic separable form. The criterion we generate for diagonally symmetric mixed states appears to be completely tight, i.e. both necessary and sufficient. We use integration to quantify the "volume" of states captured by our criterion, and show that it is as large as the volume of states associated with the PPT criterion; this simultaneously proves our criterion to be necessary and the PPT criterion to be sufficient on this family of states. The utility of a sufficient separability criterion is evidenced by categorically rejecting Dicke-model superradiance for entanglement-generation schema. See arXiv:1307.5779.
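The best-known instance of such a quantum bound is the CHSH inequality, whose maximal quantum violation (Tsirelson's bound, 2*sqrt(2)) can be found as the largest eigenvalue of the Bell operator at the optimal measurement angles. The sketch below is a standard numerical illustration of that fact, not the authors' analytic method.

```python
import numpy as np

# CHSH Bell operator B = A0(B0+B1) + A1(B0-B1) for spin measurements
# in the x-z plane; its largest eigenvalue over two-qubit states is
# the maximal quantum CHSH value (classically the bound is 2).
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def spin(theta):
    """Spin observable cos(theta) Z + sin(theta) X (eigenvalues +/-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

A0, A1 = spin(0.0), spin(np.pi / 2)            # Alice measures Z and X
B0, B1 = spin(np.pi / 4), spin(-np.pi / 4)     # Bob's optimal angles
bell = np.kron(A0, B0 + B1) + np.kron(A1, B0 - B1)

chsh_max = np.linalg.eigvalsh(bell).max()
print(chsh_max)   # 2*sqrt(2) ~ 2.828, Tsirelson's bound
```

The maximizing eigenvector is the singlet-type Bell state, which is why maximal CHSH violation certifies maximal entanglement in this scenario.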