DARWIN - A next-generation observatory for dark matter and neutrino physics
Laura Baudis
Two of the outstanding open questions in physics are the nature of dark matter and the fundamental nature of neutrinos. DARWIN is a next-generation experiment aiming to reach a dark matter sensitivity limited by the irreducible neutrino backgrounds. The core of the detector will have a 40 ton liquid xenon target operated as a dual-phase time projection chamber. The unprecedentedly large xenon mass, the exquisitely low radioactive background and the low energy threshold will allow for a diversification of the physics program beyond the search for dark matter particles: DARWIN will be a true low-background, low-threshold astroparticle physics observatory. I will present the status of the project and its science reach, and discuss the main R&D topics.
-
A new limit on 0νββ-decay of 100Mo with scintillating calorimeters from the CUPID-Mo experiment
Benjamin Schmidt
The CUPID-Mo experiment, currently taking data at the Laboratoire Souterrain de Modane (France), is a demonstrator for CUPID, the next-generation upgrade of the first ton-scale cryogenic 0νββ search, CUORE. The experiment is searching for 0νββ decay of 100Mo with an array of 20 enriched ~0.2 kg Li2MoO4 crystals. The detectors are operated deep under the Frejus mountain at a depth of 4800 m.w.e. in a dilution refrigerator at ~20 mK. They are complemented by cryogenic Ge light detectors that allow us to distinguish alpha from beta/gamma events through the detection of both heat and scintillation light signals. With a bolometric performance of ~7 keV energy resolution (FWHM) at 2615 keV, full alpha-to-beta/gamma separation and excellent radio-purity levels, we operate in the background-free regime. For the present analysis, we consider more than one year of data acquired between March 2019 and April 2020. With 2.17 kg × yr of exposure and a high analysis efficiency of ~90%, we are able to set a new world-leading limit for 0νββ decay of 100Mo. In this seminar, I will present the details of the analysis and the new limit of T1/2 > 1.4 × 10^24 yr at 90% c.i., and I will conclude with an outlook on the data taken up to the end of CUPID-Mo operations in July 2020 and further upcoming analyses.
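As a back-of-envelope illustration, the quoted exposure, efficiency, and limit combine through the standard counting formula T1/2 > ln(2) · ε · N·t / S. The molar mass and the ~3-count 90% upper limit below are illustrative assumptions, not values from the experiment's analysis; the sketch only reproduces the order of magnitude of the quoted limit.

```python
import numpy as np

# Back-of-envelope 0vbb half-life limit: T_1/2 > ln(2) * eff * N*t / S_up,
# where N*t counts candidate nuclei times live time, eff is the analysis
# efficiency, and S_up is the 90% upper limit on the number of signal counts.
N_A = 6.022e23        # Avogadro's number [1/mol]
molar_mass = 177.8    # g/mol for Li2MoO4 with enriched 100Mo (assumed value)
exposure = 2.17       # kg*yr of Li2MoO4 (from the abstract)
eff = 0.90            # analysis efficiency (from the abstract)
S_up = 3.0            # assumed 90% upper limit on signal counts (background-free)

Nt = exposure * 1e3 / molar_mass * N_A    # nuclei * yr (one Mo per formula unit)
T_half = np.log(2) * eff * Nt / S_up
print(f"T_1/2 > {T_half:.2e} yr")         # same order as the quoted 1.4e24 yr
```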
-
An update on SNO+
Jeanne Wilson
SNO+ is a multi-purpose, low-background liquid scintillator detector located in the SNOLAB facility. This talk will present our progress towards the main goal of SNO+: probing the mass and nature of neutrinos through a search for neutrinoless double beta decay. By loading large amounts of natural tellurium into a homogeneous liquid scintillator detector, SNO+ is pioneering an affordable and extendable approach to this rare-decay search with the isotope 130Te. I will also discuss the wider physics reach of SNO+, including reactor, solar and supernova neutrinos and invisible nucleon decay. I will present results from the previous water-phase operations and the current status of scintillator filling, tellurium plant preparation and background studies.
-
Managing the COVID-19 Pandemic across Geography and Demography
Niayesh Afshordi University of Waterloo
What factors drive the growth and decay of a pandemic? Can a study of community differences (in demographics, settlement, mobility, weather, and epidemic history) allow these factors to be identified? Has “herd immunity” to COVID-19 been reached anywhere? What are the best steps to manage/avoid future outbreaks in each community? We analyzed the entire set of local COVID-19 epidemics in the United States; a broad selection of demographic, population density, climate factors, and local mobility data, in order to address these questions. What we found will surprise you! (based on arXiv:2007.00159)
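The growth/decay and "herd immunity" questions above can be illustrated with a minimal SIR integration (illustrative parameters, not the paper's fitted model): infections grow only while the susceptible fraction exceeds 1/R0, so the epidemic peaks exactly at the classic herd-immunity threshold.

```python
import numpy as np

# Minimal SIR sketch (illustrative parameters, not the paper's analysis).
# Infections grow only while the susceptible fraction s exceeds 1/R0, so the
# epidemic peaks when s crosses the herd-immunity threshold.
R0, gamma, dt = 2.5, 0.1, 0.05     # reproduction number, recovery rate, time step
beta = R0 * gamma                  # transmission rate
s, i = 0.999, 0.001                # initial susceptible / infected fractions
s_hist, i_hist = [s], [i]
for _ in range(8000):
    ds = -beta * s * i * dt
    di = (beta * s * i - gamma * i) * dt
    s, i = s + ds, i + di
    s_hist.append(s)
    i_hist.append(i)

s_at_peak = s_hist[int(np.argmax(i_hist))]
print(f"herd-immunity threshold 1 - 1/R0 = {1 - 1/R0:.2f}")
print(f"susceptible fraction at the epidemic peak: {s_at_peak:.2f} (1/R0 = {1/R0:.2f})")
```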
-
Challenges for dark matter detection
Marie-Cécile Piro
The searches aimed at solving the greatest mysteries of our Universe require ultra-sensitive detectors and extreme control of the environment and the background in order to detect a rare signal. Over the last decades, technologies have reached such unprecedented sensitivity levels that never-before-seen background signals must be considered. In this talk I will give an overview of the requirements for low-background detection and of the current R&D efforts to develop new cutting-edge technologies that address the common challenges of these experiments and push the limits of detector performance.
-
Dark and shiny dresses around black holes
Daniele Gaggero (UAM)
The discovery of gravitational-wave signals from merger events of massive binary-black-hole (BBH) systems has prompted a renewed debate in the scientific community about the existence of primordial black holes (PBHs) of O(1-100) solar masses. These objects may have formed in the early Universe and could constitute a significant portion of the elusive dark matter that, according to standard cosmology, makes up the majority of the matter content in the universe. I will review the most recent developments in this field, with a focus on multi-messenger prospects for detection. In the first part of the talk, I will present the prospects of discovery for both a hypothetical PBH population and the guaranteed population of astrophysical isolated black holes in our Galaxy, based on the radio and X-ray emission from the interstellar gas that is being accreted onto them (the “shiny dresses”). A future detection will be possible thanks to the expected performance of forthcoming radio facilities such as SKA and ngVLA. Then, I will turn my attention to scenarios where primordial black holes constitute a sub-dominant component of the dark matter, and study the impact of the dark matter mini-spikes that are expected to form around them (the “dark dresses”) on several observables. In this context, I will first present an updated computation of the PBH merger rate as a function of DM fraction and redshift that takes into account the impact of the dark dresses. Then, I will discuss the observational prospects of these dresses in binary systems composed of a stellar-mass and an intermediate-mass black hole: I will show a novel calculation of the dephasing of the gravitational waveform induced by the DM spike, potentially detectable with the LISA space interferometer.
July 6, 2020. Zoom: https://laurentian.zoom.us/j/92591146494
-
Opportunities and challenges in precision physics at the LHC
Lorenzo Tancredi
After the discovery of the Higgs boson in 2012, the Large Hadron Collider (LHC) at CERN has turned from a discovery machine into a precision machine. The highly boosted events measured by the LHC experiments are, for the first time, providing us with a window on the details of the electroweak symmetry breaking mechanism. A crucial condition for maximising the reach of these studies is a profound understanding of the theoretical implications of perturbative Quantum Field Theory, and in particular of Quantum ChromoDynamics (QCD), for the physics of hadronic collisions at the LHC. In this talk, I will give an account of the opportunities and challenges that precision physics at the LHC can offer, focusing in particular on recent developments in our understanding of higher-order calculations in perturbative Quantum Field Theory and how they can help us understand the Higgs sector of the Standard Model.
-
Probing GRB physics through high-energy observations with Fermi
Elisabetta Bissaldi
The Fermi Gamma-ray Space Telescope has provided unique insights into the Universe's biggest explosions over the past 12 years. With thousands of gamma-ray bursts (GRBs) detected by the Gamma-ray Burst Monitor (GBM) and hundreds by the Large Area Telescope (LAT), we have learned about the broad properties of the populations of these events and gained unique insights into their emission mechanisms, environments, and physical properties. In this seminar, I'll review highlights of GRB science from the Fermi mission at low (keV) and high (GeV) energies, as well as the recent discovery of very-high-energy (TeV) emission from GRB 180720B and GRB 190114C, observed by the Cherenkov telescopes of the H.E.S.S. and MAGIC experiments, respectively.
-
Reinforcement Learning assisted Quantum Optimization
Matteo Wauters SISSA International School for Advanced Studies
We propose a reinforcement learning (RL) scheme for feedback quantum control within the quantum approximate optimization algorithm (QAOA). QAOA requires a variational minimization over states constructed by applying a sequence of unitary operators, depending on parameters living in a high-dimensional space. We reformulate this minimum search as a learning task, where an RL agent chooses the control parameters for the unitaries, given partial information on the system. We show that our RL scheme learns a policy converging to the optimal adiabatic solution for QAOA found by Mbeng et al. (arXiv:1906.08948) for the translationally invariant quantum Ising chain. In the presence of disorder, we show that our RL scheme allows the training to be performed on small samples and transferred successfully to larger systems. Finally, we discuss QAOA on the p-spin model and how its robustness is enhanced by reinforcement learning. Despite the possibility of finding the ground state with polynomial resources even in the presence of a first-order phase transition, local optimizations in the p-spin model suffer from the presence of many minima in the energy landscape. RL helps to find regular solutions that can be generalized to larger systems and makes the optimization less sensitive to noise.
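The QAOA layer structure the RL agent steers can be sketched in a few lines of numpy. The toy below is an illustration of plain QAOA only (not the talk's RL scheme): one layer for MaxCut on a single edge (two qubits), where p = 1 already attains the optimal cut, with a grid search standing in for the outer parameter optimization.

```python
import numpy as np

# Minimal p=1 QAOA for MaxCut on a single edge (2 qubits): the trial state is
# U_mix(beta) U_cost(gamma) |++>, and we grid-search the two angles.
c = np.array([0.0, 1.0, 1.0, 0.0])      # cut values of |00>, |01>, |10>, |11>

def qaoa_cut(gamma, beta):
    psi = np.full(4, 0.5, dtype=complex)             # |++> state
    psi = np.exp(-1j * gamma * c) * psi              # cost layer (diagonal)
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    psi = np.kron(rx, rx) @ psi                      # mixer e^{-i beta X} on each qubit
    return float(np.sum(c * np.abs(psi) ** 2))       # expected cut value

angles = np.linspace(0, np.pi, 60)
best = max(qaoa_cut(g, b) for g in angles for b in angles)
print(f"best expected cut at p=1: {best:.3f} (optimum = 1)")
```

The optimum sits near gamma = pi/4, beta = pi/8; for larger graphs and depths the angle landscape develops the many local minima that motivate smarter (e.g. RL-assisted) schedules.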
-
Quantum homeopathy works: Efficient unitary designs with a system-size independent number of non-Clifford gates
Ingo Roth Freie Universität Berlin
Many quantum information protocols require the implementation of random unitaries. Because it takes exponential resources to produce Haar-random unitaries drawn from the full n-qubit group, one often resorts to t-designs. Unitary t-designs mimic the Haar measure up to its t-th moments. It is known that Clifford operations can implement at most 3-designs. In this work, we quantify the non-Clifford resources required to break this barrier. We find that it suffices to inject O(t^4 log^2(t) log(1/ε)) many non-Clifford gates into a polynomial-depth random Clifford circuit to obtain an ε-approximate t-design. Strikingly, the number of non-Clifford gates required is independent of the system size – asymptotically, the density of non-Clifford gates is allowed to tend to zero. We also derive novel bounds on the convergence time of random Clifford circuits to the t-th moment of the uniform distribution on the Clifford group. Our proofs exploit a recently developed variant of Schur-Weyl duality for the Clifford group, as well as bounds on restricted spectral gaps of averaging operators. Joint work with J. Haferkamp, F. Montealegre-Mora, M. Heinrich, J. Eisert, and D. Gross.
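A quick numerical illustration (mine, not from the talk) of the benchmark a t-design must reproduce: the frame potential E|tr W|^(2t) of Haar-random unitaries on U(d) equals t! for d ≥ t. The sketch samples Haar unitaries via the standard QR decomposition with phase correction and estimates the first two frame potentials by Monte Carlo.

```python
import numpy as np

# Monte Carlo frame potential F_t = E_W |tr(W)|^(2t) for Haar-random W on U(d).
# For the Haar measure (d >= t), F_t = t!; a unitary t-design must match this.
rng = np.random.default_rng(0)

def haar_unitary(d):
    # Ginibre matrix -> QR, with phases of R's diagonal folded back into Q
    # so that the resulting Q is Haar-distributed.
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

d, n_samples = 8, 5000
traces = np.array([np.trace(haar_unitary(d)) for _ in range(n_samples)])
F1 = np.mean(np.abs(traces) ** 2)
F2 = np.mean(np.abs(traces) ** 4)
print(f"F_1 ~= {F1:.2f} (Haar value 1),  F_2 ~= {F2:.2f} (Haar value 2)")
```

Replacing the Haar samples with uniformly random Clifford circuits would reproduce F_1 through F_3 but not higher moments, which is exactly the barrier the non-Clifford injections break.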
-
Cayley path and quantum supremacy: Average case #P-Hardness of random circuit sampling
Ramis Movassagh MIT-IBM Watson AI Lab
Given the large push by academia and industry (e.g., IBM and Google), quantum computers with hundred(s) of qubits are on the brink of existence, with the promise of outperforming any classical computer. Demonstrating the computational advantage of noisy near-term quantum computers over classical computers is an imperative near-term goal. The foremost candidate task for showing this is Random Circuit Sampling (RCS), which is the task of sampling from the output distribution of a random circuit. This is exactly the task that Google recently performed experimentally on 53 qubits.
Stockmeyer's theorem implies that efficient sampling allows for estimation of probability amplitudes. Therefore, hardness of probability estimation implies hardness of sampling. We prove that estimating probabilities to within small errors is #P-hard on average (i.e. for random circuits), and put the results in the context of previous works.
Some of the ingredients developed to make this proof possible are the construction of the Cayley path, a rational-function-valued unitary path that interpolates between two arbitrary unitaries; an extension of the Berlekamp-Welch algorithm that efficiently and exactly interpolates rational functions; and the construction of probability distributions over unitaries that are arbitrarily close to the Haar measure.
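For intuition (far below any supremacy scale), the output distribution at the heart of RCS can be computed exactly for a toy circuit. The sketch below is an illustration, not the talk's construction: brickwork layers of Haar-random two-qubit gates on five qubits, whose individual output probabilities are the quantities shown to be #P-hard to estimate on average.

```python
import numpy as np

# Exact output distribution of a small random circuit: n = 5 qubits,
# brickwork layers of Haar-random two-qubit gates on a line.
rng = np.random.default_rng(1)
n, depth = 5, 10

def haar_unitary(d):
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_2q(psi, U, i, j):
    # Apply a two-qubit gate U to qubits i, j of the state vector psi.
    psi = np.moveaxis(psi.reshape((2,) * n), (i, j), (0, 1))
    shape = psi.shape
    psi = (U @ psi.reshape(4, -1)).reshape(shape)
    return np.moveaxis(psi, (0, 1), (i, j)).reshape(-1)

psi = np.zeros(2 ** n, dtype=complex)
psi[0] = 1.0                                          # start in |00000>
for layer in range(depth):
    for i in range(layer % 2, n - 1, 2):              # brickwork pattern
        psi = apply_2q(psi, haar_unitary(4), i, i + 1)

p = np.abs(psi) ** 2
print(f"sum of probabilities: {p.sum():.6f}")
print(f"collision statistic 2^n * sum(p^2): {2**n * (p**2).sum():.2f}")
```

For a well-scrambled circuit the collision statistic approaches the Porter-Thomas value ~2, the signature of the exponentially flat-but-spiky distributions that make classical spoofing hard.
-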
Classical algorithms, correlation decay, and complex zeros of partition functions of quantum many-body systems
Mehdi Soleimanifar California Institute of Technology (Caltech)
Basic statistical properties of quantum many-body systems in thermal equilibrium can be obtained from their partition function. In this talk, I will present a quasi-polynomial time classical algorithm that estimates the partition function of quantum many-body systems at temperatures above the thermal phase transition point. It is known that in the worst case, the same problem is NP-hard below this temperature. This shows that the transition in the phase of a quantum system is also accompanied by a transition in the computational hardness of estimating its statistical properties. The key to this result is a characterization of the phase transition and the critical behavior of the system in terms of the complex zeros of the partition function. I will also discuss the relation between these complex zeros and another signature of the thermal phase transition, namely, the exponential decay of correlations. I will show that in a system of n particles above the phase transition point, where the complex zeros are far from the real axis, the correlation between two observables whose distance is at least log(n) decays exponentially. This is based on joint work with Aram Harrow and Saeed Mehraban.
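The correlation-decay statement can be checked on a toy model. The exact-diagonalization sketch below is an illustration with assumed parameters, not the talk's general setting: at small β (high temperature, i.e., above the transition point) the connected ⟨Z0 Zr⟩ correlator of a small transverse-field Ising chain falls off rapidly with distance r.

```python
import numpy as np

# Thermal two-point correlations in a transverse-field Ising chain,
# H = -J sum_i Z_i Z_{i+1} - h sum_i X_i, via exact diagonalization.
# At high temperature (small beta) <Z_0 Z_r> decays quickly with r.
# (<Z_i> vanishes by spin-flip symmetry, so this is the connected correlator.)
n, J, h, beta = 8, 1.0, 1.0, 0.3
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def site_op(op, i):
    ops = [I2] * n
    ops[i] = op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

H = sum(-J * site_op(Z, i) @ site_op(Z, i + 1) for i in range(n - 1))
H = H + sum(-h * site_op(X, i) for i in range(n))
evals, evecs = np.linalg.eigh(H)
w = np.exp(-beta * (evals - evals.min()))
w /= w.sum()                                   # Gibbs weights
rho = (evecs * w) @ evecs.conj().T             # thermal state

corr = [np.trace(rho @ site_op(Z, 0) @ site_op(Z, r)).real for r in (1, 2, 3, 4)]
print("  ".join(f"<Z0 Z{r}> = {c:+.4f}" for r, c in zip((1, 2, 3, 4), corr)))
```

Lowering the temperature toward the transition (larger β) slows this decay, mirroring the link the talk draws between the zero-free region and the correlation length.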