Observations of the cosmic microwave background (CMB) have played a critical role in establishing the current cosmological concordance model. Next-generation CMB experiments promise to go even further and image the very birth of the universe by observing primordial gravitational waves created during the Big Bang. However, observing this signal will be a tremendous challenge, as the expected signal can easily be contaminated by systematic uncertainties, both from confusing radiation from the Milky Way and from instrumental imperfections. In this talk, I will describe the BeyondPlanck project, which developed the world's first end-to-end Bayesian CMB analysis code, accounting for both instrumental and astrophysical uncertainties, and applied it to the Planck LFI data set. I will argue not only that this approach sets a new standard for CMB analysis, but also that the main ideas are applicable to any experiment for which systematic error propagation is the main limiting factor.
The Deep Underground Neutrino Experiment (DUNE) aims to make precise measurements of long-baseline neutrino oscillations over a 1300 km baseline. A high-power, wide-band beam operating in neutrino (antineutrino) mode will be produced at Fermilab, with its flux and flavour composition characterised by the Near Detector system. At the Sanford Underground Research Facility (SURF), 1300 km away and deep underground, four gigantic Far Detector modules (with 70 kton total mass) will observe νμ (ν̄μ) disappearance and νe (ν̄e) and ντ (ν̄τ) appearance. In doing so, DUNE will be able to determine the neutrino mass ordering (at more than 5 sigma), measure the CP-violating phase over a wide range of values, precisely measure the oscillation parameters, and test the 3-flavour paradigm. With gigantic Far Detectors deep underground, DUNE will also be able to detect neutrinos from a Galactic core-collapse supernova, should one occur, and search for nucleon decay and other physics beyond the Standard Model.
The DUNE Far Detectors will be Liquid Argon Time Projection Chambers (LArTPCs). Such gigantic detectors require large-scale prototyping, not only to determine their performance but also to validate the technology, engineering solutions, and installation procedures at full scale. A significant prototyping effort is ongoing at the CERN Neutrino Platform, where two TPC technologies have been explored at the kton scale with the Single-Phase and Dual-Phase ProtoDUNEs. A hybrid solution, building on the strengths of both, is also being explored.
In this talk I will discuss the physics objectives of DUNE and the far detectors necessary to achieve them.
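The disappearance measurement described above can be illustrated with a two-flavour approximation of the muon-neutrino survival probability. This is only a sketch, not DUNE's actual analysis code, and the oscillation parameter values below are illustrative defaults, not fitted DUNE results:

```python
import math

def survival_prob_numu(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.95):
    """Two-flavour approximation of the nu_mu survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.267 * dm2 * L / E),
    with L in km, E in GeV, and dm2 in eV^2.
    Parameter defaults are illustrative, not DUNE's measured values."""
    phase = 1.267 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# For DUNE's 1300 km baseline, the first oscillation maximum (strongest
# nu_mu disappearance) sits near a beam energy of a few GeV:
p = survival_prob_numu(1300.0, 2.5)
```

Comparing how this probability differs between neutrinos and antineutrinos, as a function of energy, is the essence of the CP-violation measurement the abstract refers to.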
What the universe is made of is a fundamental question, and yet dark matter, which makes up more than 80% of the matter in the universe, is of unknown nature. One approach to identifying dark matter is to search for cosmic antiparticles produced when dark matter annihilates or decays. A "smoking gun" signature is desirable, however, so that dark-matter-generated cosmic particles are not confused with conventional cosmic rays. Low-energy antideuterons have long been known to provide such a signature. The General Antiparticle Spectrometer (GAPS) is the only experiment optimized specifically to search for low-energy (< 0.25 GeV/n) cosmic antiprotons, antideuterons, and antihelium. Its goals are (i) to deliver a first-time detection of cosmic antideuterons, an unambiguous signal of new physics that probes a wide array of dark matter models, or to improve upon previous antideuteron limits by two orders of magnitude; (ii) to provide a precision antiproton spectrum in a previously unexplored energy region, permitting leading constraints on light dark matter, the best limits on primordial black hole evaporation on Galactic length scales, and novel constraints on cosmic-ray propagation models; and (iii) to investigate recent AMS claims of evidence for cosmic antihelium. GAPS will execute three ultra-long-duration balloon flights from Antarctica. I will review the current status of antimatter searches for dark matter and discuss progress on building the GAPS experiment.
The idea to use neutrinos as cosmic messengers dates back to the 1950s, and the concept of Cherenkov neutrino telescopes deep underwater to the year 1960. It took more than half a century before a detector large enough to detect cosmic high-energy neutrinos was built: the IceCube Neutrino Telescope at the South Pole. I will describe the way towards IceCube's construction and sketch IceCube's main results, starting with the discovery of a diffuse flux of cosmic neutrinos in 2013. I will also attempt a look into the near future, in which IceCube will be flanked by telescopes of similar size in the Mediterranean Sea (KM3NeT) and in Lake Baikal (Baikal-GVD).
The lepton flavor violating (LFV) decay μ → eee is highly suppressed in the Standard Model (SM), to an unobservable level; the observation of this LFV decay would therefore be a clear signal of physics beyond the SM. The Mu3e collaboration aims to improve the experimental sensitivity by several orders of magnitude with respect to the existing bound B(μ → eee) < 1×10⁻¹² (90% CL), obtained by the SINDRUM experiment in 1988.
After motivating searches for lepton flavor violation, I will introduce the novel detector concept and discuss the technologies chosen for the instrumentation. Emphasis will be placed on the main tracking device, an all-pixel tracking detector based on High Voltage Monolithic Active Pixel Sensors (HV-MAPS), which exploits an ultra-light design with a material budget of only about 0.1% of a radiation length per tracking layer.
Finally, the status of the experiment and the planned 2021 integration run will be presented.
I will start with an introduction to the status of cosmology today and, more specifically, to how observations of the late universe and the three-dimensional maps built by galaxy surveys can help us characterise the mysterious dark energy. I will then present how cosmic voids, the extremely large underdense regions of the cosmic web, are becoming a promising probe of the underlying matter field, and will discuss in more detail the methodology and results on the gravitational lensing signal of cosmic voids (from both background galaxies and the cosmic microwave background) obtained with the Dark Energy Survey galaxy catalogues.
Millisecond pulsars are rapidly rotating neutron stars with phenomenal rotational stability. The NANOGrav collaboration monitors an array of about 80 of these cosmic clocks in order to detect perturbations due to gravitational waves at nanohertz frequencies. These gravitational waves will most likely result from an ensemble of supermassive black hole binaries. Their detection and subsequent study will offer unique insights into galaxy growth and evolution over cosmic time. I will present our most recent dataset and the results of our gravitational wave analysis, which suggests the presence of a common signature in the data that could be the first hints of a gravitational wave background. I will then describe the gains in sensitivity that are expected from additional data, discoveries of millisecond pulsars, more sensitive instrumentation, and international collaboration and discuss prospects for detection in the next several years.
T2K is an accelerator-based long-baseline neutrino oscillation experiment that has been taking data since 2010 in Japan. The neutrino beam is produced at the J-PARC accelerator complex, and the neutrinos are detected in a Near Detector complex (ND280) before the oscillations and at the Far Detector (Super-Kamiokande) after the oscillations.
T2K was the first experiment to measure oscillations in the appearance channel and, as will be shown in this seminar, is now observing first hints of CP violation in the leptonic sector by comparing the appearance probabilities of electron neutrinos and antineutrinos.
These hints are currently limited by statistical uncertainties, and T2K is now entering its second phase (T2K-II), consisting of an upgrade of the accelerator complex and of ND280. These upgrades are expected to be operational in 2022 and will allow T2K-II to establish CP violation at more than 3 sigma if CP is maximally violated in neutrino oscillations.
T2K-II will be followed by Hyper-K, a water Cherenkov detector 8 times larger than Super-K. Hyper-K will use the same accelerator complex and near detectors as T2K and is expected to start data taking in 2027.
Thanks to its large size, Hyper-K will have unprecedented sensitivity to CP violation and to proton decay, and it will be a powerful observatory for atmospheric and solar neutrinos and for neutrinos emitted in supernova explosions.
In recent years, tensions between measurements and Standard Model predictions in the decays of b-hadrons have hinted at the possible violation of lepton universality, specifically in observables in b → sℓℓ and b → cℓν transitions. Among them is the ratio of branching fractions R(D*)τμ = B(B → D*τν) / B(B → D*μν). I will discuss the first measurement of the ratio R(D*)eμ = B(B → D*eν) / B(B → D*μν) at the LHCb experiment at CERN, which will lead towards a combined measurement of all three lepton species in the future.
The LHCb experiment is currently being upgraded for Run 3 of the LHC to collect larger data samples and thereby reduce the uncertainties of observables testing, for example, lepton flavor universality. After this upgrade, LHCb will run without a hardware-level trigger, with the complete detector read out at the full bunch-crossing rate of 30 MHz and a maximum data rate of 40 Tbit/s. Events of interest are selected with a software-only trigger in two stages. This allows unprecedented flexibility in trigger selections but at the same time poses a significant computing challenge.
In this seminar, I will also present the first complete high-throughput trigger implemented entirely on graphics processing units (GPUs) for an HEP experiment. In LHCb's High Level Trigger 1 (HLT1), charged-particle trajectories and decay vertices are reconstructed to select pp collisions of interest and reduce the event rate by a factor of 30-60. The full HLT1 will be processed on about 200 state-of-the-art GPUs from 2022 onwards. I will discuss the software framework, reconstruction algorithms, and performance of the GPU HLT1, as well as ongoing developments towards commissioning.
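The data rates quoted above can be tied together with some back-of-the-envelope arithmetic. This is purely illustrative (it is not LHCb software, and the implied event size is an average derived from the quoted figures, not an official number):

```python
# Back-of-the-envelope check of the LHCb HLT1 figures quoted above.

bunch_crossing_rate_hz = 30e6    # full readout rate: 30 MHz
input_data_rate_bits = 40e12     # maximum data rate into HLT1: 40 Tbit/s

# Implied average event size at the maximum data rate:
event_size_bits = input_data_rate_bits / bunch_crossing_rate_hz
event_size_kB = event_size_bits / 8 / 1000   # roughly 167 kB per event

# An event-rate reduction by a factor of 30-60 implies an HLT1 output
# rate of roughly 0.5-1 MHz, which the second trigger stage then processes:
output_rate_min_hz = bunch_crossing_rate_hz / 60
output_rate_max_hz = bunch_crossing_rate_hz / 30
```

Numbers like these make the computing challenge concrete: every GPU in the farm must sustain reconstruction at a rate of order 100 kHz of events.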