Topic: Particle physics
Physics at the Large Hadron Collider (LHC) at CERN (European Organization for Nuclear Research) is a high-priority research field of the worldwide particle physics community. ATLAS is one of the two general-purpose experiments installed at the LHC; in July 2012 it discovered a Higgs boson, a key piece in the understanding of the fundamental interactions and of the origin of elementary particle mass. Its physics program extends beyond the measurement of the Higgs boson properties to the search for signs of physics beyond the Standard Model of particle physics. The upgrade of the LHC is a crucial part of the European strategy for particle physics. The second phase of the LHC upgrade, scheduled for 2025, will increase the instantaneous luminosity by an order of magnitude, leading to the High-Luminosity LHC (HL-LHC). The increased luminosity puts more stringent requirements on the electronics and data processing of the LHC detectors. The ATLAS detector will undergo a major upgrade to cope with the increased luminosity at the HL-LHC, and hence with the dramatic increase of produced data. In order to process this huge amount of data (more than 500 Tb/s) on the fly with advanced algorithms, we propose to deploy advanced technologies based on state-of-the-art digital electronics running AI algorithms.
Artificial Intelligence (AI) algorithms and machine learning techniques are nowadays among the fastest-expanding fields in research and in industry. The use of AI in experimental particle physics is not new, but these algorithms have so far only been used in the later stages of data analysis, such as the analyses leading to the recent discovery of the Higgs boson coupling to third-generation quarks [1,2], to which the ATLAS group of the “Centre de Physique des Particules de Marseille” (CPPM) made a major contribution, namely the development of AI techniques for these analyses. For data acquisition and trigger applications, relatively simple algorithms are embedded in the hardware to process the huge data flow on the fly in a timely manner. However, with the next generation of high-end Field Programmable Gate Arrays (FPGAs), which offer a large increase in available processing and memory units, it is becoming possible to implement complex AI algorithms inside these FPGAs and process big data flows on the fly with dramatically increased selection performance.
The ATLAS group of the “Centre de Physique des Particules de Marseille” (CPPM) is deeply involved in this scientific program, in particular through its expertise on the electromagnetic calorimeter. The latter is a key component for the identification and energy measurement of electrons and photons, which were at the core of the Higgs boson discovery. Moreover, in view of the upgrade of the accelerator performance foreseen in 2021, this calorimeter has a major ongoing development program to dramatically upgrade its trigger and readout, to which the CPPM group actively contributes.
The main purpose of this project is to develop AI and machine learning techniques to dramatically improve the effectiveness of big data processing, such as that needed in the high-pileup environment at the LHC. The main challenge is to efficiently implement these techniques in the dedicated data acquisition electronics, based on FPGAs, which are used for signal processing in particle physics detectors such as the ATLAS Liquid Argon (LAr) calorimeter and which are under development by the ATLAS CPPM group.
The signals from the LAr calorimeter are processed through a chain of electronic boards in order to extract the energy deposited in the calorimeter. The new electronic chain for the second phase of the LHC upgrade is described in [3]. An excellent resolution on the deposited energy and an accurate measurement of the deposition time, in the blurred environment created by the pileup, are crucial for the operation of the calorimeters and of the full ATLAS detector to enhance its physics discovery potential. The computation of the deposited energy and time is currently done using optimal filtering algorithms [4]. These filter algorithms are perfectly adapted to ideal situations with low noise. However, with the increased luminosity and thus the noise from pileup, the performance of the filter algorithms decreases significantly, and no further extension or tuning of these filters can recover the loss in performance.
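To make the idea concrete, the optimal filter of [4] estimates the deposited energy as a weighted sum of the digitized samples, with coefficients chosen to minimize the noise variance while keeping unit gain on the known pulse shape. The following NumPy sketch is purely illustrative: the pulse shape, noise autocorrelation, and signal are made-up toy values, not ATLAS data.

```python
import numpy as np

# Normalised pulse shape over 5 samples (assumed, for illustration only).
g = np.array([0.0, 0.5, 1.0, 0.6, 0.2])

# Toy noise autocorrelation matrix (assumed): correlated neighbouring samples.
C = np.eye(5) + 0.3 * np.eye(5, k=1) + 0.3 * np.eye(5, k=-1)

# Optimal-filter coefficients: a = C^-1 g / (g^T C^-1 g),
# which minimises the noise variance subject to a @ g == 1.
Cinv = np.linalg.inv(C)
a = Cinv @ g / (g @ Cinv @ g)

# Toy digitized signal: true energy 50 (arbitrary units) plus small noise.
samples = 50.0 * g + np.array([0.3, -0.2, 0.1, 0.0, -0.1])
energy = a @ samples  # estimated deposited energy, close to 50
```

The unit-gain constraint (`a @ g == 1`) is what guarantees an unbiased energy estimate; it is precisely this fixed linear structure that cannot be tuned further when pileup noise grows, which motivates the AI approach.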
AI algorithms have proven to be very powerful tools in data processing and are the most promising candidates to recover the performance of the filter algorithms in high-noise conditions. FPGAs, which are designed to efficiently treat a large amount of data in a very short time, are well adapted to the online data processing needed at the LHC, especially at the trigger level. Until recently, FPGAs had a relatively limited amount of computational resources; however, high-end FPGAs now have enough resources to accommodate the needs of advanced AI and deep learning algorithms. This makes it possible to combine the performance of AI algorithms with the speed and high bandwidth of FPGAs to efficiently process the big data flow on electronic boards.
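One practical aspect of running AI algorithms on FPGAs is that they typically operate on fixed-point rather than floating-point numbers, so weights and activations must be quantized. The snippet below is a hedged sketch of such a quantization step (bit widths and weight values are illustrative assumptions, not the LASP firmware design):

```python
import numpy as np

def quantize(x, frac_bits=6, total_bits=8):
    """Round to a signed fixed-point grid with `frac_bits` fractional bits,
    clipping to the representable range of `total_bits`-bit integers."""
    scale = 2 ** frac_bits
    lo = -(2 ** (total_bits - 1))
    hi = 2 ** (total_bits - 1) - 1
    return np.clip(np.round(x * scale), lo, hi) / scale

# Example weights (hypothetical values, e.g. filter or network coefficients).
w = np.array([-0.09, 0.24, 0.69, 0.28, 0.08])
wq = quantize(w)
# The rounding error per weight is at most half a grid step, i.e. 2**-7 here.
```

Choosing the bit widths is a trade-off: narrower fixed-point words save FPGA logic and DSP resources but add quantization noise, which is one of the optimizations mentioned in the objectives below.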
The back-end electronic boards for the second phase of the upgrade of the LAr calorimeter (called LASP) will use the next generation of high-end FPGAs. Based on the unique skills and expertise present at CPPM in digital electronics, a prototype of these boards is currently being developed at CPPM and will be finalized in 2020. This prototype will be equipped with two high-end latest-generation FPGAs from INTEL/ALTERA (worldwide leader in FPGA production and part of the INTEL group [5]). The research and development (R&D) program of these boards and their production is already financed as part of the government “Très Grandes Infrastructures de Recherche” (TGIR) program for the upgrade of the ATLAS detector. The aim of this project is to take advantage of this unique opportunity to develop the necessary tools enabling the embedding of AI algorithms on these boards and to further explore the outstanding capabilities opened by these developments for new applications. This can prove to be a breakthrough extending to many areas facing big data processing, in particle physics, especially at the trigger level, and in industry.
The objectives of this project can be divided into six main points:
1. Develop AI methods adapted to the specific problem of signal processing to compute the deposited energy in the calorimeter in high noise conditions.
2. Optimize these methods and compare them with the existing filter algorithms using simulated data that reflect the conditions of the LHC after the upgrade.
3. Adapt the algorithms for processing on FPGAs and optimize the needed processing power while keeping high performance.
4. Investigate and adapt the recent tools under development for converting AI algorithms into the HDL code used to program FPGAs.
5. Test the performance of the algorithms in situ using the LASP board prototype currently under construction at CPPM.
6. Generalize the developed tools and study their wider usage for trigger processing and for applications outside the particle physics field including industrial applications.
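To illustrate objectives 1 and 2, the sketch below regresses the deposited energy from a window of noisy digitized samples with a small dense network. Everything here is an assumption for illustration: the pulse shape, noise level, network size, and the shortcut of fitting only the output layer by least squares on fixed random hidden features (so the example stays short and deterministic) are not the actual ATLAS algorithm under study.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy dataset: energy times an assumed pulse shape, plus pileup-like noise.
g = np.array([0.0, 0.5, 1.0, 0.6, 0.2])
E_true = rng.uniform(0.0, 100.0, size=1000)
X = E_true[:, None] * g + rng.normal(0.0, 2.0, size=(1000, 5))

# Fixed random hidden layer (5 inputs -> 16 ReLU units), then fit the
# output weights by linear least squares on the hidden activations.
W1 = rng.normal(0.0, 0.3, size=(5, 16))
H = relu(X @ W1)
w2, *_ = np.linalg.lstsq(H, E_true, rcond=None)

# Predicted energies and their mean absolute error versus the truth.
E_pred = relu(X @ W1) @ w2
mae = float(np.mean(np.abs(E_pred - E_true)))
```

In the real project the network would be trained end-to-end on simulation reflecting HL-LHC pileup conditions and benchmarked against the optimal filter, before being pruned and quantized to fit the FPGA resource budget.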
Although this is an experimental particle physics project, its reach extends to any field profiting from big data processing using AI algorithms on dedicated hardware such as FPGAs. This project provides a unique opportunity to enhance the multidisciplinary and industrial applications of the research and development programs at CPPM. Given the promising industrial applications, a collaboration is being developed with Nexvision [6], a midsize company based in Marseille and a leader in embedded civil security systems. Support from the Aix-Marseille University Initiative of Excellence program, so-called AMIDEX, has already been obtained for this project. Thanks to the AMIDEX funding, a new postdoc joined the CPPM group in January 2020 to work on this project for two years.
In this framework, the subject of this M2 internship is to participate in the development of the needed AI algorithms within the CPPM group, in close interaction with a postdoc and PhD students. More specifically, the student will implement selected algorithms in the ATLAS event reconstruction framework. He or she will then study the performance of these algorithms on simulated events and compare them to the energy reconstruction method currently in use. Finally, the aim will be to measure their impact on electron and photon identification in different data-taking conditions. During the internship, the student may have to travel to CERN to interact with experts and present this work. The research work will combine analysis of real and simulated data as well as studies and operation of experimental systems. This internship can then naturally evolve into a thesis (see the marwww.in2p3.fr web site for its description).
[1] The ATLAS Collaboration, Observation of Higgs boson production in association with a top quark pair at the LHC with the ATLAS detector, Phys. Lett. B 784 (2018) 173-191; arXiv:1806.00425.
[2] The ATLAS Collaboration, Observation of H->bb decays and VH production with the ATLAS detector, Phys. Lett. B 786 (2018) 59-86; arXiv:1808.08238.
[3] The ATLAS Collaboration, Technical Design Report for the Phase-II Upgrade of the ATLAS LAr Calorimeter, CERN-LHCC-2017-018, https://cds.cern.ch/record/2285582.
[4] W.E. Cleland and E.G. Stern, Signal processing considerations for liquid ionization calorimeters in a high rate environment, NIM A 338 (1994) 467.
[5] ALTERA, world-leading company in FPGA production, recently acquired by INTEL, https://www.intel.com/content/www/us/en/products/programmable.html
[6] NEXVISION SAS, French company based in Marseille, specialized in electronic reference design, https://nexvision.fr/