# Posters

2022-01-13
14:33
NEW GENERATION OFFLINE SOFTWARE FOR THE LHCb UPGRADE I
 Reference: Poster-2022-1063 Created: 2022. -1 p Creator(s): Ferrillo, Martina The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC's Run 3, which is scheduled to begin in 2022. The increased data rate in Run 3 poses significant data-processing and data-handling challenges for the LHCb experiment. The offline computing and dataflow model is consequently also being upgraded to cope with the factor-30 increase in data volume and the associated demands of ever-larger user-data samples. Coordinating these efforts is the charge of the newly created Data Processing and Analysis (DPA) project. The DPA project is responsible for ensuring that the LHCb experiment can efficiently exploit the Run 3 data: it deals with the data from the online system through central skimming/slimming (a process known as "Sprucing") and subsequently produces analyst-level ntuples with a centrally managed production system (known as "Analysis Productions"), utilising improved analysis tools and infrastructure for continuous integration and validation. It is a multi-disciplinary project involving collaboration between computing experts, trigger experts and physics analysis experts. This talk will present the evolution of the data processing model, followed by a review of the various activities of the DPA project. The associated computing, storage and network requirements are also discussed. © CERN Geneva
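The two operations that make up Sprucing, skimming (keeping or dropping whole events) and slimming (keeping only part of each surviving event), can be sketched in a few lines. This is purely an illustration; all field and trigger-line names below are hypothetical and do not reflect the actual LHCb event model:

```python
# Illustrative sketch of central skimming ("keep or drop whole events") and
# slimming ("keep only a subset of information per event"), in the spirit of
# the Sprucing step described above. Field and line names are invented.

def spruce(events, selected_lines, keep_fields):
    """Skim events by trigger line, then slim the surviving events."""
    spruced = []
    for event in events:
        # Skimming: keep only events that fired one of the selected lines.
        if not (event["trigger_lines"] & selected_lines):
            continue
        # Slimming: retain only the fields analysts actually need.
        spruced.append({k: event[k] for k in keep_fields})
    return spruced

events = [
    {"trigger_lines": {"Hlt2_BToKpi"}, "tracks": [1, 2, 3], "raw_banks": "..."},
    {"trigger_lines": {"Hlt2_DiMuon"}, "tracks": [4], "raw_banks": "..."},
]
out = spruce(events, selected_lines={"Hlt2_BToKpi"}, keep_fields=["tracks"])
print(out)  # [{'tracks': [1, 2, 3]}]
```

The point of running this centrally rather than per-analysis is that the expensive pass over the full online output happens once, and analysts start from the much smaller spruced sample.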

2022-01-13
14:29
Scintillating sampling ECAL technology for the Upgrade II of LHCb
 Reference: Poster-2022-1062 Created: 2022. -1 p Creator(s): Betti, Federico The aim of the LHCb Upgrade II is to operate at a luminosity in the range of 1–2 × 10³⁴ cm⁻² s⁻¹ to collect a data set of 300 fb⁻¹. This will require a substantial modification of the current LHCb ECAL due to high radiation doses in the central region and increased particle densities. The ECAL has to provide good energy and position resolutions in these conditions. Timing capabilities with tens of picoseconds precision for neutral electromagnetic particles and increased granularity with dense absorber in the central region are needed for pile-up mitigation. Several scintillating sampling ECAL technologies are currently being investigated for this purpose: Spaghetti Calorimeter (SpaCal) with garnet scintillating crystals and tungsten absorber, SpaCal with scintillating plastic fibres and tungsten or lead absorber, and Shashlik with polystyrene tiles, lead absorber and fast WLS fibres. Results from an ongoing R&D campaign to optimise the Upgrade II ECAL are shown. This includes studies of radiation-hard scintillation materials, performance optimisation using detailed simulations, and test beam measurements. The presentation also includes an overview of the overall plans for the Upgrade II of the LHCb ECAL. © CERN Geneva

2021-12-10
14:14
New Web Based Event Data and Geometry Visualization for LHCb
 Reference: Poster-2021-1061 Created: 2021. -1 p Creator(s): Pappas, Andreas The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC's Run 3, which is scheduled to begin in 2022. The new Run 3 detector has a different, upgraded geometry and uses new tools for its description, namely DD4hep and ROOT. Moreover, visualization technologies have evolved considerably since Run 1, with the introduction of ubiquitous web-based solutions and Augmented Reality (AR), for example. The LHCb collaboration has thus started the development of a new visualization solution, based on the Phoenix framework, developed jointly by several experiments in the context of the HEP Software Foundation (HSF). We present here the architecture and implementation of this new solution, as well as the different contributions made to the Phoenix ecosystem. In particular, we discuss a generic tool for exporting ROOT geometries to the visualization application, which can be used to display either the whole detector or sub-parts of it in a browser. Extensions t © CERN Geneva

2021-12-10
14:11
Deep Learning Particle Identification in LHCb RICH
 Reference: Poster-2021-1060 Created: 2021. -1 p Creator(s): Blago, Michele Piero The use of Ring Imaging Cherenkov (RICH) detectors offers a powerful technique for identifying the particle species in particle physics. These detectors produce 2D images formed by rings of individual photons superimposed on a background of photon rings from other particles. The RICH particle identification (PID) is essential to the LHCb experiment at CERN. While the current PID algorithm has performed well during the LHC data-taking periods between 2010 and 2018, its complexity poses a challenge for LHCb computing infrastructure upgrades towards multi-core architectures. The high particle multiplicity environment of future LHC runs strongly motivates shifting towards high-throughput computing for the online event reconstruction. In this contribution, we introduce a convolutional neural network (CNN) approach to particle identification in the LHCb RICH. The CNN takes binary input images from the two RICH detectors to classify particle species. The input images are polar-transformed sub-sections of the RICH photon-detection planes. The model is hyperparameter-optimised and trained on classification accuracy with simulated collision data for the upcoming LHC operation starting in 2022. The PID performance of the CNN is comparable to the conventional algorithm, and its simplicity renders it suitable for fast online reconstruction through parallel processing. We show that under conditions of reduced combinatorial background, as expected from the introduction of timing resolution to the RICH detectors in future upgrades, the network achieves a particle identification performance close to 100%, with simultaneous misclassification of the most prevalent particle species approaching 0%. © CERN Geneva
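The polar-transformed binary input described above can be illustrated with a short sketch: photon hits in (x, y) on a detection plane are binned in (radius, angle) around an assumed ring centre. The binning choices and the perfect ring below are invented for illustration and do not reflect the real LHCb preprocessing:

```python
import math

# Minimal sketch of building a binary, polar-binned image from photon hits
# on a detection plane. Bin counts and the ring are hypothetical.

def polar_binary_image(hits, center, n_r=16, n_phi=32, r_max=100.0):
    """Map (x, y) hits to a binary image binned in (radius, angle)."""
    image = [[0] * n_phi for _ in range(n_r)]
    cx, cy = center
    for x, y in hits:
        r = math.hypot(x - cx, y - cy)
        phi = math.atan2(y - cy, x - cx) % (2 * math.pi)
        if r >= r_max:
            continue  # hit falls outside the imaged sub-section
        i = int(r / r_max * n_r)
        j = int(phi / (2 * math.pi) * n_phi)
        image[i][j] = 1  # binary occupancy, as fed to the CNN
    return image

# A perfect Cherenkov ring of radius 55 populates a single radial bin:
# int(55 / 100 * 16) = 8.
ring = [(55 * math.cos(t), 55 * math.sin(t))
        for t in (k * 2 * math.pi / 64 for k in range(64))]
img = polar_binary_image(ring, center=(0.0, 0.0))
assert all(v == 1 for v in img[8])
```

In this representation a ring becomes a straight horizontal band, which is exactly the kind of translation-friendly pattern a CNN handles well.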

2021-12-10
14:08
Generative models uncertainty estimation
 Reference: Poster-2021-1059 Created: 2021. -1 p Creator(s): Kazeev, Nikita In recent years, fully parametric fast-simulation methods based on generative models have been proposed for a variety of high-energy physics detectors. By their nature, the quality of data-driven models degrades in the regions of the phase space where the data are sparse. Since machine-learning models are hard to analyze from first physical principles, the commonly used testing procedures are performed in a data-driven way and cannot be used reliably in such regions. In our talk we propose three methods to estimate the uncertainty of generative models inside and outside of the training phase-space region, along with data-driven calibration techniques. Tests of the proposed methods on the LHCb RICH fast simulation are also presented. © CERN Geneva
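The abstract does not name its three methods, but one standard, generic way to expose how model uncertainty grows outside the training region, shown here purely as an illustration, is an ensemble: refit the model on bootstrap resamples and take the spread of predictions as the uncertainty:

```python
import random
import statistics

# Hypothetical illustration (not one of the poster's named methods): a
# bootstrap ensemble whose prediction spread serves as an uncertainty
# estimate, growing in sparsely populated regions of phase space.

def fit_line(xs, ys):
    """Closed-form least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def ensemble_uncertainty(xs, ys, x_query, n_models=50, seed=0):
    """Standard deviation of bootstrap-ensemble predictions at x_query."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in xs]  # bootstrap resample
        a, b = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a * x_query + b)
    return statistics.stdev(preds)

rng = random.Random(1)
xs = [rng.uniform(0.0, 1.0) for _ in range(200)]       # training region
ys = [2.0 * x + rng.gauss(0.0, 0.1) for x in xs]
inside = ensemble_uncertainty(xs, ys, 0.5)             # inside training data
outside = ensemble_uncertainty(xs, ys, 5.0)            # far extrapolation
assert inside < outside  # uncertainty grows away from the training data
```

The same idea carries over to generative models by ensembling the generator and measuring the spread of the generated distributions rather than of point predictions.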

2021-12-10
14:05
New software technologies in the LHCb Simulation
 Reference: Poster-2021-1058 Created: 2021. -1 p Creator(s): Mazurek, Michal The LHCb Experiment at the Large Hadron Collider (LHC) at CERN has successfully performed a large number of physics measurements during Runs 1 and 2 of the LHC. It will resume operation in Run 3 with an upgraded detector to process events at up to five times higher luminosity. Monte Carlo simulations are key to the commissioning of the new detector and the interpretation of past and future measurements. To cope with the number of simulated samples required for the future LHCb physics programme, new simulation software technologies have to be introduced to produce them within the computing resources allocated for the next few years. The LHCb collaboration is therefore preparing a new version of its GAUSS simulation framework. The new version provides the LHCb-specific functionality, while its generic simulation infrastructure has been encapsulated in an experiment-independent framework, GAUSSINO. The latter combines the GAUDI core software framework and the GEANT simulation toolkit and fully exploits their multi-threading capabilities. A prototype of a fast-simulation interface to the simulation toolkit is being developed as the latest addition to GAUSSINO, to provide an extensive palette of fast-simulation models, including new deep-learning-based options. © CERN Geneva

2021-11-26
14:53
CERN - FCC exhibition
 Reference: Poster-2021-1057 Keywords: FCC, Exhibition Created: 2021. -36 p A key recommendation of the 2020 update to the European Strategy for Particle Physics is that Europe, in collaboration with the worldwide community, should undertake a feasibility study for a next-generation hadron collider. As a result, the Future Circular Collider (FCC) Feasibility Study is committed to investigating the technical and financial viability of such a facility at CERN. This exhibition presents information about this project and its importance for CERN and for the future of particle physics. © CERN Geneva

2021-11-26
10:28
NEW GENERATION OFFLINE SOFTWARE FOR THE LHCb UPGRADE I
 Reference: Poster-2021-1056 Created: 2021. -1 p Creator(s): Ferrillo, Martina The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC's Run 3, which is scheduled to begin in 2022. The increased data rate in Run 3 poses significant data-processing and data-handling challenges for the LHCb experiment. The offline computing and dataflow model is consequently also being upgraded to cope with the factor-30 increase in data volume and the associated demands of ever-larger user-data samples. Coordinating these efforts is the charge of the newly created Data Processing and Analysis (DPA) project. The DPA project is responsible for ensuring that the LHCb experiment can efficiently exploit the Run 3 data: it deals with the data from the online system through central skimming/slimming (a process known as "Sprucing") and subsequently produces analyst-level ntuples with a centrally managed production system (known as "Analysis Productions"), utilising improved analysis tools and infrastructure for continuous integration and validation. It is a multi-disciplinary project involving collaboration between computing experts, trigger experts and physics analysis experts. This talk will present the evolution of the data processing model, followed by a review of the various activities of the DPA project. The associated computing, storage and network requirements are also discussed. Related links: 11th LHC students poster session © CERN Geneva

2021-11-26
10:25
Quantum Machine Learning at LHCb
 Reference: Poster-2021-1055 Created: 2021. -1 p Creator(s): Nicotra, Davide At the LHCb experiment, identifying jets produced by $b$ and $\bar{b}$ quarks (b-jet charge tagging) is essential for several physics studies, e.g. the measurement of the $b$-$\bar{b}$ production asymmetry, which could be sensitive to New Physics channels. Since this is a classification problem, Machine Learning techniques such as Deep Neural Networks have been used to solve it. In this work, we present a new approach to b-jet charge tagging based on Quantum Machine Learning techniques, trained on LHCb simulated data. Performance comparisons with classical algorithms are also presented. Related links: 11th LHC students poster session © CERN Geneva
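The abstract does not describe the circuit used, but the general idea behind variational quantum classifiers can be shown with a toy one-qubit example, simulated classically: encode a feature as a rotation, apply a trainable rotation, and classify by the sign of the ⟨Z⟩ expectation value. Everything below (data, circuit, loss) is invented for illustration:

```python
import math

# Toy variational "quantum" classifier on one simulated qubit.
# RY(x) then RY(theta) applied to |0> gives <Z> = cos(x + theta);
# the sign of <Z> is the predicted class. Purely illustrative.

def expectation_z(x, theta):
    """<Z> after RY(x) followed by RY(theta) acting on |0>."""
    return math.cos(x + theta)

def predict(x, theta):
    return 1 if expectation_z(x, theta) >= 0 else -1

def train(data, steps=200, lr=0.1):
    """Gradient descent on squared error over (feature, label) pairs."""
    theta = 0.0
    for _ in range(steps):
        grad = 0.0
        for x, label in data:
            err = expectation_z(x, theta) - label
            grad += 2 * err * -math.sin(x + theta)  # d<Z>/dtheta
        theta -= lr * grad / len(data)
    return theta

# Separable toy data: x near 0 -> class +1, x near pi -> class -1.
data = [(0.1, 1), (0.3, 1), (2.9, -1), (3.0, -1)]
theta = train(data)
assert all(predict(x, theta) == label for x, label in data)
```

Real quantum machine learning models use multi-qubit parametrised circuits evaluated on simulators or hardware, but the train-a-rotation-and-measure structure is the same.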

2021-09-24
14:15
Investigation of Radiation-Induced Effects in a Front-end ASIC Designed for Photon Counting Sensor Systems
 Reference: Poster-2021-1054 Created: 2021. -1 p Creator(s): Placinta, Vlad-Mihai This work outlines the measurements performed to evaluate the second SPACIROC generation in ionizing-radiation environments, i.e., particle beams of ions, protons, and X-rays. The SPACIROCs are front-end ASICs designed for the readout requirements of photomultiplier technologies such as SiPMs and MaPMTs. Several radiation-induced effects were observed, but they proved benign for the application. The threshold LET for SEUs was measured, and cross-sections at two different LETs are provided. At extremely high dose rates (~100 rad/s) and TID above 50 krad, proton- and X-ray-induced TID effects were observed; however, room-temperature annealing was found to mitigate the harmful TID effects within 24 hours. © CERN Geneva
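The SEU cross-section quoted in such test campaigns is simply the observed upset count divided by the particle fluence, σ = N_SEU / Φ. The numbers below are made up for illustration and are not results from this work:

```python
# SEU cross-section: upsets per unit fluence, sigma = N_SEU / Phi.
# Example figures are hypothetical, not measurements from this poster.

def seu_cross_section(n_upsets, fluence_per_cm2):
    """Cross-section in cm^2 per device: upsets / (particles per cm^2)."""
    return n_upsets / fluence_per_cm2

# e.g. 120 upsets after a fluence of 1e7 ions/cm^2:
sigma = seu_cross_section(120, 1e7)
assert sigma == 1.2e-5  # cm^2
```

Measuring this at several LETs, as done here, maps out how the cross-section saturates above the upset threshold.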