PHYSICS PERFORMANCE AND DATASET (PPD)

The first part of the Long Shutdown period has been dedicated to preparing the samples for the analyses targeting the summer conferences. In particular, the 8 TeV data acquired in 2012, including most of the “parked datasets”, have been reconstructed, benefiting from improved alignment and calibration conditions for all the sub-detectors.

Careful planning of the resources was essential in order to deliver the datasets to the analysts in good time, and to schedule the update of all the conditions and calibrations needed at the analysis level.

The newly reprocessed data have undergone detailed scrutiny by the Dataset Certification team, allowing the recovery of some of the data for analysis use and further improving the certification efficiency, which now stands at 91% of the recorded luminosity.
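The certification efficiency quoted here is the luminosity-weighted fraction of the recorded luminosity that is certified as good for analysis. A minimal sketch of the arithmetic, using an assumed recorded luminosity chosen for illustration (only the 91% figure comes from the text):

```python
# Luminosity-weighted certification efficiency: certified / recorded luminosity.
# The recorded value below is an illustrative assumption, not an official figure.
certified_fb = 19.79   # certified integrated luminosity, fb^-1 (from the text)
recorded_fb = 21.75    # assumed recorded integrated luminosity, fb^-1

efficiency = certified_fb / recorded_fb
print(f"Certification efficiency: {efficiency:.0%}")  # ~91%
```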

With the aim of delivering a consistent dataset for 2011 and 2012, both in terms of conditions and of release (53X), the PPD team is now working to set up a data re-reconstruction and a new MC production for the 7 TeV dataset as well. This will serve as the legacy for all future analyses of the first LHC run.

Looking further ahead, PPD is contributing to the preparation of post-LS1 data taking and supports the various productions needed to define the upgrade strategy of the experiment. In this context, the Global Event Description team is following up the work of the POGs to prepare the reconstruction and identification algorithms for the future challenges posed by the LHC running conditions. A workshop focusing on these aspects, both in terms of development and of validation, is being organised with all the relevant experts on 23 and 24 July at FNAL.

The PPD core team is also using the shutdown period to develop and consolidate the central tools for conditions and monitoring. This will allow us to capitalise on the experience acquired, in view of the restart of data taking in 2015.

Alignment and Calibration and Database (AlCaDB)

Since the stop of data taking in early 2013, work in the AlCaDB project has moved to a consolidation phase. On the AlCa side, the efforts have concentrated mainly on providing and validating the new Global Tags needed for upgrade studies and reprocessing campaigns. Improvements to the Global Tag Collector tool used to manage these have started and are ongoing. On the Database side, the major redesign of the core conditions software has started and is progressing well.

Data Quality Monitoring (DQM)/Data Certification

The team has completed the certification of the 2012 data, which were reprocessed with improved alignment and calibration conditions. The official JSON files were released in May, and they correspond to a total integrated luminosity of 19.79 fb–1 in the “golden” scenario, in which the data are certified as usable for analysis by all the detectors and POGs. This total includes ~170 pb–1 of data that were initially unusable for physics analysis due to issues with the Preshower (ES) and Hadron Calorimeter (HCAL) detectors, but were recovered after the reprocessing. The final luminosity-weighted certification efficiency with respect to the recorded luminosity is 91% for the 2012 pp data.
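The certification JSON files list, for each run, the luminosity-section ranges certified as good for analysis. A minimal sketch of how such a file can be consumed, assuming the standard run-number-to-ranges layout; the run numbers and ranges shown are illustrative placeholders, not real certified data:

```python
import json

# Illustrative certification JSON: run number -> list of certified
# [first, last] luminosity-section ranges (placeholder values).
golden_json = json.loads("""
{
  "190456": [[1, 87], [90, 113]],
  "190462": [[1, 24]]
}
""")

def is_certified(run, lumi_section, cert):
    """Return True if (run, lumi_section) falls in a certified range."""
    for first, last in cert.get(str(run), []):
        if first <= lumi_section <= last:
            return True
    return False

print(is_certified(190456, 95, golden_json))  # True: inside [90, 113]
print(is_certified(190456, 88, golden_json))  # False: in the gap between ranges
```

An analysis job would apply such a selection event by event, keeping only events whose (run, luminosity section) pair is covered by the golden ranges.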

With the data certification complete, the focus has shifted towards improved performance and automation of the DQM system for 2015 data taking. A number of improvements are planned, such as multi-core and multi-threaded approaches to data reconstruction and monitoring, multi-run DQM, a new file-based online DQM approach, and a significantly updated data-certification procedure to be tested during the 2013 Global Run.

Physics Data Monte-Carlo Validation (PdmV)

For the first time in CMS, a campaign of era-dependent simulation has been validated and put in production for the 8 TeV MC targeted at Higgs analyses, to reproduce as closely as possible the data-taking conditions during 2012. A first, partial reprocessing of the data collected in 2011 at 7 TeV with the 53X legacy release has been initiated, after validation of the updates to the alignment and calibration conditions. The validation for the reprocessing of a corresponding subset of the 7 TeV MC is under way. For both data and MC, the ultimate legacy dataset will be produced in the summer, when the final alignment and calibration conditions and the HLT simulation will be available.

The validation of the upgrade software and conditions has been successfully performed using the standard validation procedures, following the bi-weekly release schedule of the upgrade project. The validation of the software for the Phase 1 upgrade will be extended to the validation of the simulation of the detector under various ageing and beam conditions. Validation and production of MC samples were carried out under high pressure and a tight schedule for the June Upgrade Week, with fruitful results.

The McM project, initiated by the generator group and PdmV as PREP2 during 2012, now provides a functioning replacement for the MC production management and book-keeping system (PREP). A series of tutorials has been given, and commissioning with pilot campaigns has been carried out. McM will be put into production as soon as possible.


by L. Silvestris, P. Azzi, C. Cerminara, with contributions from R. Castello, M. De Mattia, A. Pfeiffer, F. De Guio, D. Duggan, M. Rovere, G. Boudoul, G. Franzoni, J.-R. Vlimant. Edited by K. Aspola.