Particle Physics

High Luminosity LHC

How do ESCAPE services support the High Luminosity Large Hadron Collider scientific programme?

Geneva, Switzerland

The High-Luminosity LHC (HL-LHC) is a major upgrade of the Large Hadron Collider (LHC). The LHC collides tiny particles of matter (protons) at an energy of 13 TeV in order to study the fundamental components of matter and the forces that bind them together. The High-Luminosity LHC will make it possible to study these in more detail by increasing the number of collisions by a factor of between five and seven.

HL-LHC Scientific Challenges

CERN's main focus is particle physics – the study of the fundamental constituents of matter – but the physics programme at the laboratory is much broader, ranging from nuclear to high-energy physics, from studies of antimatter to the possible effects of cosmic rays on clouds.

Since the 1970s, particle physicists have described the fundamental structure of matter using an elegant series of equations called the Standard Model. The model describes how everything that they observe in the universe is made from a few basic building blocks called fundamental particles, governed by four forces. Physicists at CERN use the world's most powerful particle accelerators and detectors to test the predictions and limits of the Standard Model. Over the years it has explained many experimental results and precisely predicted a range of phenomena, such that today it is considered a well-tested physics theory.

But the model describes only about 4% of the known universe, and questions remain. Will we see a unification of forces at the high energies of the Large Hadron Collider (LHC)? Why is gravity so weak? Why is there more matter than antimatter in the universe? Is there more exotic physics waiting to be discovered at higher energies? Will we discover evidence for a theory called supersymmetry at the LHC? Or understand the Higgs boson that gives particles mass?

 

ATLAS Experiment © 2022 CERN

ESCAPE Impact on HL-LHC

ESCAPE's ideas and technologies mark a clear step forward in fostering and nurturing the capabilities and potential of Open Science and Open Data in the scientific and outreach communities. Together with the prototyped Analysis Frameworks, ESCAPE is pioneering a new dimension in the way scientific results are achieved and shared.

Bringing together the experiments' big data with heterogeneous computing resources through content delivery and caching is one of the objectives to rationalise and improve the way disk storage is handled, and the ESCAPE DIOS and ESCAPE Data Management technologies provide the necessary abstraction to achieve it. Another aspect is connecting the large user community with the experiments' data repositories through integrated environments, where users find the necessary toolset to perform their analyses. Integrating a Scientific Data Lake with web-based notebooks, through simple mechanisms, irons out the inherent complexities of Distributed Computing systems.
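
To make this concrete, here is a minimal sketch of how a notebook user might locate and download a Data Lake dataset through the Rucio data management system that underpins the ESCAPE Data Lake prototype (Rucio is not named in this article). The scope and dataset names are hypothetical placeholders, and the snippet assumes a Rucio client already configured against the ESCAPE instance.

    # Minimal sketch: finding and fetching Data Lake files from a notebook.
    # Assumes a Rucio client configured for the ESCAPE Data Lake instance;
    # the scope and dataset name below are hypothetical placeholders.
    from rucio.client import Client
    from rucio.client.downloadclient import DownloadClient

    client = Client()

    # List the files attached to a dataset registered in the Data Lake.
    for f in client.list_files(scope='escape_demo', name='analysis.inputs.v1'):
        print(f"{f['scope']}:{f['name']}  {f['bytes']} bytes")

    # Download the whole dataset into the notebook workspace; Rucio resolves
    # the closest available replica transparently.
    DownloadClient().download_dids([{'did': 'escape_demo:analysis.inputs.v1',
                                     'base_dir': './data'}])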

ESCAPE allows HL-LHC to demonstrate a new data management model in collaboration with other astronomy, astroparticle and high-energy physics experiments.

Both HL-LHC experiments present in ESCAPE, ATLAS and CMS, achieved relevant results in their main fields of interest. A proof of concept to deliver data to HPC resources was carried out by CMS, validating the model by bridging the Data Lake with the CINECA HPC centre via a storage data-caching layer. A first evaluation of the Data Lake's capability to manage "embargoed data" for the CMS experiment was also completed successfully, allowing access to a subset of data only to specific users; the authorisation workflows were tested for both grid certificates and token-based authentication protocols. ATLAS explored and extended the ways users can interact with Open Access datasets. Downloading, uploading and bookkeeping of analysis files in the ESCAPE Data Lake were exercised by simulating a data-augmentation process (from a few hundred GB of data to single-digit TB). The onboarding of the ATLAS Analysis Object Data C++ analysis framework in the ESCAPE OSSR was completed. This step was essential for a fully reproducible analysis, including the datasets, the specific software, and the required pipelines and workflows.
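
The token-based access pattern tested for embargoed data can be illustrated with a short, hedged sketch: an authorised user presents a bearer token when reading a protected file over HTTPS, and the storage endpoint grants or refuses access based on the entitlements carried by that token. The endpoint URL and token source below are hypothetical placeholders, not the actual CMS configuration.

    import os
    import requests

    # Hypothetical embargoed-data endpoint; the real CMS/ESCAPE URLs differ.
    EMBARGOED_FILE = "https://datalake.example.org/store/embargoed/2023/file.root"

    # An access token obtained beforehand from the user's identity provider;
    # only tokens carrying the right group or entitlement are authorised
    # to read embargoed data.
    token = os.environ["ESCAPE_ACCESS_TOKEN"]

    response = requests.get(EMBARGOED_FILE,
                            headers={"Authorization": f"Bearer {token}"},
                            stream=True)

    if response.status_code == 200:
        with open("file.root", "wb") as out:
            for chunk in response.iter_content(chunk_size=1 << 20):
                out.write(chunk)
    else:
        # 401/403 indicates the token lacks the required entitlement.
        print(f"Access refused: HTTP {response.status_code}")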

So how is ESCAPE prototyping new data management services for the HL-LHC?

ESCAPE DIOS is building and deploying a large-scale prototype of the Data Lake, an Exascale data infrastructure. The Data Lake is a policy-driven, reliable, distributed data infrastructure capable of managing Exabyte-scale data sets and of delivering data on demand, at low latency, to all types of processing facilities. All of the data organisation, management and access services are being developed and tested within ESCAPE DIOS in collaboration with other key science projects. Furthermore, ESCAPE DIOS will be one of the key deliverables demonstrating FAIR data management and access in ESCAPE.
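
One way to picture the policy-driven aspect is through replication rules, the mechanism Rucio uses to decide how many copies of a dataset live on which class of storage. The sketch below is illustrative only: the scope, dataset name and storage-element expression are hypothetical, and the lifetimes and quality-of-service labels used in the actual prototype may differ.

    from rucio.client import Client

    client = Client()

    # Declare a policy: keep two replicas of the dataset on storage elements
    # matching a given quality-of-service tag, for 30 days. Rucio then creates
    # and maintains the necessary transfers automatically.
    client.add_replication_rule(
        dids=[{'scope': 'escape_demo', 'name': 'analysis.inputs.v1'}],
        copies=2,
        rse_expression='QOS=CHEAP-ANALYSIS',   # hypothetical QoS label
        lifetime=30 * 24 * 3600,               # rule lifetime in seconds
    )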

The ESCAPE ESAP focuses on the deployment of some key use cases and the supporting analysis tools, while ESCAPE OSSR provides HL-LHC with a repository for all the software and services it needs, also making them available to the broader EOSC community.

The ESCAPE CS can also support the LHC@Home Citizen Science project by attracting new volunteer citizen scientists to contribute to LHC data processing, in the search for new fundamental particles and answers to questions about the Universe.

HL-LHC expectations towards ESCAPE

Continuous collaboration and cooperation among ESFRIs from different science domains ensures a coherent growth of Data Management and Distributed Computing tools and frameworks. This coherence and cross-fertilisation needs to be pursued and cemented in the next phase of ESCAPE, as a Collaboration Agreement among the leading ESFRIs covering the different Physics Sciences domains.


LEARN MORE ABOUT HOW ESCAPE IS SUPPORTING THE OTHER RESEARCH INFRASTRUCTURES