Evolution of the ATLAS event data model for the HL-LHC
2025 · Open Access
DOI: https://doi.org/10.1051/epjconf/202533701118
The upcoming high-luminosity run of the CERN Large Hadron Collider (HL-LHC) will yield an unprecedented volume of data. To process this data, the ATLAS collaboration is evolving its offline software to make use of heterogeneous resources such as graphical processing units (GPUs) and field-programmable gate arrays (FPGAs). To reduce conversion overheads, the event data model (EDM) must be compatible with the requirements of these resources. While the ATLAS EDM has long allowed representing data as a structure of arrays, further evolution of the EDM can enable more efficient sharing of data between CPU and GPU resources. Some of this work is summarized here, including extensions that allow controlling how memory for event data is allocated and the implementation of jagged vectors.
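To make the two ideas mentioned above concrete, the following is a minimal, illustrative sketch (not ATLAS code): a jagged vector stored as one flat, contiguous payload plus an offset array, so that variable-length per-object lists can be moved to a GPU in two bulk transfers, with allocation controlled through standard C++ std::pmr memory resources as a stand-in for an EDM-chosen (e.g. pinned or device-visible) allocator. The names JaggedVector and cellEnergies are hypothetical and chosen only for illustration.

// Sketch of a jagged vector with pluggable allocation (C++20, std::pmr).
// Not the ATLAS implementation; an assumption-laden illustration only.
#include <array>
#include <cstddef>
#include <iostream>
#include <memory_resource>
#include <span>
#include <vector>

template <typename T>
class JaggedVector {
public:
  explicit JaggedVector(std::pmr::memory_resource* mr =
                            std::pmr::get_default_resource())
      : m_values(mr), m_offsets(mr) {
    m_offsets.push_back(0);  // offsets[i]..offsets[i+1] delimit row i
  }

  // Append one inner row (e.g. the clusters belonging to one jet).
  void push_back(std::span<const T> row) {
    m_values.insert(m_values.end(), row.begin(), row.end());
    m_offsets.push_back(m_values.size());
  }

  std::size_t size() const { return m_offsets.size() - 1; }

  // Non-owning view of row i, without copying.
  std::span<const T> operator[](std::size_t i) const {
    return {m_values.data() + m_offsets[i], m_offsets[i + 1] - m_offsets[i]};
  }

  // The two contiguous buffers that would be handed to a GPU transfer.
  std::span<const T> values() const { return m_values; }
  std::span<const std::size_t> offsets() const { return m_offsets; }

private:
  std::pmr::vector<T> m_values;             // flat payload
  std::pmr::vector<std::size_t> m_offsets;  // row boundaries
};

int main() {
  // Stand-in for a special-purpose memory resource (e.g. pinned host memory).
  std::pmr::monotonic_buffer_resource arena{1 << 20};

  JaggedVector<float> cellEnergies{&arena};
  cellEnergies.push_back(std::array{1.0f, 2.5f});
  cellEnergies.push_back(std::array{0.3f, 4.2f, 7.1f});

  for (std::size_t i = 0; i < cellEnergies.size(); ++i) {
    std::cout << "row " << i << ":";
    for (float e : cellEnergies[i]) std::cout << ' ' << e;
    std::cout << '\n';
  }
}

The design choice illustrated here is the usual structure-of-arrays trade-off: row access goes through an extra indirection, but the payload stays contiguous and the allocation policy is decoupled from the container, which is what allows the same event data to be placed in memory that both CPU and GPU code can reach efficiently.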