O. Gutsche
Application of performance portability solutions for GPUs and many-core CPUs to track reconstruction kernels
Next generation High-Energy Physics (HEP) experiments are presented with significant computational challenges, both in terms of data volume and processing power. Using compute accelerators, such as GPUs, is one of the promising ways to pro…
A Ceph S3 Object Data Store for HEP
In CMS, data access and management are organized around the data tier model: a static definition of what subset of event information is available in a particular dataset, realized as a collection of files. We present a novel data management…
The U.S. CMS HL-LHC R&D Strategic Plan
The HL-LHC run is anticipated to start at the end of this decade and will pose a significant challenge for the scale of the HEP software and computing infrastructure. The mission of the U.S. CMS Software & Computing Operations Program is t…
Automated Network Services for Exascale Data Movement
The Large Hadron Collider (LHC) experiments distribute data by leveraging a diverse array of National Research and Education Networks (NRENs), where experiment data management systems treat networks as a “blackbox” resource. After the High…
A Ceph S3 Object Data Store for HEP
We present a novel data format design that obviates the need for data tiers by storing individual event data products in column objects. The objects are stored and retrieved through Ceph S3 technology, with a layout designed to minimize me…
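As a loose sketch of the idea described above (individual event data products stored and retrieved as column objects behind an S3 interface), the following Python snippet uses boto3 against a generic S3-compatible endpoint. The endpoint URL, bucket name, object-key layout, and numpy serialization are assumptions for illustration only, not the layout presented in the paper.

```python
# Illustrative sketch only: one event-data column stored as a single S3 object.
import io
import numpy as np
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ceph-gw.example.org",   # hypothetical Ceph RADOS Gateway
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

BUCKET = "cms-columns"  # hypothetical bucket name

def put_column(dataset: str, column: str, values: np.ndarray) -> None:
    """Store one column of event data products as a single object."""
    buf = io.BytesIO()
    np.save(buf, values)
    s3.put_object(Bucket=BUCKET, Key=f"{dataset}/{column}.npy", Body=buf.getvalue())

def get_column(dataset: str, column: str) -> np.ndarray:
    """Retrieve only the column an analysis needs, nothing else."""
    obj = s3.get_object(Bucket=BUCKET, Key=f"{dataset}/{column}.npy")
    return np.load(io.BytesIO(obj["Body"].read()))

# Example: store and read back one hypothetical column of muon transverse momenta.
put_column("run2024/dimuon", "Muon_pt", np.array([25.3, 41.7, 13.2]))
pts = get_column("run2024/dimuon", "Muon_pt")
```

The point of the sketch is that each column is an independently addressable object, so an analysis can fetch exactly the products it needs without opening a tier-defined file.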
Evaluating Portable Parallelization Strategies for Heterogeneous Architectures in High Energy Physics
High-energy physics (HEP) experiments have developed millions of lines of code over decades that are optimized to run on traditional x86 CPU systems. However, we are seeing a rapidly increasing fraction of floating point computing power in…
Detector R&D needs for the next generation $e^+e^-$ collider
The 2021 Snowmass Energy Frontier panel wrote in its final report "The realization of a Higgs factory will require an immediate, vigorous and targeted detector R&D program". Both linear and circular $e^+e^-$ collider efforts have developed…
Object Stores for CMS data
Excerpt on the CMS data tier model: a tier defines what kind of events are included (e.g. hard-scatter process for simulation, trigger filter for data) and which data columns are stored, such as AOD (data columns pertaining to low-level reconstruction) and MiniAOD (calibrated physics objects, particle-flow candidates, …).
The U.S. CMS HL-LHC R&D Strategic Plan
sensors, and 130 μm for 3D sensors, the thinnest ones ever produced so far. The first prototypes of hybrid modules, bump-bonded to the present CMS readout chip, have been tested on beam. The first results on their performance before and af…
The Future of High Energy Physics Software and Computing (V2.1)
Software and Computing (S&C) are essential to all High Energy Physics (HEP) experiments and many theoretical studies. The size and complexity of S&C are now commensurate with that of experimental instruments, playing a critical role in exp…
Object Stores for CMS data [Poster]
in progress, each scanning at rates exceeding 10 megabytes per second, all of which are sharing access to a very large persistent address space distributed across multiple disks on multiple hosts. Specifically, POPM employs the following t…
The Future of High Energy Physics Software and Computing
Software and Computing (S&C) are essential to all High Energy Physics (HEP) experiments and many theoretical studies. The size and complexity of S&C are now commensurate with that of experimental instruments, playing a critical role in exp…
Development of the trigger algorithm for the MONOLITH experiment
The MONOLITH project is proposed to confirm atmospheric neutrino oscillations and to improve on the corresponding Super-Kamiokande measurements. The MONOLITH detector consists of a massive (34 kt) magnetized iron tracking calorimeter and is …
Portability: A Necessary Approach for Future Scientific Software
Today's world of scientific software for High Energy Physics (HEP) is powered by x86 code, while the future will be much more reliant on accelerators like GPUs and FPGAs. The portable parallelization strategies (PPS) project of the High En…
HEP Software Foundation Community White Paper Working Group -- Data Organization, Management and Access (DOMA)
Without significant changes to data organization, management, and access (DOMA), HEP experiments will find scientific output limited by how fast data can be accessed and digested by computational resources. In this white paper we discuss c…
Coffea -- Columnar Object Framework For Effective Analysis
The coffea framework provides a new approach to High-Energy Physics analysis, via columnar operations, that improves time-to-insight, scalability, portability, and reproducibility of analysis. It is implemented with the Python programming …
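As a rough illustration of the columnar style the abstract refers to, the sketch below uses the awkward-array library that coffea builds on (this is not coffea's own processor API); the branch names and cut values are hypothetical.

```python
# A toy event selection expressed as array operations rather than an explicit event loop.
import numpy as np
import awkward as ak

# Jagged per-event collections: each entry is the list of muons in one event.
muon_pt  = ak.Array([[25.3, 11.0], [], [47.9, 33.1, 8.2]])
muon_eta = ak.Array([[0.4, -1.7], [], [2.1, -0.3, 1.1]])

# One vectorized expression replaces nested loops over events and muons.
good = (muon_pt > 20.0) & (np.abs(muon_eta) < 2.4)

# Per-event count of selected muons, then an event-level selection.
n_good = ak.num(muon_pt[good])
selected_events = n_good >= 2

print(ak.to_list(n_good))           # [1, 0, 2]
print(ak.to_list(selected_events))  # [False, False, True]
```

The columnar formulation is what enables the time-to-insight and scalability gains the abstract claims: the selection is a handful of array expressions that vectorize over whole datasets.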
Striped Data Analysis Framework
A columnar data representation is known to be an efficient way to store data, especially when an analysis typically uses only a small fragment of the available data structures. A data representation like Apache Parqu…
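To make that point concrete, here is a small sketch with Apache Parquet via pyarrow: only the columns an analysis asks for are read back from disk. The file and column names are hypothetical, and this is not the Striped framework's own storage layer.

```python
# With a columnar format such as Apache Parquet, an analysis that needs only a
# small fragment of the stored structure reads just those columns from disk.
import pyarrow as pa
import pyarrow.parquet as pq

# Write a toy event table with several columns.
table = pa.table({
    "event":  [1, 2, 3, 4],
    "met":    [34.2, 12.9, 88.1, 20.5],
    "njets":  [2, 0, 5, 1],
    "weight": [1.0, 0.98, 1.02, 1.0],
})
pq.write_table(table, "events.parquet")

# Read back only the two columns the analysis actually uses;
# the other columns are never deserialized.
subset = pq.read_table("events.parquet", columns=["event", "met"])
print(subset.column_names)  # ['event', 'met']
```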
Striped Data Analysis Framework
Traditionally, High Energy Physics data analysis is based on a model where data are stored in files and analyzed by running multiple analysis processes, each reading one or more of the data files. This process involves repeated data reduction st…
Response to NITRD, NCO, NSF Request for Information on "Update to the 2016 National Artificial Intelligence Research and Development Strategic Plan"
We present a response to the 2018 Request for Information (RFI) from the NITRD, NCO, NSF regarding the "Update to the 2016 National Artificial Intelligence Research and Development Strategic Plan." Through this document, we provide a respo…
COFFEA - Columnar Object Framework For Effective Analysis [Slides]
The COFFEA Framework provides a new approach to HEP analysis, via columnar operations, that improves time-to-insight, scalability, portability, and reproducibility of analysis. It is implemented with the Python programming language and com…
New Technologies for Discovery
For the field of high energy physics to continue to have a bright future, priority within the field must be given to investments in both evolutionary and transformational detector development that is coordinated across t…
The Case for Columnar Analysis (a Two-Part Series) [PowerPoint]
This talk provides prologue terminology and technology. Part I covers analyzer experience including user experience, code samples, domain of applicability, and scalability. Part II discusses technical underpinnings including theoretical mo…
Using Big Data Technologies for HEP Analysis
The HEP community is approaching an era where the excellent performance of the particle accelerators in delivering collisions at high rates will force the experiments to record a large amount of information. The growing size of the datasets …
HPC resource integration into CMS Computing via HEPCloud
The higher energy and luminosity from the LHC in Run 2 have put increased pressure on CMS computing resources. Extrapolating to even higher luminosities (and thus higher event complexities and trigger rates) beyond Run 3, it becomes clear …
CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
The high-luminosity program has seen numerous extrapolations of its needed computing resources that each indicate the need for substantial changes if the desired HL-LHC physics program is to be supported within the current level of computi…
HEP Software Foundation Community White Paper Working Group --- Visualization
In modern High Energy Physics (HEP) experiments, visualization of experimental data plays a key role in many activities and tasks across the whole data chain: from detector development to monitoring, from event generation to reconstruction of…