Kai Keller
Replicability in Earth System Models Open
Climate simulations with Earth System Models (ESMs) constitute the basis of our knowledge about projected climate change in the coming decades. They represent the major source of knowledge for the Intergovernmental Panel on Climate Ch…
The Destination Earth digital twin for climate change adaptation Open
The Climate Change Adaptation Digital Twin (Climate DT), developed as part of the European Commission’s Destination Earth (DestinE) initiative, is a pioneering effort to build an operational climate information system in support of adaptat…
Earth system model replicability - Statistical validation of a model's climate under a change of computing environment Open
The sixth assessment report (AR6) issued by the Intergovernmental Panel on Climate Change (IPCC) projects that 1-in-50-year heat waves become about 8 times more frequent, and 1-in-10-year extreme precipitation events become twice as fr…
Scientific developments of IFS-NEMO for Destination Earth’s Climate Adaptation Digital Twin Open
The Climate Adaptation Digital Twin within the Destination Earth project represents an innovative initiative aimed at achieving operational kilometer-scale global climate simulations to support climate adaptation efforts. Three state-of-th…
A Framework for Automatic Validation and Application of Lossy Data Compression in Ensemble Data Assimilation Open
Ensemble data assimilation techniques form an indispensable part of numerical weather prediction. As the ensemble size grows and model resolution increases, the amount of required storage becomes a major issue. Data compression schemes may…
The Backbone of the Destination Earth Climate Adaptation Digital Twin Open
Since the first concerns were raised in the 1980s that the climate may undergo catastrophic changes caused by increasing greenhouse gas emissions [1,2], a number of multilateral efforts have been launched to investigate the fu…
A Framework for Large Scale Particle Filters Validated with Data Assimilation for Weather Simulation Open
Particle filters are a family of algorithms that solve inverse problems through statistical Bayesian methods when the model does not satisfy the linearity and Gaussianity hypotheses. Particle filters are used in domains like data assimilation, …
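The mechanics the abstract refers to can be sketched with a minimal bootstrap particle filter. This is an illustrative toy (a 1D state with Gaussian noise; all names and parameters are assumptions), not the framework described in the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observation, obs_std, process_std):
    """One bootstrap particle filter cycle: propagate, weight, resample."""
    # Propagation: move each particle through a (here trivial) stochastic model.
    particles = particles + rng.normal(0.0, process_std, size=particles.shape)
    # Weighting: Gaussian likelihood of the observation given each particle.
    likelihood = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resampling: draw particles proportionally to their weights; this combats
    # weight degeneracy and needs no linear/Gaussian model assumption.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Toy usage: track a constant true state of 1.0 from noisy observations.
particles = rng.normal(0.0, 1.0, size=500)
weights = np.full(500, 1.0 / 500)
for obs in rng.normal(1.0, 0.2, size=30):
    particles, weights = particle_filter_step(particles, weights, obs, 0.2, 0.05)

print(round(particles.mean(), 1))
```

At scale, the resampling step is the communication bottleneck, which is what a large-scale particle filter framework has to address.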
Resilience for large ensemble computations Open
With the increasing power of supercomputers, ever more detailed models of physical systems can be simulated, and ever larger problem sizes can be tackled in any kind of numerical system. During the last twenty years, the performance of …
Towards Zero-Waste Recovery and Zero-Overhead Checkpointing in Ensemble Data Assimilation Open
Ensemble data assimilation is a powerful tool for increasing the accuracy of climatological states. It is based on combining observations with the results from numerical model simulations. The method comprises two steps: (1) the propagatio…
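The two-step cycle mentioned in the abstract can be sketched as follows, assuming a scalar state observed directly and a stochastic ensemble Kalman update; everything here (model, noise levels, function names) is an illustrative assumption, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)

def forecast(ensemble, model_std=0.1):
    """Step (1): propagate each ensemble member through the (here trivial) model."""
    return ensemble + rng.normal(0.0, model_std, size=ensemble.shape)

def analysis(ensemble, observation, obs_std=0.2):
    """Step (2): combine the forecast ensemble with the observation
    (stochastic ensemble Kalman update for a directly observed scalar state)."""
    forecast_var = ensemble.var(ddof=1)
    gain = forecast_var / (forecast_var + obs_std**2)  # Kalman gain
    perturbed_obs = observation + rng.normal(0.0, obs_std, size=ensemble.shape)
    return ensemble + gain * (perturbed_obs - ensemble)

# Toy usage: the ensemble is pulled toward a true state of 2.0.
ensemble = rng.normal(0.0, 1.0, size=100)
for obs in rng.normal(2.0, 0.2, size=20):
    ensemble = analysis(forecast(ensemble), obs)

print(round(ensemble.mean(), 1))
```

In an operational setting, step (1) is the expensive model integration whose intermediate state the checkpointing work in this line of research protects.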
Design and Study of Elastic Recovery in HPC Applications Open
The efficient utilization of current supercomputing systems with deep storage hierarchies demands scientific applications that are capable of leveraging such heterogeneous hardware. Fault tolerance, and checkpointing in particular, is one …
Checkpoint Restart Support for Heterogeneous HPC Applications Open
As we approach the era of exascale computing, fault tolerance is of growing importance. The increasing number of cores as well as the increased complexity of modern heterogeneous systems result in a substantial decrease of the expected mean …
Application-Level Differential Checkpointing for HPC Applications with Dynamic Datasets Open
High-performance computing (HPC) requires resilience techniques such as checkpointing in order to tolerate failures in supercomputers. As the number of nodes and the amount of memory in supercomputers keep increasing, the size of checkpoint data als…
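The core idea of differential checkpointing, writing only the data that changed since the last checkpoint, can be sketched with hash-based change detection per block. The block size, function names, and in-memory "write" are illustrative assumptions, not the implementation described in the article:

```python
import hashlib

BLOCK_SIZE = 64  # bytes per comparison unit (illustrative choice)

def differential_checkpoint(data: bytes, previous_hashes: dict):
    """Collect only the blocks whose content changed since the last checkpoint.

    Returns (blocks_to_write, new_hashes). A real checkpoint library would
    persist the changed blocks to stable storage; here we just collect them.
    """
    blocks_to_write = {}
    new_hashes = {}
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        new_hashes[offset] = digest
        if previous_hashes.get(offset) != digest:  # block is new or modified
            blocks_to_write[offset] = block
    return blocks_to_write, new_hashes

# First checkpoint: every block is "new" and must be written.
state = bytearray(b"\x00" * 256)
to_write, hashes = differential_checkpoint(bytes(state), {})
print(len(to_write))  # 4 blocks of 64 bytes

# Mutate a single byte; only the block containing it is re-written.
state[70] = 0xFF
to_write, hashes = differential_checkpoint(bytes(state), hashes)
print(len(to_write))  # 1
```

Dynamic datasets, which the article targets, add the complication that the set of protected regions itself grows and shrinks between checkpoints, so the hash table must track region lifetimes as well as contents.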
Towards Ad Hoc Recovery for Soft Errors Open
The coming exascale era is a great opportunity for high-performance computing (HPC) applications. However, high failure rates on these systems will jeopardize the successful completion of their execution. Bit-flip errors in dynamic random acce…