Romain Égelé
Automated Learning of GNN Ensembles for Predicting Redox Potentials with Uncertainty
Accurate prediction of redox potentials of iron (Fe) complexes, in tandem with uncertainty quantification, is essential to advance technologies related to electro-deposition and energy storage by enabling reliable modeling, guiding experim…
The unreasonable effectiveness of early discarding after one epoch in neural network hyperparameter optimization
To reach high performance with deep learning, hyperparameter optimization (HPO) is essential. This process is usually time-consuming due to costly evaluations of neural networks. Early discarding techniques limit the resources granted to u…
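The general idea behind early discarding can be illustrated with a short sketch: every candidate configuration is trained for a single epoch, ranked on a validation split, and only the survivor receives the full training budget. The snippet below is illustrative only, not the paper's experimental setup; scikit-learn's SGDClassifier stands in for a neural network, and the search space and budgets are made up.

```python
# Illustrative sketch of early discarding after one epoch (not the paper's code).
# Each sampled configuration gets exactly one epoch; only the best one-epoch
# scorer is trained to completion.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

def sample_config():
    # Made-up search space: regularization strength and constant step size.
    return {"alpha": 10 ** rng.uniform(-6, -2), "eta0": 10 ** rng.uniform(-3, -1)}

def one_epoch_score(cfg):
    model = SGDClassifier(alpha=cfg["alpha"], eta0=cfg["eta0"],
                          learning_rate="constant", random_state=0)
    model.partial_fit(X_tr, y_tr, classes=np.unique(y))  # exactly one pass over the data
    return model.score(X_val, y_val)

# Rank candidates by their one-epoch validation accuracy ...
candidates = [sample_config() for _ in range(20)]
best = max(candidates, key=one_epoch_score)

# ... and spend the remaining budget only on the surviving configuration.
final = SGDClassifier(alpha=best["alpha"], eta0=best["eta0"],
                      learning_rate="constant", max_iter=100, random_state=0)
final.fit(X_tr, y_tr)
print(f"best config: {best}, final val accuracy: {final.score(X_val, y_val):.3f}")
```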
Refining computer tomography data with super-resolution networks to increase the accuracy of respiratory flow simulations
Accurately computing the flow in the nasal cavity with computational fluid dynamics (CFD) simulations requires highly resolved computational meshes based on anatomically realistic geometries. Such geometries can only be obtained from compu…
Streamlining Ocean Dynamics Modeling with Fourier Neural Operators: A Multiobjective Hyperparameter and Architecture Optimization Approach
Training an effective deep learning model to learn ocean processes involves careful choices of various hyperparameters. We leverage DeepHyper’s advanced search algorithms for multiobjective optimization, streamlining the development of neu…
Optimizing Distributed Training on Frontier for Large Language Models
Large language models (LLMs) have demonstrated remarkable success as foundational models, benefiting various downstream applications through fine-tuning. Loss scaling studies have demonstrated the superior performance of larger LLMs compar…
AI Competitions and Benchmarks: Dataset Development
Machine learning is now used in many applications thanks to its ability to predict, generate, or discover patterns from large quantities of data. However, the process of collecting and transforming data for practical use is intricate. Even…
HHD-Ethiopic: A Historical Handwritten Dataset for Ethiopic OCR with Baseline Models and Human-level Performance
This paper introduces HHD-Ethiopic, a new OCR dataset for historical handwritten Ethiopic script, characterized by a unique syllabic writing system, low resource availability, and complex orthographic diacritics. The dataset consists of ro…
Parallel Multi-Objective Hyperparameter Optimization with Uniform Normalization and Bounded Objectives
Machine learning (ML) methods offer a wide range of configurable hyperparameters that have a significant influence on their performance. While accuracy is a commonly used performance objective, in many settings, it is not sufficient. Optim…
Is One Epoch All You Need For Multi-Fidelity Hyperparameter Optimization?
Hyperparameter optimization (HPO) is crucial for fine-tuning machine learning models but can be computationally expensive. To reduce costs, Multi-fidelity HPO (MF-HPO) leverages intermediate accuracy levels in the learning process and disc…
Quantifying uncertainty for deep learning based forecasting and flow-reconstruction using neural architecture search ensembles
Classical problems in computational physics such as data-driven forecasting and signal reconstruction from sparse sensors have recently seen an explosion in deep neural network (DNN) based algorithmic approaches. However, most DNN models d…
HPC Storage Service Autotuning Using Variational-Autoencoder-Guided Asynchronous Bayesian Optimization
Distributed data storage services tailored to specific applications have grown popular in the high-performance computing (HPC) community as a way to address I/O and storage challenges. These services offer a variety of specific interfaces,…
Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization
Bayesian optimization (BO) is a promising approach for hyperparameter optimization of deep neural networks (DNNs), where each model training can take minutes to hours. In BO, a computationally cheap surrogate model is employed to learn the…
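The surrogate-based loop the abstract refers to can be sketched as follows; the asynchronous, decentralized scheduling that is the paper's contribution is not shown. The 1-D objective, the Matérn kernel, and the lower-confidence-bound acquisition below are illustrative choices, not the paper's.

```python
# Illustrative Bayesian optimization loop (sequential; the paper's asynchronous,
# decentralized variant is not shown). A Gaussian-process surrogate is refit to
# the observed (configuration, loss) pairs and an acquisition function picks the
# next configuration to evaluate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Stand-in for an expensive DNN training run returning a validation loss.
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
X_obs = rng.uniform(-2, 2, size=(3, 1))      # a few random initial evaluations
y_obs = objective(X_obs[:, 0])

candidates = np.linspace(-2, 2, 200).reshape(-1, 1)
for _ in range(10):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)
    gp.fit(X_obs, y_obs)
    mu, sigma = gp.predict(candidates, return_std=True)
    lcb = mu - 1.96 * sigma                  # lower confidence bound: exploit vs. explore
    x_next = candidates[np.argmin(lcb)].reshape(1, 1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, objective(x_next[0, 0]))

print("best x:", X_obs[np.argmin(y_obs), 0], "best loss:", y_obs.min())
```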
AutoDEUQ: Automated Deep Ensemble with Uncertainty Quantification
Deep neural networks are powerful predictors for a variety of tasks. However, they do not capture uncertainty directly. Using neural network ensembles to quantify uncertainty is competitive with approaches based on Bayesian neural networks…
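A minimal sketch of the deep-ensemble idea mentioned in the abstract: train several independently seeded networks and read the spread of their predictions as uncertainty. AutoDEUQ additionally searches over architectures and hyperparameters to build the ensemble, which is not shown here; the MLPRegressor models and the toy 1-D data below are stand-ins.

```python
# Illustrative deep-ensemble sketch: several independently seeded regressors are
# trained on the same data, and the spread of their predictions is read as
# (epistemic) uncertainty. AutoDEUQ additionally varies architectures and
# hyperparameters across members, which is omitted here.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # noisy 1-D toy target

ensemble = [MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=seed).fit(X, y) for seed in range(5)]

X_test = np.linspace(-4, 4, 50).reshape(-1, 1)
preds = np.stack([m.predict(X_test) for m in ensemble])  # shape: (members, points)

mean = preds.mean(axis=0)           # ensemble prediction
epistemic_std = preds.std(axis=0)   # disagreement between members
print(mean[:3], epistemic_std[:3])
```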
Recurrent Neural Network Architecture Search for Geophysical Emulation
Developing surrogate geophysical models from data is a key research topic in atmospheric and oceanic modeling because of the large computational costs associated with numerical simulation methods. Researchers have started applying a wide r…
AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data
Developing high-performing predictive models for large tabular data sets is a challenging task. The state-of-the-art methods are based on expert-developed model ensembles from different supervised learning methods. Recently, automated mach…