Stratis Markou
On the effective resolution of AI weather models
In recent years, models based on artificial intelligence (AI) have become as good as, or even slightly better than, standard operational models at predicting the weather; the operational models are based on solving physical equations. Although the grid size …
Skillful joint probabilistic weather forecasting from marginals
Machine learning (ML)-based weather models have rapidly risen to prominence because they are more accurate and faster than traditional forecasts based on numerical weather prediction (NWP), recently outperforming traditional ensembles in glo…
End-to-end data-driven weather forecasting (source code, sample data and trained models)
This resource contains the key source code, some sample data and the trained models from the paper: "End-to-end data-driven weather forecasting."
Translation Equivariant Transformer Neural Processes
The effectiveness of neural processes (NPs) in modelling posterior prediction maps -- the mapping from data to posterior predictive distributions -- has significantly improved since their inception. This improvement can be attributed to tw…
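The snippet above cuts off before naming the two factors, but the core property in the title can be shown directly: a prediction map is translation equivariant if shifting the context and target inputs together shifts the predictions with them. Below is a minimal numpy sketch using a kernel smoother as a stand-in predictive (a hypothetical toy, not the paper's transformer architecture); because it depends on inputs only through differences, translating everything leaves its predictions unchanged.

```python
import numpy as np

def kernel_smoother(x_ctx, y_ctx, x_tgt, lengthscale=0.5):
    """Predict at x_tgt by kernel-weighted averaging of context outputs.
    Depends on inputs only through differences x_tgt - x_ctx, so it is
    translation equivariant by construction (a toy stand-in, not a TE-TNP)."""
    diffs = x_tgt[:, None] - x_ctx[None, :]              # (targets, context)
    weights = np.exp(-0.5 * (diffs / lengthscale) ** 2)  # RBF weights
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ y_ctx

rng = np.random.default_rng(0)
x_ctx = rng.uniform(-2, 2, size=10)
y_ctx = np.sin(x_ctx) + 0.1 * rng.standard_normal(10)
x_tgt = np.linspace(-2, 2, 5)

shift = 3.7
pred = kernel_smoother(x_ctx, y_ctx, x_tgt)
pred_shifted = kernel_smoother(x_ctx + shift, y_ctx, x_tgt + shift)

# Translating context and targets together leaves the prediction unchanged.
assert np.allclose(pred, pred_shifted)
```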
Noise-Aware Differentially Private Regression via Meta-Learning
Many high-stakes applications require machine learning models that protect user privacy and provide well-calibrated, accurate predictions. While Differential Privacy (DP) is the gold standard for protecting user privacy, standard DP mechan…
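For context on the DP machinery this abstract refers to, here is a minimal sketch of the standard Gaussian mechanism: clip each value to bound its sensitivity, then add Gaussian noise calibrated to an (epsilon, delta) budget. It is a generic illustration of how DP noise is injected, not the paper's meta-learning approach; the clipping bound and budget values are arbitrary.

```python
import numpy as np

def gaussian_mechanism(values, clip_bound, epsilon, delta, rng):
    """Release a sum of per-user values under (epsilon, delta)-DP.

    Clipping bounds the L2 sensitivity of the sum by clip_bound; the classic
    calibration sigma = sqrt(2 ln(1.25/delta)) * sensitivity / epsilon
    is valid for epsilon < 1.
    """
    clipped = np.clip(values, -clip_bound, clip_bound)
    sensitivity = clip_bound  # adding/removing one user changes the sum by at most this
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / epsilon
    return clipped.sum() + rng.normal(0.0, sigma)

rng = np.random.default_rng(0)
user_values = rng.normal(loc=2.0, scale=1.0, size=100)
private_sum = gaussian_mechanism(user_values, clip_bound=5.0, epsilon=0.5, delta=1e-5, rng=rng)
print(private_sum, user_values.sum())
```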
Variance-Reducing Couplings for Random Features
Random features (RFs) are a popular technique to scale up kernel methods in machine learning, replacing exact kernel evaluations with stochastic Monte Carlo estimates. They underpin models as diverse as efficient transformers (by approxima…
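To make the Monte Carlo estimate concrete, here is a minimal random Fourier features sketch approximating an RBF kernel, the classic random-feature construction that coupling schemes of this kind aim to make lower-variance; the dimensions and lengthscale are arbitrary.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def random_fourier_features(X, num_features, lengthscale=1.0, rng=None):
    """Map inputs so that phi(X) @ phi(Y).T is a Monte Carlo estimate of the RBF kernel."""
    if rng is None:
        rng = np.random.default_rng(0)
    d = X.shape[1]
    W = rng.standard_normal((d, num_features)) / lengthscale  # spectral frequency samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)      # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))
phi = random_fourier_features(X, num_features=10_000, rng=rng)
approx = phi @ phi.T          # stochastic estimate of the kernel matrix
exact = rbf_kernel(X, X)
print(np.abs(approx - exact).max())  # small for enough features
```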
Denoising Diffusion Probabilistic Models in Six Simple Steps
Denoising Diffusion Probabilistic Models (DDPMs) are a very popular class of deep generative models that have been successfully applied to a diverse range of problems, including image and video generation, protein and material synthesis, wea…
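As a pointer to what such a tutorial formalises, the sketch below implements only the DDPM forward (noising) process and the (x_t, epsilon) pairs used by the standard epsilon-prediction objective; the linear beta schedule, the shapes, and the absence of a denoising network are simplifications for illustration, not the paper's presentation.

```python
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)          # linear noise schedule (arbitrary choice)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)             # alpha_bar_t = prod_{s <= t} alpha_s

def forward_noise(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I)."""
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return x_t, eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 8))            # a toy batch standing in for data
t = rng.integers(0, T)
x_t, eps = forward_noise(x0, t, rng)

# The simple DDPM objective trains a network eps_theta to predict eps from (x_t, t):
#   loss = || eps - eps_theta(x_t, t) ||^2, averaged over x0, t and eps.
```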
Faster Relative Entropy Coding with Greedy Rejection Coding
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$ using a proposal distribution $P$, with as few bits as possible. Unlike entropy coding, REC does not assume discrete distributions or require quantisat…
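For readers new to REC, the sketch below shows the classical shared-randomness rejection baseline that faster schemes such as greedy rejection coding improve upon: encoder and decoder generate the same candidate stream from P via a shared seed, the encoder accepts the first candidate passing a rejection test against Q, and only the accepted index is transmitted. The Gaussian Q and P and the ratio bound M are arbitrary choices; this baseline's expected codelength scales with log M rather than $D_{KL}[Q \,||\, P]$.

```python
import numpy as np
from scipy.stats import norm

# Target Q and proposal P (both 1D Gaussians, chosen arbitrarily for illustration).
q = norm(loc=1.0, scale=0.5)
p = norm(loc=0.0, scale=1.0)
M = 5.0  # any constant with q.pdf(x) <= M * p.pdf(x) for all x

def encode(seed, accept_rng):
    """Return the index of the first accepted candidate; that candidate is distributed as Q."""
    shared = np.random.default_rng(seed)    # candidate stream shared with the decoder
    index = 1
    while True:
        x = shared.normal(0.0, 1.0)         # candidate drawn from P
        if accept_rng.uniform() < q.pdf(x) / (M * p.pdf(x)):
            return index                    # only this integer needs to be transmitted
        index += 1

def decode(seed, index):
    """Replay the shared candidate stream and pick out the accepted candidate."""
    shared = np.random.default_rng(seed)
    for _ in range(index):
        x = shared.normal(0.0, 1.0)
    return x

seed = 42
index = encode(seed, np.random.default_rng(0))
print(index, decode(seed, index))
```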
Autoregressive Conditional Neural Processes
Conditional neural processes (CNPs; Garnelo et al., 2018a) are attractive meta-learning models which produce well-calibrated predictions and are trainable via a simple maximum likelihood procedure. Although CNPs have many advantages, they …
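The truncated abstract stops just before the paper's central idea: dependent joint samples can be drawn from a CNP by predicting target points one at a time and feeding each sampled value back in as context. The loop below sketches that autoregressive procedure around toy_cnp_predict, a hypothetical kernel-smoother stand-in for a trained CNP.

```python
import numpy as np

def toy_cnp_predict(x_ctx, y_ctx, x_tgt, lengthscale=0.5):
    """Stand-in for a trained CNP: a kernel-smoother mean with a fixed noise level.
    A real CNP would output these means and standard deviations from a network."""
    if len(x_ctx) == 0:
        return np.zeros_like(x_tgt), np.ones_like(x_tgt)
    w = np.exp(-0.5 * ((x_tgt[:, None] - x_ctx[None, :]) / lengthscale) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return w @ y_ctx, 0.3 * np.ones_like(x_tgt)

def ar_sample(x_ctx, y_ctx, x_tgt, rng):
    """Autoregressive sampling: predict, sample, then absorb the sample into the context."""
    x_ctx, y_ctx = list(x_ctx), list(y_ctx)
    sample = []
    for x in x_tgt:
        mean, std = toy_cnp_predict(np.array(x_ctx), np.array(y_ctx), np.array([x]))
        y = rng.normal(mean[0], std[0])
        sample.append(y)
        x_ctx.append(x)      # feed the sampled point back in as context,
        y_ctx.append(y)      # so later targets are correlated with earlier ones
    return np.array(sample)

rng = np.random.default_rng(0)
x_ctx = np.array([-1.0, 0.5])
y_ctx = np.sin(x_ctx)
x_tgt = np.linspace(-2.0, 2.0, 20)
print(ar_sample(x_ctx, y_ctx, x_tgt, rng))
```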
Trieste: Efficiently Exploring The Depths of Black-box Functions with TensorFlow
We present Trieste, an open-source Python package for Bayesian optimization and active learning benefiting from the scalability and efficiency of TensorFlow. Our library enables the plug-and-play of popular TensorFlow-based models within s…
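As orientation, here is a hedged sketch of the plug-and-play workflow following the quickstart pattern in Trieste's documentation (a Box search space, an observer built with mk_observer, a GPflow model wrapped with build_gpr, and BayesianOptimizer). Module paths and call signatures follow the documented API but may vary between Trieste versions, and the objective is an arbitrary toy function.

```python
import tensorflow as tf
import trieste
from trieste.objectives.utils import mk_observer
from trieste.models.gpflow import GaussianProcessRegression, build_gpr

# A toy black-box objective over the unit square (an arbitrary stand-in).
def objective(x):
    return tf.reduce_sum(tf.sin(3.0 * x) + x ** 2, axis=-1, keepdims=True)

search_space = trieste.space.Box([0.0, 0.0], [1.0, 1.0])
observer = mk_observer(objective)

initial_data = observer(search_space.sample(5))
model = GaussianProcessRegression(build_gpr(initial_data, search_space))

bo = trieste.bayesian_optimizer.BayesianOptimizer(observer, search_space)
result = bo.optimize(15, initial_data, model)   # 15 Bayesian optimization steps

dataset = result.try_get_final_dataset()
best = tf.argmin(dataset.observations[:, 0])
print(dataset.query_points[best], dataset.observations[best])
```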
Environmental sensor placement with convolutional Gaussian neural processes
Environmental sensors are crucial for monitoring weather conditions and the impacts of climate change. However, it is challenging to place sensors in a way that maximises the informativeness of their measurements, particularly in remote re…
Notes on the runtime of A* sampling
The challenge of simulating random variables is a central problem in Statistics and Machine Learning. Given a tractable proposal distribution $P$, from which we can draw exact samples, and a target distribution $Q$ which is absolutely cont…
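A* sampling generalises the Gumbel-max trick from discrete to continuous spaces, so the discrete version is a useful reference point: perturb each log-probability with independent Gumbel noise and take the argmax, which yields an exact sample from the categorical distribution. The small distribution below is arbitrary.

```python
import numpy as np

def gumbel_max_sample(log_probs, rng):
    """Exact categorical sampling: argmax_i (log p_i + G_i), with G_i ~ Gumbel(0, 1)."""
    gumbels = rng.gumbel(size=log_probs.shape)
    return int(np.argmax(log_probs + gumbels))

rng = np.random.default_rng(0)
probs = np.array([0.1, 0.2, 0.3, 0.4])
log_probs = np.log(probs)

draws = np.array([gumbel_max_sample(log_probs, rng) for _ in range(100_000)])
print(np.bincount(draws) / len(draws))  # empirical frequencies approach [0.1, 0.2, 0.3, 0.4]
```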
Practical Conditional Neural Processes Via Tractable Dependent Predictions
Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-learning models which leverage the flexibility of deep learning to produce well-calibrated predictions and naturally handle off-the-grid and missing data. CNPs scale to la…
Partitioned Variational Inference: A Framework for Probabilistic Federated Learning
The proliferation of computing devices has brought about an opportunity to deploy machine learning models on new problem domains using previously inaccessible data. Traditional algorithms for training such models often require data to be s…
Fast Relative Entropy Coding with A* coding
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$ using a proposal distribution $P$, such that the expected codelength is $\mathcal{O}(D_{KL}[Q \,||\, P])$. REC can be seamlessly integrated with existi…
Efficient Gaussian Neural Processes for Regression
Conditional Neural Processes (CNP; Garnelo et al., 2018) are an attractive family of meta-learning models which produce well-calibrated predictions, enable fast inference at test time, and are trainable via a simple maximum likelihood proc…
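To illustrate the kind of correlated predictive a Gaussian neural process outputs, the snippet below assembles a joint Gaussian over target points from a mean vector, per-point noise scales, and a low-rank factor (so the covariance diag(v^2) + F F^T is positive semi-definite by construction) and evaluates its joint log-likelihood. In a GNP these quantities would be produced by the network; here they are arbitrary placeholders.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
num_targets, rank = 6, 2

# In a Gaussian neural process these would be outputs of the decoder network;
# here they are arbitrary stand-ins.
mean = rng.standard_normal(num_targets)
v = 0.1 + rng.uniform(size=num_targets)        # per-point noise scales
F = rng.standard_normal((num_targets, rank))   # low-rank covariance factor

cov = np.diag(v**2) + F @ F.T                  # positive semi-definite by construction

y = rng.standard_normal(num_targets)           # some observed target values
log_lik = multivariate_normal(mean, cov).logpdf(y)
print(log_lik)

# Maximizing this joint log-likelihood over tasks trains the model to capture
# correlations between target points, not just per-point marginals.
```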