Gianluca Detommaso
Multicalibration for Confidence Scoring in LLMs
This paper proposes the use of "multicalibration" to yield interpretable and reliable confidence scores for outputs generated by large language models (LLMs). Multicalibration asks for calibration not just marginally, but simultaneously ac…
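As a point of reference, one standard way to state the multicalibration requirement (a textbook formulation, not necessarily the exact definition used in this paper) is that a confidence score f is multicalibrated with respect to a collection of groups \mathcal{G} if

    \mathbb{E}\big[\, Y \mid f(X) = v,\; X \in g \,\big] \;\approx\; v \qquad \text{for every } g \in \mathcal{G} \text{ and every score level } v,

whereas ordinary calibration only asks for \mathbb{E}[Y \mid f(X) = v] \approx v marginally, averaged over the whole population.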
Multilevel dimension-independent likelihood-informed MCMC for large-scale inverse problems
We present a non-trivial integration of dimension-independent likelihood-informed (DILI) MCMC (Cui et al., 2016) and the multilevel MCMC (Dodwell et al., 2015) to explore the hierarchy of posterior distributions. This integration offers severa…
Fortuna: A Library for Uncertainty Quantification in Deep Learning
We present Fortuna, an open-source library for uncertainty quantification in deep learning. Fortuna supports a range of calibration techniques, such as conformal prediction, that can be applied to any trained neural network to generate reli…
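To illustrate the kind of conformal guarantee mentioned above, here is a minimal sketch of split conformal regression intervals in plain NumPy; it is not Fortuna's actual API, and the function and variable names are illustrative only.

    import numpy as np

    def conformal_interval(calib_residuals, test_predictions, alpha=0.1):
        # Split conformal prediction: wrap any model's point predictions in
        # intervals with roughly (1 - alpha) marginal coverage.
        n = len(calib_residuals)
        # Finite-sample corrected quantile of the nonconformity scores |y - y_hat|.
        level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
        q = np.quantile(calib_residuals, level, method="higher")
        return test_predictions - q, test_predictions + q

The key property, and the one the abstract emphasises, is that the wrapped model can be any trained network: the coverage guarantee comes from the held-out calibration residuals, not from the model itself.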
Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors
As we move away from the data, the predictive uncertainty should increase, since a great variety of explanations are consistent with the little available information. We introduce Distance-Aware Prior (DAP) calibration, a method to correct…
A data-centric approach to generative modelling for 3D-printed steel
The emergence of additive manufacture (AM) for metallic materials enables components of near-arbitrary complexity to be produced. This has the potential to disrupt traditional engineering approaches. However, metallic AM components exhibit grea…
Causal Bias Quantification for Continuous Treatments
We extend the definition of the marginal causal effect to the continuous treatment setting and develop a novel characterization of causal bias in the framework of structural causal models. We prove that our derived bias expression is zero …
Causal Bias Quantification for Continuous Treatment.
In this work we develop a novel characterization of the marginal causal effect and causal bias in the continuous treatment setting. We show that they can be expressed as an expectation with respect to a conditional probability distribution, which c…
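As background, a standard adjustment-style formulation (not necessarily the exact characterization derived in these papers) writes the marginal causal effect of a continuous treatment T on an outcome Y, adjusting for covariates X, as

    \mu(t) \;=\; \mathbb{E}\big[\, Y \mid \mathrm{do}(T = t) \,\big] \;=\; \mathbb{E}_{X}\Big[\, \mathbb{E}[\, Y \mid T = t,\; X \,] \,\Big],

while the purely observational regression estimates \mathbb{E}[Y \mid T = t]; the causal bias at t is the gap between the two, and it vanishes when X blocks all confounding between T and Y.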
HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference
Many recent invertible neural architectures are based on coupling block designs where variables are divided into two subsets which serve as inputs to an easily invertible (usually affine) triangular transformation. While such a transformatio…
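To make the coupling-block design concrete, here is a minimal affine coupling layer in NumPy; this is an illustrative sketch of the generic construction described above, not the HINT architecture itself, and scale_net / shift_net stand in for arbitrary (non-invertible) conditioner networks.

    import numpy as np

    def coupling_forward(x, scale_net, shift_net):
        # Split the variables into two subsets; the first conditions the second.
        x1, x2 = np.split(x, 2, axis=-1)
        s, t = scale_net(x1), shift_net(x1)
        y2 = x2 * np.exp(s) + t                    # elementwise affine map, trivially invertible
        log_det = np.sum(s, axis=-1)               # log|det Jacobian| of the triangular map
        return np.concatenate([x1, y2], axis=-1), log_det

    def coupling_inverse(y, scale_net, shift_net):
        y1, y2 = np.split(y, 2, axis=-1)
        s, t = scale_net(y1), shift_net(y1)
        x2 = (y2 - t) * np.exp(-s)                 # closed-form inverse of the affine map
        return np.concatenate([y1, x2], axis=-1)

Because only one subset is transformed per block, the Jacobian is triangular and both the inverse and its determinant are cheap, which is the property the abstract refers to.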
A Stein variational Newton method
Stein variational gradient descent (SVGD) was recently proposed as a general purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and it…
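For context, one SVGD particle update with an RBF kernel, written as a minimal NumPy sketch; grad_log_p is assumed to be a function returning the score \nabla \log p at each particle, and the step size and bandwidth are illustrative defaults.

    import numpy as np

    def svgd_step(particles, grad_log_p, step_size=1e-2, bandwidth=1.0):
        # particles: (n, d) array; grad_log_p: maps (n, d) -> (n, d) scores.
        n = particles.shape[0]
        diffs = particles[:, None, :] - particles[None, :, :]               # x_a - x_b, shape (n, n, d)
        k = np.exp(-np.sum(diffs ** 2, axis=-1) / (2 * bandwidth ** 2))     # RBF kernel matrix
        grad_k = -diffs / bandwidth ** 2 * k[:, :, None]                    # gradient of k in its first argument
        # Stein update: kernel-weighted scores (attraction towards high density)
        # plus kernel gradients (repulsion between particles).
        phi = (k @ grad_log_p(particles) + grad_k.sum(axis=0)) / n
        return particles + step_size * phi

The Newton variant proposed in the paper goes further by exploiting second-order information; that extension is not reproduced here.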
Multilevel Dimension-Independent Likelihood-Informed MCMC for Large-Scale Inverse Problems
We present a non-trivial integration of dimension-independent likelihood-informed (DILI) MCMC (Cui, Law, Marzouk, 2016) and the multilevel MCMC (Dodwell et al., 2015) to explore the hierarchy of posterior distributions. This integration of…
HINT: Hierarchical Invertible Neural Transport for General and Sequential Bayesian inference.
In this paper, we introduce Hierarchical Invertible Neural Transport (HINT), an algorithm that merges Invertible Neural Networks and optimal transport to sample from a posterior distribution in a Bayesian framework. This method exploits a …
Stein Variational Online Changepoint Detection with Applications to Hawkes Processes and Neural Networks
Bayesian online changepoint detection (BOCPD) (Adams & MacKay, 2007) offers a rigorous and viable way to identify changepoints in complex systems. In this work, we introduce a Stein variational online changepoint detection (SVOCD) method t…
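For reference, BOCPD maintains a posterior over the run length r_t (the time since the most recent changepoint) through the standard recursion of Adams & MacKay (2007), stated here only as background for the abstract:

    P(r_t,\, x_{1:t}) \;=\; \sum_{r_{t-1}} P(r_t \mid r_{t-1})\; P\big(x_t \mid r_{t-1},\, x_{t-1}^{(r)}\big)\; P(r_{t-1},\, x_{1:t-1}),

where P(r_t \mid r_{t-1}) is the hazard (changepoint) prior and P(x_t \mid r_{t-1}, x_{t-1}^{(r)}) is the predictive distribution conditioned on the data in the current run.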
Continuous Level Monte Carlo and Sample-Adaptive Model Hierarchies
In this paper, we present a generalisation of the Multilevel Monte Carlo (MLMC) method to a setting where the level parameter is a continuous variable. This Continuous Level Monte Carlo (CLMC) estimator provides a natural framework in PDE …
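For background on what is being generalised (a textbook MLMC identity, not specific to this paper), the standard multilevel estimator is built on the telescoping sum over a discrete hierarchy of levels

    \mathbb{E}[Q_L] \;=\; \mathbb{E}[Q_0] \;+\; \sum_{\ell = 1}^{L} \mathbb{E}\big[\, Q_\ell - Q_{\ell - 1} \,\big],

where Q_\ell is the quantity of interest computed at discretisation level \ell; CLMC replaces the discrete index \ell with a continuous level variable, as the abstract describes.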