Daniel Zügner
Probing the Limit of Heat Transfer in Inorganic Crystals with Deep Learning
The record includes the MatterK dataset and the DFT verification results reported in https://arxiv.org/abs/2503.11568.
MatterSim: A Deep Learning Atomistic Model Across Elements, Temperatures and Pressures
Accurate and fast prediction of materials properties is central to the digital transformation of materials design. However, the vast design space and diverse operating conditions pose significant challenges for accurately modeling arbitrar…
MatterGen: a generative model for inorganic materials design
The design of functional materials with desired properties is essential in driving technological advances in areas like energy storage, catalysis, and carbon capture. Generative models provide a new paradigm for materials design by directl…
Adversarial Training for Graph Neural Networks: Pitfalls, Solutions, and New Directions
Despite its success in the image domain, adversarial training did not (yet) stand out as an effective defense for Graph Neural Networks (GNNs) against graph structure perturbations. In the pursuit of fixing adversarial training (1) we show…
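For reference, adversarial training in its standard image-domain form augments each batch with perturbed inputs crafted on the fly. A minimal FGSM-based sketch in PyTorch (the model, loader, optimizer, and eps are illustrative placeholders, not the paper's setup):

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, eps=8 / 255):
    """Craft an FGSM adversarial example: one signed-gradient step on the input."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).clamp(0.0, 1.0).detach()

def adversarial_training_epoch(model, loader, optimizer, eps=8 / 255):
    """Train on adversarially perturbed inputs instead of the clean ones."""
    model.train()
    for x, y in loader:
        x_adv = fgsm_perturb(model, x, y, eps)
        optimizer.zero_grad()          # clear gradients accumulated while crafting x_adv
        loss = F.cross_entropy(model(x_adv), y)
        loss.backward()
        optimizer.step()
```

The paper's concern is precisely that this recipe, effective against pixel perturbations, does not transfer directly to perturbations of the graph structure.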
Training Differentially Private Graph Neural Networks with Random Walk Sampling
Deep learning models are known to put the privacy of their training data at risk, which poses challenges for their safe and ethical release to the public. Differentially private stochastic gradient descent is the de facto standard for trai…
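The DP-SGD recipe referenced here clips each per-example gradient and adds Gaussian noise to the aggregate before the parameter update. A minimal plain-PyTorch sketch, looping over examples for clarity (the paper's random-walk sampling for graphs is not shown; all names and hyperparameters are illustrative):

```python
import torch
import torch.nn.functional as F

def dp_sgd_step(model, xs, ys, lr=0.1, clip_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD step: clip per-example gradients, sum, add Gaussian noise, update."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x, y in zip(xs, ys):  # microbatches of size 1 give per-example gradients
        loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (norm + 1e-12)).clamp(max=1.0)   # clip to clip_norm
        for acc, g in zip(summed, grads):
            acc.add_(g * scale)
    with torch.no_grad():
        for p, acc in zip(params, summed):
            noise = noise_multiplier * clip_norm * torch.randn_like(p)
            p.add_(-(lr / len(xs)) * (acc + noise))
```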
On the Robustness and Anomaly Detection of Sparse Neural Networks
The robustness and anomaly detection capability of neural networks are crucial topics for their safe adoption in the real world. Moreover, the over-parameterization of recent networks comes with high computational costs and raises question…
Winning the Lottery Ahead of Time: Efficient Early Network Pruning
Pruning, the task of sparsifying deep neural networks, received increasing attention recently. Although state-of-the-art pruning methods extract highly sparse models, they neglect two main challenges: (1) the process of finding these spars…
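As a baseline for what "sparsifying" means in practice, global magnitude pruning zeroes the smallest-magnitude weights across the whole network. A minimal PyTorch sketch (not the paper's early-pruning method; model and sparsity level are placeholders):

```python
import torch

@torch.no_grad()
def global_magnitude_prune(model, sparsity=0.9):
    """Zero out the `sparsity` fraction of smallest-magnitude weights, network-wide."""
    # weight tensors only; biases are typically left dense
    weights = [p for name, p in model.named_parameters() if name.endswith("weight")]
    all_scores = torch.cat([w.abs().flatten() for w in weights])
    k = int(sparsity * all_scores.numel())
    threshold = torch.kthvalue(all_scores, k).values if k > 0 else all_scores.min() - 1
    masks = []
    for w in weights:
        mask = (w.abs() > threshold).float()
        w.mul_(mask)        # apply the mask in place
        masks.append(mask)  # keep masks to re-apply after each optimizer step
    return masks
```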
Monte Carlo EM for Deep Time Series Anomaly Detection
Time series data are often corrupted by outliers or other kinds of anomalies. Identifying the anomalous points can be a goal on its own (anomaly detection), or a means to improving performance of other time series tasks (e.g. forecasting).…
Robustness of Graph Neural Networks at Scale
Graph Neural Networks (GNNs) are increasingly important given their popularity and the diversity of applications. Yet, existing studies of their vulnerability to adversarial attacks rely on relatively small graphs. We address this gap and …
Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification
The interdependence between nodes in graphs is key to improving class predictions on nodes and is utilized in approaches like Label Propagation (LP) or Graph Neural Networks (GNNs). Nonetheless, uncertainty estimation for non-independent node…
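Label Propagation, mentioned here as a classic way to exploit node interdependence, spreads observed labels along edges until convergence. A minimal NumPy sketch using symmetric normalization (details differ across LP variants; all names are illustrative):

```python
import numpy as np

def label_propagation(adj, labels, mask, num_classes, alpha=0.9, iters=50):
    """adj: (n, n) adjacency matrix; labels: (n,) int classes; mask: (n,) bool for known labels."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]   # D^-1/2 A D^-1/2
    y = np.zeros((adj.shape[0], num_classes))
    y[mask, labels[mask]] = 1.0                                # seed labelled nodes
    f = y.copy()
    for _ in range(iters):
        f = alpha * a_norm @ f + (1 - alpha) * y               # propagate, pull toward seeds
    return f.argmax(axis=1)
```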
A Study of Joint Graph Inference and Forecasting
We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series. The core assumption behind these models is that there is a latent graph between the time series (nodes) that gove…
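A simple way to picture the "latent graph between the time series" is to infer edges from pairwise similarity of the series, for instance thresholded correlations; the models studied in the paper instead learn this graph jointly with the forecaster. An illustrative NumPy sketch (threshold and toy data are assumptions):

```python
import numpy as np

def infer_latent_graph(series, threshold=0.5):
    """series: array of shape (num_series, num_timesteps).
    Returns a binary adjacency linking strongly correlated series."""
    corr = np.corrcoef(series)                   # pairwise Pearson correlations
    adj = (np.abs(corr) >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)                   # no self-loops
    return adj

# Example: three noisy signals, two of which share the same underlying sine.
t = np.linspace(0, 10, 200)
series = np.stack([np.sin(t), np.sin(t) + 0.1 * np.random.randn(200), np.cos(3 * t)])
print(infer_latent_graph(series))
```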
On Out-of-distribution Detection with Energy-based Models
Several density estimation methods have been shown to fail to detect out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data. Energy-based models (EBMs) are flexible, unnormalized density models which seem to be able…
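One common way to use an energy function for OOD scoring, in the spirit described here, is to score inputs with an unnormalized energy and flag high-energy points. A minimal sketch that derives an energy score from a classifier's logits (one specific choice of energy, not the paper's full analysis; model and threshold are placeholders):

```python
import torch

@torch.no_grad()
def energy_score(model, x):
    """Energy E(x) = -logsumexp(logits); lower energy ~ more in-distribution."""
    logits = model(x)
    return -torch.logsumexp(logits, dim=-1)

@torch.no_grad()
def flag_ood(model, x, threshold):
    """Flag inputs whose energy exceeds a threshold tuned on held-out in-distribution data."""
    return energy_score(model, x) > threshold
```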
Natural Posterior Network: Deep Bayesian Uncertainty for Exponential Family Distributions
Uncertainty awareness is crucial to develop reliable machine learning models. In this work, we propose the Natural Posterior Network (NatPN) for fast and high-quality uncertainty estimation for any task where the target distribution belong…
Language-Agnostic Representation Learning of Source Code from Structure and Context
Source code (Context) and its parsed abstract syntax tree (AST; Structure) are two complementary representations of the same computer program. Traditionally, designers of machine learning models have relied predominantly either on Structur…
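To make "Structure vs. Context" concrete: for a Python snippet, the Context is the token sequence and the Structure is the parsed AST. A small standard-library illustration (the paper handles several languages with language-agnostic tooling; this only uses Python's own parser):

```python
import ast
import io
import tokenize

source = "def add(a, b):\n    return a + b\n"

# Context: the flat token stream of the program text.
tokens = [tok.string for tok in tokenize.generate_tokens(io.StringIO(source).readline)
          if tok.string.strip()]
print(tokens)   # ['def', 'add', '(', 'a', ',', 'b', ')', ':', 'return', 'a', '+', 'b']

# Structure: the abstract syntax tree of the same program.
tree = ast.parse(source)
print([type(node).__name__ for node in ast.walk(tree)])
# e.g. ['Module', 'FunctionDef', 'arguments', 'Return', 'arg', 'arg', 'BinOp', ...]
```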
Reliable Graph Neural Networks via Robust Aggregation
Perturbations targeting the graph structure have proven to be extremely effective in reducing the performance of Graph Neural Networks (GNNs), and traditional defenses such as adversarial training do not seem to be able to improve robustne…
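The fragility referenced here stems from GNN layers averaging neighbor messages, so a single adversarially inserted neighbor can shift the aggregate arbitrarily, whereas robust statistics such as the coordinate-wise median are much harder to move. A toy NumPy comparison (the paper's actual defense is a differentiable robust aggregation, not a plain median):

```python
import numpy as np

neighbor_messages = np.array([
    [1.0, 1.1],
    [0.9, 1.0],
    [1.1, 0.9],
    [50.0, -40.0],   # one adversarially injected neighbor
])

print(neighbor_messages.mean(axis=0))        # dragged far from the clean neighbors
print(np.median(neighbor_messages, axis=0))  # barely affected by the outlier
```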
Evaluating Robustness of Predictive Uncertainty Estimation: Are Dirichlet-based Models Reliable?
Dirichlet-based uncertainty (DBU) models are a recent and promising class of uncertainty-aware models. DBU models predict the parameters of a Dirichlet distribution to provide fast, high-quality uncertainty estimates alongside with class p…
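For intuition on what a DBU model outputs: the predicted Dirichlet parameters alpha give both class probabilities (the Dirichlet mean) and an uncertainty signal (e.g. the total evidence alpha_0 or the entropy of the mean). A minimal NumPy sketch with one illustrative choice of measures; different DBU models use different ones:

```python
import numpy as np

def dirichlet_uncertainty(alpha):
    """alpha: positive Dirichlet parameters, shape (num_classes,)."""
    alpha0 = alpha.sum()                       # total evidence (precision)
    probs = alpha / alpha0                     # expected class probabilities
    entropy = -np.sum(probs * np.log(probs))   # predictive entropy of the mean
    return probs, alpha0, entropy

# Confident prediction: lots of evidence concentrated on one class.
print(dirichlet_uncertainty(np.array([90.0, 5.0, 5.0])))
# Uncertain prediction: little evidence, spread evenly.
print(dirichlet_uncertainty(np.array([1.0, 1.0, 1.0])))
```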
Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts
Accurate estimation of aleatoric and epistemic uncertainty is crucial to build safe and reliable systems. Traditional approaches, such as dropout and ensemble methods, estimate uncertainty by sampling probability predictions from different…
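The "sampling probability predictions from different submodels" that this contrasts with looks like Monte-Carlo dropout: keep dropout active at test time, run several stochastic forward passes, and read uncertainty from the spread. A minimal PyTorch sketch (model and sample count are placeholders):

```python
import torch

@torch.no_grad()
def mc_dropout_predict(model, x, num_samples=20):
    """Sample predictions with dropout enabled; mean = prediction, variance = uncertainty."""
    model.train()  # keeps dropout stochastic; a real implementation would enable only dropout modules
    probs = torch.stack([torch.softmax(model(x), dim=-1) for _ in range(num_samples)])
    model.eval()
    return probs.mean(dim=0), probs.var(dim=0)
```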
Group Centrality Maximization for Large-scale Graphs
The study of vertex centrality measures is a key aspect of network analysis. Naturally, such centrality measures have been generalized to groups of vertices; for popular measures it was shown that the problem of finding the most central gr…
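As a concrete instance of a group centrality measure: group degree centrality counts the share of outside vertices adjacent to at least one group member, and the optimization problem is to pick the group maximizing such a score. A small pure-Python sketch (group degree only; the paper treats harder measures and much larger graphs):

```python
def group_degree_centrality(adj, group):
    """adj: dict mapping each vertex to a set of neighbors; group: set of vertices.
    Returns the fraction of non-group vertices adjacent to at least one group member."""
    group = set(group)
    outside = set(adj) - group
    covered = {v for v in outside if adj[v] & group}
    return len(covered) / len(outside) if outside else 0.0

# Toy graph: a star centered at 0 plus an extra edge 3-4.
adj = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0, 4}, 4: {3}}
print(group_degree_centrality(adj, {0}))      # 0.75: vertices 1, 2, 3 covered, 4 is not
print(group_degree_centrality(adj, {0, 4}))   # 1.0: all remaining vertices covered
```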
Adversarial Attacks on Neural Networks for Graph Data
Deep learning models for graphs have achieved strong performance for the task of node classification. Despite their proliferation, currently there is no study of their robustness to adversarial attacks. Yet, in domains where they are likel…
Adversarial Attacks on Graph Neural Networks via Meta Learning
Deep learning models for graphs have advanced the state of the art on many tasks. Despite their recent success, little is known about their robustness. We investigate training time attacks on graph neural networks for node classification t…
Adversarial Attacks on Classification Models for Graphs
Deep learning models for graphs have achieved strong performance for the task of node classification. Despite their proliferation, currently there is no study of their robustness to adversarial attacks. Yet, in domains where they are likel…
NetGAN: Generating Graphs via Random Walks
We propose NetGAN - the first implicit generative model for graphs able to mimic real-world networks. We pose the problem of graph generation as learning the distribution of biased random walks over the input graph. The proposed model is b…
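The training data NetGAN imitates are random walks over the input graph. A minimal sketch of sampling such walks, with uniform transitions for simplicity (the GAN generator and discriminator, and any bias in the walk distribution, are omitted; the toy graph is illustrative):

```python
import random

def sample_random_walk(adj, walk_length, rng=random):
    """adj: dict mapping each node to a list of neighbors. Returns one random walk."""
    node = rng.choice(list(adj))
    walk = [node]
    for _ in range(walk_length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:            # dead end on a directed or disconnected graph
            break
        walk.append(rng.choice(neighbors))
    return walk

# Toy graph: an undirected triangle plus a pendant node.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = [sample_random_walk(adj, walk_length=8) for _ in range(5)]
print(walks)
```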