Faming Liang
Deep Survival Analysis for Competing Risk Modeling with Functional Covariates and Missing Data Imputation
We introduce the Functional Competing Risk Net (FCRN), a unified deep-learning framework for discrete-time survival analysis under competing risks, which seamlessly integrates functional covariates and handles missing data within an end-to…
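The abstract is truncated, but the discrete-time competing-risks setup it builds on is standard: at each interval the model emits probabilities over "survive", "fail from cause 1", …, "fail from cause K". A minimal likelihood sketch under that standard formulation (not the FCRN architecture itself; all names here are illustrative):

```python
import numpy as np

def discrete_time_crisk_loglik(logits, event, interval):
    """Discrete-time competing-risks log-likelihood for one subject.

    logits: (T, K+1) array; column 0 = "survive the interval",
            columns 1..K = cause-specific failure.
    event: 0 if censored at `interval`, else the cause index in 1..K.
    interval: index of the last observed interval.
    """
    # Cause-specific hazards via a per-interval softmax.
    z = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = z / z.sum(axis=1, keepdims=True)
    # Survived all intervals before the last observed one.
    ll = np.log(probs[:interval, 0]).sum()
    if event > 0:
        ll += np.log(probs[interval, event])  # failed from cause `event`
    else:
        ll += np.log(probs[interval, 0])      # censored: survived it too
    return ll
```

With uniform logits and K = 2, every per-interval outcome has probability 1/3, so a subject failing from cause 1 in the third interval contributes three factors of 1/3.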
Uncertainty Quantification for Large-Scale Deep Neural Networks via Post-StoNet Modeling
Extended fiducial inference for individual treatment effects via deep neural networks
Individual treatment effect estimation has gained significant attention in recent data science literature. This work introduces the Double Neural Network (Double-NN) method to address this problem within the framework of extended fiducial …
Magnitude Pruning of Large Pretrained Transformer Models with a Mixture Gaussian Prior
Large pretrained transformer models have revolutionized modern AI applications with their state-of-the-art performance in natural language processing (NLP). However, their substantial parameter count poses challenges for real-world deploym…
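For context on the baseline this paper refines: classic one-shot magnitude pruning simply zeroes out the smallest-magnitude fraction of a weight tensor. A plain sketch of that heuristic (not the paper's mixture-Gaussian-prior method):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of entries."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask
```

Pruning `[[0.1, -2.0], [0.05, 3.0]]` at 50% sparsity keeps only the two large entries. A mixture-Gaussian (spike-and-slab-like) prior replaces this hard threshold with a principled criterion for which weights belong to the "spike" near zero.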
Extended Fiducial Inference: Toward an Automated Process of Statistical Inference
While fiducial inference was widely considered a big blunder by R.A. Fisher, the goal he initially set -- 'inferring the uncertainty of model parameters on the basis of observations' -- has been continually pursued by many statisticians. To…
Causal-StoNet: Causal Inference for High-Dimensional Complex Data
With the advancement of data science, the collection of increasingly complex datasets has become commonplace. In such datasets, the data dimension can be extremely high, and the underlying data generation process can be unknown and highly …
Fast Value Tracking for Deep Reinforcement Learning
Reinforcement learning (RL) tackles sequential decision-making problems by creating agents that interact with their environment. However, existing algorithms often view these problems as static, focusing on point estimates for model parame…
A double regression method for graphical modeling of high-dimensional nonlinear and non-Gaussian data
Graphical models have long been studied in statistics as a tool for inferring conditional independence relationships among a large set of random variables. Most existing works on graphical modeling focus on cases where the data are …
Sparse Deep Learning for Time Series Data: Theory and Applications
Sparse deep learning has become a popular technique for improving the performance of deep neural networks in areas such as uncertainty quantification, variable selection, and large-scale network compression. However, most existing research…
Non-reversible Parallel Tempering for Deep Posterior Approximation
Parallel tempering (PT), also known as replica exchange, is the go-to workhorse for simulations of multi-modal distributions. The key to the success of PT is to adopt efficient swap schemes. The popular deterministic even-odd (DEO) scheme …
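For readers unfamiliar with the DEO scheme the abstract mentions: parallel tempering runs one chain per temperature and deterministically attempts swaps between even-indexed neighbor pairs on even iterations and odd-indexed pairs on odd iterations. A textbook sketch on a toy bimodal target (a generic illustration, not the paper's non-reversible algorithm):

```python
import numpy as np

def log_target(x):
    # Toy bimodal target: mixture of Gaussians at +3 and -3.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def parallel_tempering(n_iters=2000, temps=(1.0, 2.0, 4.0, 8.0), seed=0):
    rng = np.random.default_rng(seed)
    K = len(temps)
    xs = np.zeros(K)                 # one chain per temperature
    cold = []
    for t in range(n_iters):
        # Within-chain random-walk Metropolis moves.
        for k in range(K):
            prop = xs[k] + rng.normal(0.0, 1.0)
            if np.log(rng.uniform()) < (log_target(prop) - log_target(xs[k])) / temps[k]:
                xs[k] = prop
        # DEO: even neighbor pairs on even iterations, odd pairs on odd.
        for k in range(t % 2, K - 1, 2):
            log_s = (log_target(xs[k + 1]) - log_target(xs[k])) * \
                    (1.0 / temps[k] - 1.0 / temps[k + 1])
            if np.log(rng.uniform()) < log_s:
                xs[k], xs[k + 1] = xs[k + 1], xs[k]
        cold.append(xs[0])           # keep only the cold chain
    return np.array(cold)
```

The cold chain alone would rarely cross between the two modes, but swaps with the hot chains let it visit both; the non-reversible DEO variant studied in the paper improves the round-trip rate of this swap mechanism.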
A New Paradigm for Generative Adversarial Networks based on Randomized Decision Rules
The Generative Adversarial Network (GAN) was recently introduced in the literature as a novel machine learning method for training generative models. It has many applications in statistics such as nonparametric clustering and nonparametric…
Bayesian Analysis of Exponential Random Graph Models Using Stochastic Gradient Markov Chain Monte Carlo
The exponential random graph model (ERGM) is a popular model for social networks, which is known to have an intractable likelihood function. Sampling from the posterior for such a model is a long-standing problem in statistical research. W…
A Langevinized Ensemble Kalman Filter for Large-Scale Dynamic Learning
The Ensemble Kalman Filter (EnKF) has achieved great success in data assimilation in the atmospheric and oceanic sciences, but it fails to converge to the correct filtering distribution, which precludes its use for uncertainty quantificatio…
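For reference, the standard stochastic-EnKF analysis step that the Langevinized variant modifies looks like the following. This is the textbook update with perturbed observations, not the paper's method, which injects Langevin noise so that the filter converges to the correct filtering distribution:

```python
import numpy as np

def enkf_update(ensemble, y, H, R, rng):
    """One stochastic-EnKF analysis step with perturbed observations.

    ensemble: (n_ens, dim_x) forecast ensemble
    y: (dim_y,) observation;  H: (dim_y, dim_x) observation operator;
    R: (dim_y, dim_y) observation-noise covariance
    """
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)          # anomalies
    P = X.T @ X / (n_ens - 1)                     # ensemble covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturbed observations keep the analysis spread consistent.
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens)
    return ensemble + (Y - ensemble @ H.T) @ K.T
```

With a near-exact observation (small R), the analysis ensemble collapses toward the observed value, as a Kalman update should.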
Nonlinear Sufficient Dimension Reduction with a Stochastic Neural Network
Sufficient dimension reduction is a powerful tool for extracting the core information hidden in high-dimensional data, with many potentially important applications in machine learning tasks. However, the existing nonlinear sufficient dimensi…
A Stochastic Approximation-Langevinized Ensemble Kalman Filter Algorithm for State Space Models with Unknown Parameters
Inference for high-dimensional, large-scale, and long-series dynamic systems is a challenging task in modern data science. Existing algorithms, such as the particle filter and sequential importance sampler, do not scale well to the dimension…
Markov neighborhood regression for statistical inference of high-dimensional generalized linear models
High-dimensional inference is one of the fundamental problems in modern biomedical studies. However, the existing methods do not perform satisfactorily. Based on the Markov property of graphical models and the likelihood ratio test, this artic…
Interacting Contour Stochastic Gradient Langevin Dynamics
We propose an interacting contour stochastic gradient Langevin dynamics (ICSGLD) sampler, an embarrassingly parallel multiple-chain contour stochastic gradient Langevin dynamics (CSGLD) sampler with efficient interactions. We show that ICS…
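The CSGLD and ICSGLD samplers build on the basic stochastic gradient Langevin dynamics recursion of Welling and Teh (2011). A minimal sketch of that core update, without the contour importance weights or chain interactions the paper adds:

```python
import numpy as np

def sgld(grad_log_post, theta0, step_size, n_iters, seed=0):
    """Vanilla SGLD: theta <- theta + (eps/2) * grad log p(theta) + N(0, eps).

    In practice grad_log_post is estimated from a minibatch; here we
    pass the exact gradient of a toy target for illustration.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_iters,) + theta.shape)
    for t in range(n_iters):
        noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
        theta = theta + 0.5 * step_size * grad_log_post(theta) + noise
        samples[t] = theta
    return samples

# Toy target: standard Gaussian, so grad log p(x) = -x.
draws = sgld(lambda x: -x, theta0=[5.0], step_size=0.1, n_iters=5000)
```

Started far from the mode, the chain drifts in and then fluctuates around the target; discarding a burn-in, the remaining draws approximate N(0, 1) up to discretization error.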
A Kernel-Expanded Stochastic Neural Network
The deep neural network suffers from many fundamental issues in machine learning. For example, it often gets trapped in a local minimum during training, and its prediction uncertainty is hard to assess. To address these issues, we propo…
PURE: A Framework for Analyzing Proximity-based Contact Tracing Protocols
Many proximity-based contact tracing (PCT) protocols have been proposed and deployed to combat the spread of COVID-19. In this article, we take a systematic approach to analyzing PCT protocols. We identify a list of desired properties of a contac…
Best-Worst Method: Inconsistency, Uncertainty, Consensus, and Range Sensitivity
It is our choices that make us who we are. To lead a better life, we have to make better decisions. Nowadays, decisions are increasingly made in complex contexts, in a host of different application domains. Because of that, we need more re…
Sparse Deep Learning: A New Framework Immune to Local Traps and Miscalibration
Deep learning has powered recent successes of artificial intelligence (AI). However, the deep neural network, as the basic model of deep learning, has suffered from issues such as local traps and miscalibration. In this paper, we provide a…
Learning sparse deep neural networks with a spike-and-slab prior
Consistent Sparse Deep Learning: Theory and Computation
Deep learning has been the engine powering many successes of data science. However, the deep neural network (DNN), as the basic model of deep learning, is often excessively over-parameterized, causing many difficulties in training, predicti…