Conditional mutual information
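For reference, the quantity that ties the entries below together is, for random variables X and Y given Z (written for discrete variables; integrals replace the sums in the continuous case),

    I(X;Y \mid Z) \;=\; \sum_{x,y,z} p(x,y,z)\,\log\frac{p(x,y \mid z)}{p(x \mid z)\,p(y \mid z)}
                  \;=\; H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z),

which is non-negative and vanishes exactly when X and Y are conditionally independent given Z.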
Part mutual information for quantifying direct associations in networks
Significance: Measuring direct associations between variables is of great importance in various areas of science, especially in the era of big data. Although mutual information and conditional mutual information are widely used in quantifyi…
Learning to Explain: An Information-Theoretic Perspective on Model Interpretation
We introduce instancewise feature selection as a methodology for model interpretation. Our method is based on learning a function to extract a subset of features that are most informative for each given example. This feature selector is tr…
Feature Selection via Mutual Information: New Theoretical Insights
Mutual information has been successfully adopted in filter feature-selection methods to assess both the relevancy of a subset of features in predicting the target variable and the redundancy with respect to other variables. However, existi…
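As plain context for this line of work (and not the estimators or guarantees developed in the paper), a filter-style selection based on mutual information can be sketched with scikit-learn; the dataset below is synthetic and purely illustrative.

    # Minimal sketch of MI-based filter feature selection (illustrative only).
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, mutual_info_classif

    X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)

    # Score each feature by its estimated mutual information with the label and
    # keep the k highest-scoring ones. This captures relevancy only; redundancy
    # between the selected features is ignored by such a simple filter.
    selector = SelectKBest(score_func=mutual_info_classif, k=5)
    X_selected = selector.fit_transform(X, y)
    print(selector.get_support(indices=True))   # indices of the retained features

The relevancy/redundancy trade-off mentioned in the abstract is exactly what such a one-shot filter misses.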
Formal Limitations on the Measurement of Mutual Information
Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutua…
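The variational lower bounds referred to here are typically built on the Donsker-Varadhan representation of the KL divergence: for any critic function T,

    I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(P_{XY} \,\|\, P_X \otimes P_Y\right)
           \;\ge\; \mathbb{E}_{P_{XY}}[T(X,Y)] - \log \mathbb{E}_{P_X \otimes P_Y}\!\left[e^{T(X,Y)}\right],

with equality when T equals the log density ratio up to an additive constant; the paper's argument concerns how tightly the right-hand side can be certified from a finite sample.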
Classification of Dysarthric Speech According to the Severity of Impairment: an Analysis of Acoustic Features
Automatic speech recognition (ASR) systems are increasingly being applied as assistive technology in the speech-impaired community, for individuals with physical disabilities such as dysarthric speakers. However, the effectiveness of the…
Multipartite Generalization of Quantum Discord
A generalization of quantum discord to multipartite systems is proposed. A key feature of our formulation is its consistency with the conventional definition of discord in bipartite systems. It is by construction zero only for systems with…
Estimating Mutual Information for Discrete-Continuous Mixtures
Estimation of mutual information from observed samples is a basic primitive in machine learning, useful in several learning tasks including correlation mining, information bottleneck, Chow-Liu tree, and conditional independence testing in …
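For orientation only: in the purely discrete case, mutual information can be estimated by simple plug-in counting, as in the NumPy sketch below; the paper addresses the harder discrete-continuous mixture setting, where neither plug-in counting nor standard differential-entropy estimators apply directly.

    import numpy as np
    from collections import Counter

    def plugin_mi(x, y):
        """Plug-in MI estimate (in nats) for two discrete sample sequences."""
        n = len(x)
        pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
        mi = 0.0
        for (a, b), c in pxy.items():
            # p(a,b) * log( p(a,b) / (p(a) p(b)) ), with empirical frequencies
            mi += (c / n) * np.log(c * n / (px[a] * py[b]))
        return mi

    rng = np.random.default_rng(0)
    x = rng.integers(0, 4, size=10_000)
    y = (x + rng.integers(0, 2, size=10_000)) % 4    # y depends on x
    print(plugin_mi(x, y))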
Introducing a differentiable measure of pointwise shared information
Partial information decomposition of the multivariate mutual information describes the distinct ways in which a set of source variables contains information about a target variable. The groundbreaking work of Williams and Beer has shown th…
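For two source variables S_1, S_2 and a target T, the Williams-Beer decomposition referred to here splits the joint mutual information into redundant (R), unique (U_1, U_2), and synergistic (S) parts,

    I(S_1, S_2; T) = R + U_1 + U_2 + S, \qquad
    I(S_1; T) = R + U_1, \qquad
    I(S_2; T) = R + U_2,

so that fixing any one of the four terms (here, a measure of shared information) determines the remaining three in the two-source case.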
Feature Selection with Conditional Mutual Information Considering Feature Interaction
Feature interaction is a newly proposed type of feature relevance relationship, and the unintentional removal of interactive features can result in poor classification performance. However, traditional feature selection algor…
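As a rough illustration of how conditional mutual information enters such selection schemes (this is the classic CMIM-style greedy rule, not the algorithm proposed in the paper), a candidate feature f is scored by the minimum of I(f; Y | s) over the features s already selected, so a feature survives only if it stays informative given each earlier choice. The sketch below uses naive plug-in estimates for discrete data.

    import numpy as np
    from collections import Counter

    def cmi_discrete(x, y, z):
        """Plug-in estimate of I(X;Y|Z) in nats for discrete sample sequences."""
        n = len(x)
        def H(*cols):
            p = np.array(list(Counter(zip(*cols)).values())) / n
            return -np.sum(p * np.log(p))
        # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
        return H(x, z) + H(y, z) - H(x, y, z) - H(z)

    def cmim_select(X, y, k):
        """Greedy CMIM-style selection of k columns from the discrete matrix X."""
        constant = np.zeros(len(y), dtype=int)      # conditioning on a constant gives plain MI
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < k and remaining:
            def score(j):
                if not selected:
                    return cmi_discrete(X[:, j], y, constant)
                return min(cmi_discrete(X[:, j], y, X[:, s]) for s in selected)
            best = max(remaining, key=score)
            selected.append(best)
            remaining.remove(best)
        return selected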
Conditional mutual information of bipartite unitaries and scrambling
One way to diagnose chaos in bipartite unitary channels is via the tripartite information of the corresponding Choi state, which for certain choices of the subsystems reduces to the negative conditional mutual information (CMI). We stud…
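For orientation, with S denoting the von Neumann entropy, the quantum conditional mutual information and the tripartite information used in this setting are

    I(A;B \mid C) = S(AC) + S(BC) - S(ABC) - S(C), \qquad
    I_3(A;B;C) = I(A;B) + I(A;C) - I(A;BC) = I(A;B) - I(A;B \mid C),

so the tripartite information reduces to the negative CMI whenever I(A;B) vanishes.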
Optimal short-term memory before the edge of chaos in driven random recurrent networks
The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory. The combination of a small input strength and mean-field assumptions makes it possible to de…
Conditional uncertainty principle
We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define condi…
A novel method of gene regulatory network structure inference from gene knock-out expression data
Inferring Gene Regulatory Networks (GRNs) structure from gene expression data has been a challenging problem in systems biology. It is critical to identify complicated regulatory relationships among genes for understanding regulatory mecha…
ClusterMI: Detecting High-Order SNP Interactions Based on Clustering and Mutual Information
Identifying single nucleotide polymorphism (SNP) interactions is considered a popular and crucial way of explaining the missing heritability of complex diseases in genome-wide association studies (GWAS). Many approaches have been propo…
Modelling and Quantifying Membership Information Leakage in Machine Learning
Machine learning models have been shown to be vulnerable to membership inference attacks, i.e., inferring whether individuals' data have been used for training models. The lack of understanding about the factors contributing to the success of these a…
Conditional mutual information and quantum steering
Quantum steering has recently been formalized in the framework of a resource theory of steering, and several quantifiers have already been introduced. Here, we propose an information-theoretic quantifier for steering called intrinsic st…
Model-Free Conditional Feature Screening with FDR Control
In this article, we propose a model-free conditional feature screening method with false discovery rate (FDR) control for ultra-high dimensional data. The proposed method is built upon a new measure of conditional independence. Thus, the n…
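For readers who only need the FDR side of the picture: the classical Benjamini-Hochberg step-up rule, sketched below on a generic vector of p-values, controls the false discovery rate at level q under the usual assumptions; the conditional-independence measure and the specific FDR procedure proposed in the paper are not reproduced here.

    import numpy as np

    def benjamini_hochberg(pvals, q=0.1):
        """Boolean mask of hypotheses rejected at FDR level q (BH step-up rule)."""
        p = np.asarray(pvals, dtype=float)
        m = len(p)
        order = np.argsort(p)
        passed = p[order] <= q * np.arange(1, m + 1) / m
        reject = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.max(np.nonzero(passed)[0])        # largest index meeting the threshold
            reject[order[:k + 1]] = True
        return reject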
Conditional Decoupling of Quantum Information
Insights from quantum information theory show that correlation measures based on quantum entropy are fundamental tools that reveal the entanglement structure of multipartite states. In that spirit, Groisman, Popescu, and Winter [Phys. Rev.…
Inferring Gene Regulatory Networks Using Conditional Regulation Pattern to Guide Candidate Genes
Combining path consistency (PC) algorithms with conditional mutual information (CMI) is widely used in the reconstruction of gene regulatory networks. CMI has many advantages over the Pearson correlation coefficient in measuring non-linear depend…
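Such PC+CMI pipelines often evaluate I(X;Y|Z) through the closed form that holds when the expression profiles are treated as jointly Gaussian, I(X;Y|Z) = ½ log( |Σ_XZ| |Σ_YZ| / ( |Σ_Z| |Σ_XYZ| ) ); the NumPy sketch below computes that quantity from samples and is an illustration of the general idea rather than this paper's exact procedure.

    import numpy as np

    def gaussian_cmi(x, y, Z):
        """I(X;Y|Z) in nats for samples x, y (1-D) and Z (samples x conditioning genes),
        under a joint Gaussian assumption."""
        def logdet_cov(*cols):
            cov = np.atleast_2d(np.cov(np.column_stack(cols), rowvar=False))
            return np.linalg.slogdet(cov)[1]
        if Z.shape[1] == 0:                          # empty conditioning set: plain Gaussian MI
            return 0.5 * (logdet_cov(x) + logdet_cov(y) - logdet_cov(x, y))
        return 0.5 * (logdet_cov(x, Z) + logdet_cov(y, Z)
                      - logdet_cov(Z) - logdet_cov(x, y, Z))

In a PC-style loop, an edge between two genes is removed once some conditioning set drives this quantity (or its significance) below a threshold.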
Error Exponents and α-Mutual Information
Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) Through Gallager’s E0 functions (with and without cost c…
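In the notation commonly used for a discrete memoryless channel W with input distribution P (which may differ slightly from the paper's), Gallager's E_0 function and Sibson's α-mutual information are

    E_0(\rho, P) = -\log \sum_{y} \Big( \sum_{x} P(x)\, W(y \mid x)^{\frac{1}{1+\rho}} \Big)^{1+\rho}, \qquad
    I_\alpha(X;Y) = \frac{\alpha}{\alpha - 1} \log \sum_{y} \Big( \sum_{x} P(x)\, W(y \mid x)^{\alpha} \Big)^{1/\alpha},

and substituting \alpha = 1/(1+\rho) gives E_0(\rho, P) = \rho\, I_{1/(1+\rho)}(X;Y), the identity that links the error-exponent and α-mutual-information viewpoints.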
CCMI : Classifier based Conditional Mutual Information Estimation
Conditional Mutual Information (CMI) is a measure of conditional dependence between random variables X and Y, given another random variable Z. It can be used to quantify conditional dependence among variables in many data-driven inference …
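The classifier trick behind this family of estimators can be sketched for plain (unconditional) mutual information: train a probabilistic classifier to tell joint samples (x, y) from samples in which y has been shuffled, then read the density ratio off its output. The code below is a generic illustration with a scikit-learn MLP; CCMI's treatment of the conditioning variable Z and its specific divergence estimator are not shown.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def classifier_mi(x, y, rng=None):
        """Rough MI estimate (nats) via a joint-vs-product classifier."""
        rng = rng or np.random.default_rng(0)
        x = np.asarray(x).reshape(len(x), -1)
        y = np.asarray(y).reshape(len(y), -1)
        joint = np.hstack([x, y])
        prod = np.hstack([x, y[rng.permutation(len(y))]])   # shuffling y breaks the dependence
        data = np.vstack([joint, prod])
        labels = np.r_[np.ones(len(joint)), np.zeros(len(prod))]
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
        clf.fit(data, labels)
        p = clf.predict_proba(joint)[:, 1].clip(1e-6, 1 - 1e-6)
        # With balanced classes, p/(1-p) approximates p(x,y) / (p(x) p(y));
        # averaging its log over joint samples gives the MI estimate.
        return float(np.mean(np.log(p / (1 - p))))

    rng = np.random.default_rng(1)
    x = rng.normal(size=5_000)
    y = x + rng.normal(size=5_000)                   # y is correlated with x
    print(classifier_mi(x, y, rng=rng))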
Low Redundancy Feature Selection of Short Term Solar Irradiance Prediction Using Conditional Mutual Information and Gauss Process Regression
Solar irradiation is influenced by many meteorological features, which results in a complex structure and makes its prediction inefficient and inaccurate. The existing prediction methods are focused on analyzing the correlation between …
Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entr…
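The underlying Rényi quantities can be stated briefly: for α > 0, α ≠ 1, the Rényi divergence between distributions P and Q is

    D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},

and Sibson's α-mutual information is the induced quantity I_\alpha(X;Y) = \min_{Q_Y} D_\alpha(P_{XY} \,\|\, P_X \otimes Q_Y), whose maximization over the input distribution is the subject of the paper.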
Reasoning About Generalization via Conditional Mutual Information
We provide an information-theoretic framework for studying the generalization properties of machine learning algorithms. Our framework ties together existing approaches, including uniform convergence bounds and recent methods for adaptive …
Conditional Mutual Information Neural Estimator
Several recent works in communication systems have proposed to leverage the power of neural networks in the design of encoders and decoders. In this approach, these blocks can be tailored to maximize the transmission rate based on aggre…
Submodular Combinatorial Information Measures with Applications in Machine Learning
Information-theoretic quantities like entropy and mutual information have found numerous uses in machine learning. It is well known that there is a strong connection between these entropic quantities and submodularity since entropy over a …
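The connection alluded to is that entropy, viewed as a set function on subsets of random variables, is submodular: for subsets A \subseteq B and any additional variable v,

    H(A \cup \{v\}) - H(A) \;\ge\; H(B \cup \{v\}) - H(B),

equivalently H(v \mid A) \ge H(v \mid B), since conditioning on more variables cannot increase entropy.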
On a General Definition of Conditional Rényi Entropies
In recent decades, different definitions of conditional Rényi entropy (CRE) have been introduced. Thus, Arimoto proposed a definition that found an application in information theory, Jizba and Arimitsu proposed a definition that found an a…
Quantifying non-Markovianity via conditional mutual information
In this paper, we study measures of quantum non-Markovianity based on the conditional mutual information. We obtain such measures by considering multiple parts of the total environment such that the conditional mutual information can be de…
Information-Theoretic Bias Reduction via Causal View of Spurious Correlation
We propose an information-theoretic bias measurement technique based on a causal interpretation of spurious correlation, which is effective for identifying feature-level algorithmic bias by taking advantage of conditional mutual information…
Low-dimensional approximation searching strategy for transfer entropy from non-uniform embedding
Transfer entropy from non-uniform embedding is a popular tool for the inference of causal relationships among dynamical subsystems. In this study we present an approach that makes use of low-dimensional conditional mutual information quant…
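Transfer entropy is itself a conditional mutual information: writing X^-_t and Y^-_t for the (possibly non-uniformly embedded) past states of the source and target processes,

    TE_{X \to Y} \;=\; I\!\left(Y_t \,;\, X^-_t \,\middle|\, Y^-_t\right),

which is why low-dimensional approximations of conditional mutual information terms translate directly into approximations of transfer entropy.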