Nihat Ay
Efficient optical coating design using an autoencoder-based neural network model
Optical thin-film coatings are integral to modern photonics and in particular to ultrafast lasers, providing precise control of dispersion and reflectivity, thus enabling tailored pulse shaping. Designing these coatings represents an inver…
Generalizing thermodynamic efficiency of interactions: inferential, information-geometric and computational perspectives
Self-organizing systems consume energy to generate internal order. The concept of thermodynamic efficiency, drawing from statistical physics and information theory, has previously been proposed to characterize a change in control parameter…
Wasserstein KL-divergence for Gaussian distributions
We introduce a new version of the KL-divergence for Gaussian distributions which is based on Wasserstein geometry and referred to as WKL-divergence. We show that this version is consistent with the geometry of the sample space $\mathbb{R}^n$…
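The snippet is cut off before the WKL-divergence itself is defined, so only the standard closed-form quantities behind the two geometries it brings together are reproduced here as background (textbook facts, not the paper's new divergence): for Gaussians $\mathcal{N}(\mu_0,\Sigma_0)$ and $\mathcal{N}(\mu_1,\Sigma_1)$ on $\mathbb{R}^n$,

$$
D_{\mathrm{KL}}\big(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\big)
= \tfrac{1}{2}\Big(\operatorname{tr}\!\big(\Sigma_1^{-1}\Sigma_0\big)
+ (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0) - n
+ \ln\frac{\det\Sigma_1}{\det\Sigma_0}\Big),
$$

$$
W_2^2\big(\mathcal{N}(\mu_0,\Sigma_0),\mathcal{N}(\mu_1,\Sigma_1)\big)
= \|\mu_0-\mu_1\|^2
+ \operatorname{tr}\!\Big(\Sigma_0+\Sigma_1-2\big(\Sigma_1^{1/2}\Sigma_0\Sigma_1^{1/2}\big)^{1/2}\Big).
$$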
Torsion of $α$-connections on the density manifold
We study the torsion of the $α$-connections defined on the density manifold in terms of a regular Riemannian metric. In the case of the Fisher-Rao metric our results confirm the fact that all $α$-connections are torsion free. For the $α$-c…
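For reference (standard information-geometry background, not this paper's contribution): in the Fisher-Rao case the $α$-connections are the Amari connections with Christoffel symbols

$$
\Gamma^{(\alpha)}_{ij,k}(\theta)
= \mathbb{E}_\theta\!\Big[\Big(\partial_i\partial_j \ell_\theta
+ \tfrac{1-\alpha}{2}\,\partial_i\ell_\theta\,\partial_j\ell_\theta\Big)\,\partial_k\ell_\theta\Big],
\qquad \ell_\theta = \log p(x;\theta),
$$

which are symmetric in $i,j$, so the torsion
$$
T^{\nabla}(X,Y) = \nabla_X Y - \nabla_Y X - [X,Y]
$$
vanishes; the question studied above is what happens when the Fisher-Rao metric is replaced by a general regular Riemannian metric on the density manifold.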
An Information-Theoretic Perspective on Acting Agents
Every embodied intelligent agent constantly interacts in its own way with its environment. It perceives information about the world, processes this information in its controller, e.g. its brain, and sends signals to the body, which then in…
Information geometry of the Otto metric
We introduce the dual of the mixture connection with respect to the Otto metric which represents a new kind of exponential connection. This provides a dual structure consisting of the mixture connection, the Otto metric as a Riemannian met…
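As background (a standard construction, not taken from the paper): Otto's metric on the manifold of positive densities identifies a tangent vector $\delta\rho$ at $\rho$ with a potential $\phi$ through $\delta\rho = -\nabla\!\cdot(\rho\,\nabla\phi)$ and sets

$$
g^{\mathrm{Otto}}_{\rho}(\delta\rho_1,\delta\rho_2)
= \int \rho\,\langle \nabla\phi_1,\nabla\phi_2\rangle\,dx,
$$

the weak Riemannian structure whose geodesic distance is the $2$-Wasserstein distance; the dual structure announced in the abstract pairs the mixture connection with its $g^{\mathrm{Otto}}$-dual connection.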
Analyzing Multimodal Integration in the Variational Autoencoder from an Information-Theoretic Perspective
Human perception is inherently multimodal. We integrate, for instance, visual, proprioceptive and tactile information into one experience. Hence, multimodal learning is of importance for building robotic systems that aim at robustly intera…
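As a point of reference for what multimodal integration in a variational autoencoder usually means (a generic objective, not necessarily the exact model analyzed in the paper): with two modalities $x_1, x_2$, a joint encoder $q_\phi(z\mid x_1,x_2)$ and per-modality decoders are trained on the evidence lower bound

$$
\mathcal{L}(\theta,\phi)
= \mathbb{E}_{q_\phi(z\mid x_1,x_2)}\big[\log p_\theta(x_1\mid z) + \log p_\theta(x_2\mid z)\big]
- D_{\mathrm{KL}}\big(q_\phi(z\mid x_1,x_2)\,\|\,p(z)\big),
$$

and an information-theoretic analysis then asks how the individual modalities contribute to the shared latent code $z$.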
A Concise Mathematical Description of Active Inference in Discrete Time
In this paper we present a concise mathematical description of active inference in discrete time. The main part of the paper serves as a basic introduction to the topic, including a detailed example of the action selection mechanism. The a…
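For orientation, the discrete-time quantities that such a description is built around (one common formulation, not a substitute for the paper's own exposition): a variational free energy over beliefs $q(s)$ about hidden states given observations $o$, and an expected free energy used to score policies $\pi$,

$$
F[q] = \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o,s)\big]
= D_{\mathrm{KL}}\big(q(s)\,\|\,p(s\mid o)\big) - \ln p(o),
$$

$$
G(\pi) = \sum_{\tau} \mathbb{E}_{q(o_\tau,s_\tau\mid\pi)}\big[\ln q(s_\tau\mid\pi) - \ln p(o_\tau,s_\tau)\big],
\qquad
q(\pi) \propto e^{-G(\pi)},
$$

so that action selection amounts to sampling from or maximizing over the policy posterior $q(\pi)$.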
Neural Network Method for Dielectric Optical Coating Design
We use neural networks to address the challenge of deriving dielectric coating designs from the desired optical properties. We show that our trained neural network can automatically design common laser mirror coating types efficiently.
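The abstract does not describe the architecture; the following is only an illustrative sketch of the general setup, regressing a stack of layer thicknesses from a target spectrum, with hypothetical layer counts, spectral grid, network sizes and placeholder data (none of this is taken from the paper).

```python
# Illustrative sketch (assumed architecture and data, not the paper's model):
# regress the thicknesses of a fixed dielectric layer stack from a target
# reflectivity spectrum sampled on a wavelength grid.
import torch
import torch.nn as nn

N_WAVELENGTHS = 256   # assumed number of spectral sampling points
N_LAYERS = 40         # assumed number of alternating high-/low-index layers

model = nn.Sequential(
    nn.Linear(N_WAVELENGTHS, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, N_LAYERS),
    nn.Softplus(),    # layer thicknesses must be positive
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder (spectrum, thickness) pairs; in practice such pairs would come
# from a thin-film forward solver, e.g. a transfer-matrix calculation.
spectra = torch.rand(1024, N_WAVELENGTHS)
thicknesses = torch.rand(1024, N_LAYERS)

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(spectra), thicknesses)
    loss.backward()
    optimizer.step()
```

Training data of this kind would normally be generated by forward simulation, so that the network learns an approximate inverse of the simulator.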
Outsourcing Control Requires Control Complexity
An embodied agent influences its environment and is influenced by it. We use the sensorimotor loop to model these interactions and quantify the information flows in the system by information-theoretic measures. This includes a measure for …
Inversion of Bayesian networks
Variational autoencoders and Helmholtz machines use a recognition network (encoder) to approximate the posterior distribution of a generative model (decoder). In this paper we establish some necessary and some sufficient properties of a re…
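In the simplest setting this refers to (one hidden layer; everything beyond the abstract is generic background), the generative model factorizes along the Bayesian network and the recognition network parametrizes the opposite orientation, ideally matching the exact posterior:

$$
p_\theta(x,z) = p_\theta(z)\,p_\theta(x\mid z),
\qquad
q_\phi(z\mid x) \;\approx\; p_\theta(z\mid x)
= \frac{p_\theta(z)\,p_\theta(x\mid z)}{\sum_{z'} p_\theta(z')\,p_\theta(x\mid z')}.
$$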
On the Natural Gradient of the Evidence Lower Bound
This article studies the Fisher-Rao gradient, also referred to as the natural gradient, of the evidence lower bound (ELBO) which plays a central role in generative machine learning. It reveals that the gap between the evidence and its lowe…
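The two objects named here, in their generic forms (standard definitions; the paper's specific results beyond the truncated sentence are left aside): the evidence lower bound and the Fisher-Rao (natural) gradient of a smooth function $L$ of the parameters,

$$
\mathrm{ELBO}(\theta,\phi;x)
= \mathbb{E}_{q_\phi(z\mid x)}\big[\log p_\theta(x,z) - \log q_\phi(z\mid x)\big]
= \log p_\theta(x) - D_{\mathrm{KL}}\big(q_\phi(z\mid x)\,\|\,p_\theta(z\mid x)\big),
$$

$$
\widetilde{\nabla} L(\theta) = F(\theta)^{-1}\,\nabla L(\theta),
\qquad
F(\theta) = \mathbb{E}_{p_\theta}\big[\nabla_\theta \log p_\theta\,(\nabla_\theta \log p_\theta)^{\top}\big];
$$

the second equality makes explicit that the gap between the evidence and its lower bound is exactly the KL-divergence between the variational posterior and the true posterior.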
Inversion of Bayesian Networks
Variational autoencoders and Helmholtz machines use a recognition network (encoder) to approximate the posterior distribution of a generative model (decoder). In this paper we study the necessary and sufficient properties of a recognition …
Outsourcing Control requires Control Complexity
An embodied agent constantly influences its environment and is influenced by it. We use the sensorimotor loop to model these interactions and thereby we can quantify different information flows in the system by various information theoreti…
How Morphological Computation Shapes Integrated Information in Embodied Agents
The Integrated Information Theory provides a quantitative approach to consciousness and can be applied to neural networks. An embodied agent controlled by such a network influences and is being influenced by its environment. This involves,…
Preface
ISSN: 2305-2228
Confounding Ghost Channels and Causality: A New Approach to Causal Information Flows
Information theory provides a fundamental framework for the quantification of information flows through channels, formally Markov kernels. However, quantities such as mutual information and conditional mutual information do not necessarily…
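For completeness, the two quantities the abstract contrasts with genuinely causal notions (standard definitions for discrete variables):

$$
I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
\qquad
I(X;Y\mid Z) = \sum_{x,y,z} p(x,y,z)\,\log\frac{p(x,y\mid z)}{p(x\mid z)\,p(y\mid z)};
$$

both depend only on the joint observational distribution, not on the interventional structure of the underlying channels.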
Information Geometry & Complexity Science
In the first part of my lecture, I will review information-geometric structures and highlight the important role of divergences. I will present a novel approach to canonical divergences which extends the classical definition and recovers, …
Complexity as causal information integration
Complexity measures in the context of the Integrated Information Theory of consciousness, developed mainly by Tononi [7], try to assess the strength of the causal connections between different neurons. This is done by minimizing the Kullbac…
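Schematically, and without claiming the paper's exact choice of split family, the construction mentioned here projects the full joint distribution onto a set of split models,

$$
\Phi(p) \;=\; \min_{q\,\in\,\mathcal{M}_{\mathrm{split}}} D_{\mathrm{KL}}\big(p\,\|\,q\big),
$$

where $\mathcal{M}_{\mathrm{split}}$ consists of distributions in which selected causal connections between the parts of the system have been cut; the value of the minimum then serves as the complexity measure.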
Information decomposition based on cooperative game theory
We offer a new approach to the information decomposition problem in information theory: given a 'target' random variable co-distributed with multiple 'source' variables, how can we decompose the mutual information into a sum of non-negativ…
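Cooperative game theory typically enters such a decomposition through a Shapley-type allocation rule; as a reference point (the standard definition, not necessarily the exact allocation used in the paper), the Shapley value of source $i$ in a set $N$ of sources, for a coalition value function $v$, is

$$
\phi_i(v) \;=\; \sum_{S\,\subseteq\,N\setminus\{i\}}
\frac{|S|!\,\big(|N|-|S|-1\big)!}{|N|!}\,\Big(v\big(S\cup\{i\}\big) - v(S)\Big),
$$

and choosing, for example, $v(S) = I(T; X_S)$ makes the allocations sum to the total mutual information $I(T; X_N)$.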
Ingredients for robustness
A core property of robust systems is given by the invariance of their function against the removal of some of their structural components. This intuition has been formalised in the context of input–output maps, thereby introducing the noti…
On the locality of the natural gradient for learning in deep Bayesian networks
We study the natural gradient method for learning in deep Bayesian networks, including neural networks. There are two natural geometries associated with such learning systems consisting of visible and hidden units. One geometry is related …
Complexity as causal information integration
Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system a…