Antoine Ledent
Conv4Rec: A 1-by-1 Convolutional AutoEncoder for User Profiling through Joint Analysis of Implicit and Explicit Feedbacks
We introduce a new convolutional AutoEncoder architecture for user modelling and recommendation tasks with several improvements over the state of the art. Firstly, our model has the flexibility to learn a set of associations and combinatio…
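As a rough illustration of the mechanics only (this is not the paper's architecture, and the channel semantics below are an assumption made for the example), a 1-by-1 convolutional autoencoder transforms each position independently while mixing channels, e.g. an explicit-rating channel and an implicit "interacted or not" channel:

# Minimal sketch, assuming a (batch, channels, num_items) layout where
# channel 0 holds explicit ratings and channel 1 implicit feedback.
# All convolutions use kernel_size=1, so items are processed independently
# while the channels (feedback types) are mixed and compressed.
import torch
import torch.nn as nn

class OneByOneConvAutoEncoder(nn.Module):
    def __init__(self, in_channels=2, hidden_channels=32, latent_channels=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, hidden_channels, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(hidden_channels, latent_channels, kernel_size=1),
        )
        self.decoder = nn.Sequential(
            nn.Conv1d(latent_channels, hidden_channels, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(hidden_channels, in_channels, kernel_size=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Example: 4 users, 2 feedback channels, 100 items.
x = torch.randn(4, 2, 100)
reconstruction = OneByOneConvAutoEncoder()(x)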
Generalization Analysis for Supervised Contrastive Representation Learning under Non-IID Settings
Contrastive Representation Learning (CRL) has achieved impressive success in various domains in recent years. Nevertheless, the theoretical understanding of the generalization behavior of CRL has remained limited. Moreover, to the best of …
Generalization Analysis for Deep Contrastive Representation Learning
In this paper, we present generalization bounds for the unsupervised risk in the Deep Contrastive Representation Learning framework, which employs deep neural networks as representation functions. We approach this problem from two angles. …
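For context, a common formalization of the unsupervised contrastive risk in this line of work (the snippet does not show which exact variant the paper analyzes) takes $f$ to be the deep representation, $x^{+}$ a positive example sharing the latent class of $x$, $x^{-}_{1},\dots,x^{-}_{k}$ negatives, and $\ell$ a loss such as the logistic loss:
\[
L_{\mathrm{un}}(f)
= \mathbb{E}_{x,\,x^{+},\,x^{-}_{1},\dots,x^{-}_{k}}
\Big[\ell\Big(\big\{\, f(x)^{\top}\big(f(x^{+}) - f(x^{-}_{i})\big)\,\big\}_{i=1}^{k}\Big)\Big].
\]
Generalization bounds then control the gap between this population risk and its empirical counterpart over the sampled tuples.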
Explainable Neural Networks with Guarantees: A Sparse Estimation Approach
Balancing predictive power and interpretability has long been a challenging research area, particularly in powerful yet complex models like neural networks, where nonlinearity obstructs direct interpretation. This paper introduces a novel …
Interpretable Tensor Fusion
Conventional machine learning methods are predominantly designed to predict outcomes based on a single data type. However, practical applications may encompass data of diverse types, such as text, images, and audio. We introduce interpreta…
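For illustration, a generic tensor-fusion step builds multiplicative interaction features from per-modality vectors via an outer product; this is a common fusion pattern, not necessarily the paper's interpretable variant:

import numpy as np

# Generic tensor fusion of two modalities: append a constant 1 to each
# feature vector (so unimodal terms survive) and take the outer product,
# yielding all pairwise cross-modal interaction features.
def tensor_fusion(text_feat: np.ndarray, image_feat: np.ndarray) -> np.ndarray:
    t = np.concatenate([text_feat, [1.0]])   # shape (d_t + 1,)
    v = np.concatenate([image_feat, [1.0]])  # shape (d_v + 1,)
    return np.outer(t, v).ravel()            # shape ((d_t + 1) * (d_v + 1),)

fused = tensor_fusion(np.random.randn(8), np.random.randn(16))
print(fused.shape)  # (153,)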
Generalization Bounds for Inductive Matrix Completion in Low-Noise Settings
We study inductive matrix completion (matrix completion with side information) under an i.i.d. subgaussian noise assumption at a low noise regime, with uniform sampling of the entries. We obtain for the first time generalization bounds wit…
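A standard way to set up inductive matrix completion consistent with this description (the exact estimator studied may differ) uses side-information matrices $X$ and $Y$, a uniformly sampled set $\Omega$ of observed entries, and observations $M_{ij} = (X A^{*} Y^{\top})_{ij} + \xi_{ij}$ with i.i.d. subgaussian noise $\xi_{ij}$, and solves
\[
\widehat{A} \in \arg\min_{A}\;
\frac{1}{|\Omega|}\sum_{(i,j)\in\Omega}\Big( (X A Y^{\top})_{ij} - M_{ij} \Big)^{2}
\;+\; \lambda\,\|A\|_{*},
\]
where $\|\cdot\|_{*}$ denotes the nuclear norm; generalization bounds then quantify how well $X\widehat{A}Y^{\top}$ predicts the unobserved entries.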
Beyond Smoothness: Incorporating Low-Rank Analysis into Nonparametric Density Estimation
The construction and theoretical analysis of the most popular universally consistent nonparametric density estimators hinge on one functional property: smoothness. In this paper we investigate the theoretical implications of incorporating …
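One common way to formalize low-rank structure for a bivariate density (the paper's precise assumptions may be more general) is a finite nonnegative factorization
\[
p(x, y) \;=\; \sum_{k=1}^{r} w_{k}\, f_{k}(x)\, g_{k}(y),
\qquad w_{k} \ge 0,\;\; \sum_{k=1}^{r} w_{k} = 1,
\]
with $f_{k}$ and $g_{k}$ univariate densities; a rank-$r$ structure of this kind can be exploited by estimators that factorize across variables rather than smoothing jointly over all of them.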
Orthogonal Inductive Matrix Completion
We propose orthogonal inductive matrix completion (OMIC), an interpretable approach to matrix completion based on a sum of multiple orthonormal side information terms, together with nuclear-norm regularization. The approach allows us to in…
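A schematic reading of "a sum of multiple orthonormal side information terms with nuclear-norm regularization" (the paper's precise parametrization may differ) is
\[
\widehat{M} \;=\; \sum_{k=1}^{K} U^{(k)} B^{(k)} \big(V^{(k)}\big)^{\top},
\qquad
\text{with penalty } \sum_{k=1}^{K} \lambda_{k}\,\big\|B^{(k)}\big\|_{*},
\]
where each $U^{(k)}$ and $V^{(k)}$ has orthonormal columns built from side information and the core matrices $B^{(k)}$ are learned; keeping the terms separate is what allows the contribution of each side-information source to be read off individually.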
Learning Interpretable Concept Groups in CNNs
We propose a novel training methodology, Concept Group Learning (CGL), that encourages training of interpretable CNN filters by partitioning filters in each layer into concept groups, each of which is trained to learn a single vis…
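An illustrative sketch of the grouping idea (the actual CGL objective is specified in the paper; the penalty below is a generic group-structured regularizer written only for this example):

import torch

# Partition a conv layer's channels into "concept groups" and compute an
# L2,1-style penalty over per-group activation energy, so that for a given
# image only a few groups respond strongly.
def concept_group_penalty(feature_maps: torch.Tensor, num_groups: int) -> torch.Tensor:
    # feature_maps: (batch, channels, H, W); channels must divide evenly into groups.
    b, c, h, w = feature_maps.shape
    groups = feature_maps.view(b, num_groups, c // num_groups, h, w)
    group_energy = groups.pow(2).mean(dim=(2, 3, 4)).sqrt()  # (batch, num_groups)
    return group_energy.sum(dim=1).mean()  # sum of group norms, averaged over the batch

penalty = concept_group_penalty(torch.randn(4, 64, 14, 14), num_groups=8)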
Fine-grained analysis of structured output prediction
MK, AL and WM acknowledge support by the German Research Foundation (DFG) award KL 2698/2-1 and by the German Federal Ministry of Science and Education (BMBF) awards 01IS18051A, 031B0770E, and 01MK20014U. YL acknowledges support by NSFC under…
Fine-grained Generalization Analysis of Structured Output Prediction
In machine learning we often encounter structured output prediction problems (SOPPs), i.e. problems where the output space admits a rich internal structure. Application domains where SOPPs naturally occur include natural language processin…
Fine-grained Generalization Analysis of Vector-Valued Learning
Many fundamental machine learning tasks can be formulated as a problem of learning with vector-valued functions, where we learn multiple scalar-valued functions together. Although there is some generalization analysis on different specific…
Norm-Based Generalisation Bounds for Deep Multi-Class Convolutional Neural Networks
We show generalisation error bounds for deep learning with two main improvements over the state of the art. (1) Our bounds have no explicit dependence on the number of classes except for logarithmic factors. This holds even when formulatin…
Model Uncertainty Guides Visual Object Tracking
Modern object trackers largely rely on the online learning of a discriminative classifier from potentially diverse sample frames. However, noisy or insufficient amounts of samples can deteriorate the classifier's performance and cause track…
Norm-based generalisation bounds for multi-class convolutional neural networks
We show generalisation error bounds for deep learning with two main improvements over the state of the art. (1) Our bounds have no explicit dependence on the number of classes except for logarithmic factors. This holds even when formulatin…