Deng-Bao Wang
Investigating the confidence calibratability of deep neural networks
Distilling Reliable Knowledge for Instance-Dependent Partial Label Learning
Partial label learning (PLL) refers to the classification task where each training instance is ambiguously annotated with a set of candidate labels. Despite substantial advancements in tackling this challenge, limited attention has been de…
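To make the PLL setting above concrete (this is only a toy sketch of the data representation, not the distillation method proposed in the paper; all array values are made up), a candidate label set can be encoded as a multi-hot row and turned into a uniform soft target:

```python
import numpy as np

# Hypothetical toy example: 4 training instances, 5 classes.
# In PLL, each row marks the labels that *might* be correct for that
# instance (exactly one of them is the true label).
candidate_labels = np.array([
    [1, 1, 0, 0, 0],   # true label is class 0 or 1
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 1],
])

# A simple baseline treatment: consider every candidate equally likely,
# giving a soft target distribution a classifier can be trained against.
soft_targets = candidate_labels / candidate_labels.sum(axis=1, keepdims=True)
print(soft_targets)
```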
Learning From Noisy Labels via Dynamic Loss Thresholding
Numerous studies have shown that deep neural networks (DNNs) can eventually fit everything, even data with noisy labels, resulting in poor generalization performance. However, recent studies suggest that DNNs tend to gradually me…
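As a rough illustration of loss-based sample selection (a generic small-loss filtering step, not the paper's dynamic thresholding procedure; the loss values and the percentile rule are made up for the example):

```python
import numpy as np

def select_clean_samples(per_sample_loss, threshold):
    """Keep only samples whose current loss falls below a threshold.

    In practice the threshold would be updated over training, e.g. from
    loss statistics collected in earlier epochs.
    """
    per_sample_loss = np.asarray(per_sample_loss)
    keep_mask = per_sample_loss < threshold
    return np.nonzero(keep_mask)[0]

# Hypothetical per-sample cross-entropy losses after a few warm-up epochs.
losses = [0.12, 2.31, 0.08, 1.75, 0.40]
threshold = np.percentile(losses, 60)     # example rule: keep roughly the smallest 60%
print(select_clean_samples(losses, threshold))   # indices of presumed clean samples
```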
Partial-Label Regression
Partial-label learning is a popular weakly supervised learning setting that allows each training example to be annotated with a set of candidate labels. Previous studies on partial-label learning only focused on the classification setting …
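For intuition only, a partial-label regression instance carries a set of candidate real-valued targets; a naive baseline (not the paper's method, values invented for the sketch) is to regress toward the mean of each candidate set:

```python
import numpy as np

# Hypothetical data: each instance has a set of candidate real-valued
# labels, only one of which is the true target.
candidate_targets = [
    [2.3, 2.5],          # candidate sets may differ in size
    [0.9, 1.4, 1.1],
    [5.0, 4.8],
]

# Naive proxy: average the candidates and treat the mean as a provisional target.
proxy_targets = np.array([np.mean(c) for c in candidate_targets])
print(proxy_targets)
```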
Partial label learning with emerging new labels
Learning from Complementary Labels via Partial-Output Consistency Regularization
In complementary-label learning (CLL), a multi-class classifier is learned from training instances each associated with complementary labels, which specify the classes that the instance does not belong to. Previous studies focus on unbiase…
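To illustrate the CLL setting only (this naive conversion is not the consistency-regularization approach of the paper; the label sets below are made up), complementary labels can be turned into soft targets by spreading probability mass over the classes that remain possible:

```python
import numpy as np

num_classes = 5

# Hypothetical complementary labels: classes each instance is known NOT to belong to.
complementary = [
    [2],        # instance 0 is not class 2
    [0, 4],     # instance 1 is neither class 0 nor class 4
    [1, 2, 3],  # instance 2 is not class 1, 2, or 3
]

# Naive conversion: uniform distribution over the still-possible classes.
targets = np.ones((len(complementary), num_classes))
for i, excluded in enumerate(complementary):
    targets[i, excluded] = 0.0
targets /= targets.sum(axis=1, keepdims=True)
print(targets)
```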
Learning from Noisy Labels with Complementary Loss Functions
Recent studies reveal that deep neural networks are sensitive to label noise, which leads to poor generalization performance in some tasks. Although different robust loss functions have been proposed to remedy this issue, they suffer …
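As an example of a noise-robust loss from the wider literature (shown purely for context; it is not the complementary loss construction proposed in this paper, and the probabilities below are invented), the generalized cross-entropy loss interpolates between MAE and cross-entropy:

```python
import numpy as np

def generalized_cross_entropy(probs, labels, q=0.7):
    """Generalized cross-entropy (Zhang & Sabuncu, 2018).

    Interpolates between MAE (q = 1) and cross-entropy (q -> 0),
    trading noise robustness against fitting ability.
    """
    p_true = probs[np.arange(len(labels)), labels]
    return np.mean((1.0 - p_true ** q) / q)

# Hypothetical softmax outputs for 3 samples over 4 classes.
probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.25, 0.25, 0.25, 0.25],
    [0.05, 0.05, 0.80, 0.10],
])
labels = np.array([0, 2, 2])
print(generalized_cross_entropy(probs, labels))
```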
Multi-View Multi-Label Learning with View-Specific Information Extraction
Multi-view multi-label learning serves as an important framework for learning from objects with diverse representations and rich semantics. Existing multi-view multi-label learning techniques focus on exploiting a shared subspace for fusing multi-v…