arXiv (Cornell University)
Improving Label Error Detection and Elimination with Uncertainty Quantification
May 2024 • Johannes Jakubik, Michael Vössing, Manil Maskey, Christopher Wölfle, Gerhard Satzger
Identifying and handling label errors can significantly enhance the accuracy of supervised machine learning models. Recent approaches for identifying label errors show that a model's low self-confidence in a given label is a good indicator that the label is erroneous. However, this line of work has relied on softmax probabilities to measure self-confidence. In this paper, we argue that, because softmax probabilities do not accurately reflect a model's predictive uncertainty, label error detection…
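The self-confidence signal the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's method: it scores each sample by the predicted probability of its assigned label and flags samples below a threshold; the function name, threshold, and toy data are assumptions for the example.

```python
import numpy as np

def flag_label_errors(probs, labels, threshold=0.5):
    """Flag potential label errors via low self-confidence.

    probs: (n_samples, n_classes) predicted class probabilities.
    labels: (n_samples,) assigned (possibly noisy) labels.
    Returns indices of samples whose predicted probability for
    their assigned label falls below the threshold.
    """
    # Self-confidence: probability the model assigns to the given label.
    self_conf = probs[np.arange(len(labels)), labels]
    return np.where(self_conf < threshold)[0]

# Toy example: the model strongly disagrees with the third sample's label.
probs = np.array([[0.90, 0.10],
                  [0.20, 0.80],
                  [0.85, 0.15]])
labels = np.array([0, 1, 1])  # third label conflicts with the prediction
print(flag_label_errors(probs, labels))  # -> [2]
```

The paper argues that replacing these raw softmax probabilities with a better-calibrated uncertainty estimate improves this kind of detection.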