Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross Girshick, Kaiming He
2020 · Open Access · DOI: https://doi.org/10.48550/arxiv.2003.04297 · OA: W3009561768
Contrastive unsupervised learning has recently shown encouraging progress, e.g., in Momentum Contrast (MoCo) and SimCLR. In this note, we verify the effectiveness of two of SimCLR's design improvements by implementing them in the MoCo framework. With simple modifications to MoCo---namely, using an MLP projection head and more data augmentation---we establish stronger baselines that outperform SimCLR and do not require large training batches. We hope this will make state-of-the-art unsupervised learning research more accessible. Code will be made public.
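For concreteness, below is a minimal PyTorch sketch of the two modifications the abstract names: a 2-layer MLP projection head in place of a single linear layer, and a stronger augmentation recipe that adds blur on top of the usual crop/flip. It assumes a ResNet-50-style backbone with 2048-dimensional pooled features; the class names, transform parameters, and dimensions are illustrative assumptions, not the authors' released code.

```python
# Sketch of the two MoCo modifications described in the abstract:
# (1) an MLP projection head, (2) stronger data augmentation.
# Names and parameter choices are illustrative assumptions.
import random

import torch.nn as nn
import torchvision.transforms as T
from PIL import ImageFilter


class GaussianBlur:
    """Randomly blur a PIL image with a sigma drawn from a range."""

    def __init__(self, sigma=(0.1, 2.0)):
        self.sigma = sigma

    def __call__(self, img):
        sigma = random.uniform(self.sigma[0], self.sigma[1])
        return img.filter(ImageFilter.GaussianBlur(radius=sigma))


# "More data augmentation": color jitter, grayscale, and blur
# added on top of the crop/flip pipeline.
augmentation = T.Compose([
    T.RandomResizedCrop(224, scale=(0.2, 1.0)),
    T.RandomApply([T.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
    T.RandomGrayscale(p=0.2),
    T.RandomApply([GaussianBlur()], p=0.5),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


def mlp_projection_head(in_dim=2048, out_dim=128):
    """2-layer MLP head replacing the single linear projection layer."""
    return nn.Sequential(
        nn.Linear(in_dim, in_dim),
        nn.ReLU(inplace=True),
        nn.Linear(in_dim, out_dim),
    )
```

In the MoCo framework, both the query encoder and the momentum-updated key encoder would end in such a head, and the augmentation is applied independently to produce the two views of each image.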