Paper Notes - Contrastive Learning in NLP

paper list:

  • An Efficient Framework for Learning Sentence Representations
  • CLEAR: Contrastive Learning for Sentence Representation
  • DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations
  • SimCSE: Simple Contrastive Learning of Sentence Embeddings
  • R-Drop: Regularized Dropout for Neural Networks
  • COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
  • Learning Dense Representations of Phrases at Scale
  • An Unsupervised Sentence Embedding Method by Mutual Information Maximization
  • Representation Degeneration Problem in Training Natural Language Generation Models
  • CosReg: Improving Neural Language Generation with Spectrum Control
  • CLAPS: Contrastive Learning with Adversarial Perturbations for Conditional Text Generation
  • CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding
  • Interpretable Adversarial Perturbation in Input Embedding Space for Text
  • Contrastive Learning for Many-to-many Multilingual Neural Machine Translation
  • Unsupervised Data Augmentation for Consistency Training
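Most of the sentence-representation papers above (SimCSE, CLEAR, DeCLUTR, etc.) train with an InfoNCE-style objective: pull two views of the same sentence together and push apart all other sentences in the batch. A minimal NT-Xent sketch in NumPy (illustrative only; function name, shapes, and default temperature are my own choices, not taken from any single paper):

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: (N, D) embeddings of two views of the same N sentences;
    row i of z1 and row i of z2 form a positive pair, and every other
    row in the batch acts as a negative.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)      # (2N, D)
    sim = z @ z.T / temperature               # (2N, 2N) similarity matrix
    np.fill_diagonal(sim, -np.inf)            # mask self-similarity
    # the positive for row i is row i+N (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

The listed papers differ mainly in where the two views come from (dropout in SimCSE and R-Drop, text augmentations in CLEAR, nearby spans in DeCLUTR), not in the shape of this loss.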

Paper Notes - Image-Based Contrastive Learning

  • SimCLR
  • MoCo
  • BYOL
  • Swin-ssl
  • BraVe
  • What Makes for Good Views for Contrastive Learning?
  • BYOL works even without batch statistics
  • Understanding Self-Supervised Learning Dynamics without Contrastive Pairs
  • Big Self-Supervised Models are Strong Semi-Supervised Learners
  • Understanding Contrastive Representation Learning Through Alignment and Uniformity on the Hypersphere
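The last entry above characterizes contrastive representations with two metrics: alignment (positive pairs should be close) and uniformity (embeddings should spread evenly on the unit hypersphere). A NumPy sketch of both, following the commonly cited definitions; the function names and defaults are my own:

```python
import numpy as np

def alignment(x, y, alpha=2):
    """Alignment: mean distance between positive pairs (lower is better).

    x, y: (N, D) L2-normalized embeddings, row i of each a positive pair.
    """
    return np.mean(np.linalg.norm(x - y, axis=1) ** alpha)

def uniformity(x, t=2):
    """Uniformity: log of the mean Gaussian potential over all distinct
    pairs; lower values mean embeddings spread more evenly on the sphere.
    """
    # pairwise squared Euclidean distances between all rows of x
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    mask = ~np.eye(x.shape[0], dtype=bool)    # exclude self-distances
    return np.log(np.mean(np.exp(-t * sq[mask])))
```

A collapsed representation (all points identical) scores perfectly on alignment but poorly on uniformity, which is why the paper argues both must be measured together.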