Paper Notes: Contrastive Learning in NLP
Paper list:
- An Efficient Framework for Learning Sentence Representations
- CLEAR: Contrastive Learning for Sentence Representation
- DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations
- SimCSE: Simple Contrastive Learning of Sentence Embeddings
- R-Drop: Regularized Dropout for Neural Networks
- COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
- Learning Dense Representations of Phrases at Scale
- An Unsupervised Sentence Embedding Method by Mutual Information Maximization
- Representation Degeneration Problem in Training Natural Language Generation Models
- CosReg: Improving Neural Language Generation with Spectrum Control
- CLAPS: Contrastive Learning with Adversarial Perturbations for Conditional Text Generation
- CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding
- Adversarial Perturbation: Interpretable Adversarial Perturbation in Input Embedding Space for Text
- Contrastive Learning for Many-to-many Multilingual Neural Machine Translation
- Unsupervised Data Augmentation for Consistency Training
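
Several of the papers above (e.g. SimCSE, CLEAR, DeCLUTR) share the same core objective: an InfoNCE / NT-Xent loss over two views of each sentence with in-batch negatives. Below is a minimal PyTorch sketch of that loss for reference while reading; the temperature value, embedding dimensions, and the random tensors standing in for encoder outputs are illustrative assumptions, not any single paper's exact setup.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor,
                  temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE with in-batch negatives.

    z1[i] and z2[i] are two views of sentence i (e.g. two dropout-augmented
    encoder passes in SimCSE, or two augmented spans in CLEAR/DeCLUTR).
    For anchor z1[i], the positive is z2[i]; the negatives are z2[j], j != i.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Cosine-similarity matrix scaled by temperature: sim[i, j] = cos(z1[i], z2[j]) / t
    sim = z1 @ z2.T / temperature
    # Row i's correct "class" is column i, i.e. its own positive pair.
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

if __name__ == "__main__":
    # Toy usage: random embeddings stand in for two encoder passes over a batch of 8.
    z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
    print(info_nce_loss(z1, z2))
```

The papers differ mainly in how the two views are produced (dropout noise, span sampling, back-translation, adversarial perturbation) and in where the loss is applied (sentence embeddings, pretraining sequences, or generation), rather than in this objective itself.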