[Paper Notes] BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

1. Pretraining: BART is trained as a denoising autoencoder, a standard sequence-to-sequence Transformer whose bidirectional encoder reads corrupted text and whose autoregressive decoder reconstructs the original document, optimizing the cross-entropy between the decoder output and the original text.

2. Noise transformations: the paper compares token masking, token deletion, text infilling (each sampled span is replaced by a single mask token, with span lengths drawn from a Poisson distribution with λ = 3), sentence permutation, and document rotation; text infilling performs best overall. A sketch of text infilling follows the list.

3. Finetuning: for sequence classification the same uncorrupted input is fed to both encoder and decoder and the final decoder hidden state serves as the representation; generation tasks such as summarization fine-tune the whole seq2seq model directly; for machine translation a small, newly initialized encoder is trained to map foreign-language input into BART's input space. See the fine-tuning sketch below.
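
Below is a minimal sketch of the text-infilling corruption, assuming whitespace tokens and a `<mask>` string as a stand-in for the real mask token; the function name `text_infill` and the 30% corruption budget are illustrative, not taken from the paper's released code (which lives in fairseq).

```python
import random

import numpy as np

def text_infill(tokens, mask_ratio=0.3, poisson_lambda=3.0, mask_token="<mask>"):
    """Replace randomly sampled spans with a single mask token each.

    Span lengths come from Poisson(lambda); a length-0 span inserts a mask
    without removing anything, matching the paper's description. This sketch
    ignores details such as preventing adjacent spans from merging.
    """
    tokens = list(tokens)
    budget = int(round(len(tokens) * mask_ratio))  # how many tokens to corrupt
    while budget > 0 and tokens:
        span = min(int(np.random.poisson(poisson_lambda)), budget, len(tokens))
        start = random.randrange(len(tokens) - span + 1)
        tokens[start:start + span] = [mask_token]  # whole span -> one mask
        budget -= max(span, 1)  # count a 0-length (insertion) span as 1
    return tokens

print(text_infill("the quick brown fox jumps over the lazy dog".split()))
# e.g. ['the', '<mask>', 'brown', 'fox', '<mask>', 'the', 'lazy', 'dog']
```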

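For fine-tuning on a generation task, a single training step might look like the following. This assumes the Hugging Face `transformers` and `torch` libraries and the public `facebook/bart-base` checkpoint; the paper's own experiments use fairseq, so this is a convenient stand-in rather than the authors' setup.

```python
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# One seq2seq step: source text in, target text out (for pretraining the
# source would be the corrupted document and the target the original).
src = tokenizer(["The quick brown <mask> over the lazy dog."], return_tensors="pt")
tgt = tokenizer(["The quick brown fox jumps over the lazy dog."], return_tensors="pt")

outputs = model(input_ids=src["input_ids"],
                attention_mask=src["attention_mask"],
                labels=tgt["input_ids"])  # decoder inputs are shifted internally
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```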
Paper: https://arxiv.org/abs/1910.13461
