BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
paper: arXiv:1910.13461 (https://arxiv.org/abs/1910.13461)
docs: https://huggingface.co/transformers/model_doc/bart.html (usage sketch below)
finetuning: examples/seq2seq/ (as of Aug 20, 2020)
metrics: ROUGE-2 > 22 on XSum
variants: search for distilbart
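A minimal inference sketch for the docs link above, using the transformers library. The facebook/bart-large-xsum checkpoint is an assumed choice (these notes name no checkpoint); any BART summarization checkpoint, including the distilbart variants, loads the same way:

# Minimal BART summarization sketch (transformers library).
# facebook/bart-large-xsum is an assumed checkpoint, not one named
# in these notes; swap in a distilbart checkpoint the same way.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large-xsum"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = "BART is a denoising autoencoder for pretraining sequence-to-sequence models."
inputs = tokenizer([article], max_length=1024, truncation=True, return_tensors="pt")

# Beam search decoding; summarization ROUGE scores are reported with beam search.
summary_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=60, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

The beam size and max length here are illustrative defaults, not the exact generation settings behind the ROUGE numbers above.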