BERT and the Pretrain-Finetune Paradigm

Question 1 of 3
The pretrain-then-finetune paradigm popularized by BERT (2018) has become standard in NLP. What is the key advantage over training a separate model from scratch for each task?
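To make the paradigm concrete, here is a minimal, hypothetical sketch in plain Python (the function names and dict-based "weights" are illustrative stand-ins, not any real library's API). One expensive pretraining step produces shared encoder weights; each downstream task then reuses those weights and only adds a small task-specific head, which is why finetuning is far cheaper and more data-efficient than training from scratch per task.

```python
import random

random.seed(0)


def pretrain(vocab_size=8, hidden=4):
    """Simulate the expensive, one-time self-supervised pretraining step.

    Returns shared encoder weights (here, just a random embedding table).
    """
    return {
        "embeddings": [
            [random.random() for _ in range(hidden)] for _ in range(vocab_size)
        ]
    }


def finetune(encoder, num_labels):
    """Simulate the cheap per-task step: reuse the shared encoder,
    initialize and train only a small classification head."""
    hidden = len(encoder["embeddings"][0])
    head = {"classifier": [[0.0] * num_labels for _ in range(hidden)]}
    # The full task model is the shared pretrained encoder plus a new head.
    return {**encoder, **head}


encoder = pretrain()  # done once, at large cost
sentiment_model = finetune(encoder, num_labels=2)  # e.g. positive/negative
ner_model = finetune(encoder, num_labels=9)        # e.g. BIO entity tags

# Both task models share the exact same pretrained encoder weights;
# only their small heads differ.
assert sentiment_model["embeddings"] is encoder["embeddings"]
assert ner_model["embeddings"] is encoder["embeddings"]
```

The key point the sketch illustrates: general language knowledge is learned once from large unlabeled corpora, so each task needs only a small labeled dataset and a short finetuning run rather than a full training run of its own.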