The pretraining task combines randomly shuffling the order of the original sentences with a novel in-filling scheme,
in which spans of text are replaced with a single mask token.
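
The snippet below is a minimal, illustrative sketch of these two corruption steps, not the fairseq noising code used to train the released checkpoints. The helper names `permute_sentences` and `text_infilling` are hypothetical, the 0.3 mask ratio and Poisson(λ=3) span lengths follow the BART paper's description, and tokenization is reduced to whitespace/period splitting for readability:

```python
import numpy as np

MASK = "<mask>"  # BART's mask token string in the Hugging Face tokenizer

def permute_sentences(text: str, rng: np.random.Generator) -> str:
    """Randomly permute the order of the sentences in a document."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    order = rng.permutation(len(sentences))
    return ". ".join(sentences[j] for j in order) + "."

def text_infilling(
    tokens: list[str],
    rng: np.random.Generator,
    mask_ratio: float = 0.3,
    poisson_lambda: float = 3.0,
) -> list[str]:
    """Replace sampled spans of tokens with a single mask token per span."""
    budget = int(len(tokens) * mask_ratio)  # roughly this many tokens get masked
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            span = int(rng.poisson(poisson_lambda))
            out.append(MASK)        # one mask token stands in for the whole span
            i += span               # a 0-length span just inserts a mask token
            budget -= max(span, 1)  # always spend budget so the loop terminates
        else:
            out.append(tokens[i])
            i += 1
    return out

rng = np.random.default_rng(0)
doc = (
    "BART corrupts documents with an arbitrary noising function. "
    "The decoder then reconstructs the original text. "
    "Training minimizes the reconstruction loss."
)
corrupted = text_infilling(permute_sentences(doc, rng).split(), rng)
print(" ".join(corrupted))
```

Because multiple consecutive tokens collapse into one mask, the model must predict not only the missing content but also how many tokens are missing, which is what distinguishes this scheme from single-token masking.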