Encoder models are often pretrained with techniques like masked language modeling, which hides parts of the input sequence and forces the model to reconstruct them from the surrounding bidirectional context, producing more meaningful representations.