The resulting model is then fine-tuned on labeled data, and experiments show that cross-lingual pretraining
significantly outperforms monolingual pretraining.
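
For concreteness, here is a minimal fine-tuning sketch using the 🤗 Transformers `Trainer`. The checkpoint name, the tiny in-memory dataset, and the label count are illustrative assumptions, not the exact setup from the paper:

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed checkpoint; any cross-lingual encoder with a sequence
# classification head can be fine-tuned the same way.
checkpoint = "FacebookAI/xlm-mlm-tlm-xnli15-1024"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Tiny illustrative labeled dataset; in practice this would be a real
# labeled task (e.g., XNLI loaded via `datasets.load_dataset`).
raw = Dataset.from_dict({
    "text": [
        "Das ist großartig.",
        "C'est terrible.",
        "This is great.",
        "Esto es terrible.",
    ],
    "label": [1, 0, 1, 0],
})

def tokenize(batch):
    # Pad/truncate so every example has the same length.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

dataset = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="xlm-finetuned",
    num_train_epochs=1,
    per_device_train_batch_size=2,
)

trainer = Trainer(model=model, args=args, train_dataset=dataset)
trainer.train()
```

Because the encoder was pretrained across many languages, the same fine-tuning loop works regardless of which language the labeled examples come from.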