Moreover, we show that with sufficient model size, cross-lingual pretraining can outperform English-only pretraining when translating English speech into other languages, a setting that otherwise favors monolingual pretraining.