Its architecture is identical to ProphetNet, but the model was trained on the multilingual
"wiki100" Wikipedia dump.