BLOOM was released around the same time; the largest model in the family has 176B parameters and was trained on 366B tokens spanning 46 natural languages and 13 programming languages.