This means that, for example, Perceiver IO can perform BERT-style masked language modeling directly on raw UTF-8 bytes instead of tokenized inputs.
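To make the byte-level idea concrete, here is a minimal sketch of how a string can be turned into model-ready ids without any learned vocabulary. The helper `bytes_to_ids` and the offset of 6 for special tokens are illustrative assumptions, not the official tokenizer implementation:

```python
# Conceptual sketch (NOT the official PerceiverTokenizer): each UTF-8 byte
# of the input becomes an id directly, so no subword vocabulary is needed.

def bytes_to_ids(text: str, num_special_tokens: int = 6) -> list[int]:
    # Hypothetical offset: shift byte values past reserved special-token
    # ids (e.g. padding and mask tokens) so the two ranges don't collide.
    return [b + num_special_tokens for b in text.encode("utf-8")]

ids = bytes_to_ids("Hello")
# 'H' is byte 72, so with an offset of 6 its id is 78
```

Because every possible input maps to at most 256 byte values (plus a few special tokens), the embedding table stays tiny and no tokenizer training is required.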