However, recent work shows that the attention-based module in transformers can be replaced by spatial MLPs, and the resulting models still perform quite well.
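
To make the idea concrete, here is a minimal sketch (not from the original source) of such a token-mixing block in PyTorch: an MLP applied across the spatial (token) dimension stands in for self-attention. The names `SpatialMLPBlock`, `num_tokens`, and `hidden_dim` are illustrative assumptions, not identifiers from any particular library.

```python
# Illustrative sketch: replacing attention with a spatial MLP that mixes
# information across tokens instead of computing attention weights.
import torch
import torch.nn as nn

class SpatialMLPBlock(nn.Module):
    """Mixes information across tokens with an MLP over the spatial axis."""

    def __init__(self, num_tokens: int, dim: int, hidden_dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        # The MLP acts on the token (spatial) dimension, so the input is
        # transposed to (batch, dim, num_tokens) before it is applied.
        self.spatial_mlp = nn.Sequential(
            nn.Linear(num_tokens, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, num_tokens),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, num_tokens, dim)
        residual = x
        x = self.norm(x).transpose(1, 2)         # (batch, dim, num_tokens)
        x = self.spatial_mlp(x).transpose(1, 2)  # back to (batch, num_tokens, dim)
        return x + residual

# Example usage: 196 tokens (e.g. a 14x14 patch grid) with embedding dim 384.
x = torch.randn(2, 196, 384)
block = SpatialMLPBlock(num_tokens=196, dim=384, hidden_dim=768)
print(block(x).shape)  # torch.Size([2, 196, 384])
```

Note that, unlike self-attention, the spatial MLP's weights are tied to a fixed number of tokens, which is one reason such models are typically trained at a fixed input resolution.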