However, BERT relies heavily on the global self-attention block and thus suffers from a large memory footprint and high computation cost: because attention compares every token with every other token, both grow quadratically with sequence length.
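To make that cost concrete, here is a minimal PyTorch sketch of a single global self-attention step (toy shapes, not the model's actual implementation); it shows that the attention score matrix has one entry per token pair, so memory and compute scale as the square of the sequence length:

```python
import torch

# Illustrative toy dimensions, not tied to any particular checkpoint.
batch, seq_len, d_model = 1, 512, 768

q = torch.randn(batch, seq_len, d_model)  # queries
k = torch.randn(batch, seq_len, d_model)  # keys
v = torch.randn(batch, seq_len, d_model)  # values

# Global self-attention scores every token against every other token,
# materializing a (batch, seq_len, seq_len) matrix: O(seq_len^2) in
# both memory and compute.
scores = q @ k.transpose(-2, -1) / d_model ** 0.5
weights = scores.softmax(dim=-1)
out = weights @ v  # (batch, seq_len, d_model)

print(scores.shape)  # torch.Size([1, 512, 512])
```

Doubling `seq_len` to 1024 quadruples the size of `scores`, which is the bottleneck this line of work on more efficient attention aims to reduce.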