Prior Transformer-based models adopt various self-attention mechanisms to capture long-range dependencies.
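To make the mechanism concrete, the following is a minimal NumPy sketch of standard scaled dot-product self-attention (not any specific model's variant from this paper); every position attends to every other position, which is what allows long-range dependencies to be modeled directly. All names and shapes here are illustrative.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    # Project inputs to queries, keys, and values.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = k.shape[-1]
    # Pairwise scores between all positions: every token can attend
    # to every other token regardless of distance.
    scores = q @ k.T / np.sqrt(d_k)
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted mixture of value vectors.
    return weights @ v

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 8)
```

Because the score matrix covers all position pairs, the cost is quadratic in sequence length; the self-attention variants referenced above differ mainly in how they restrict or approximate this matrix.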