Prior Transformer-based models adopt various self-attention mechanisms to capture long-range dependencies.
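
For illustration, here is a minimal sketch of the standard scaled dot-product self-attention that such models build on; it shows why attention captures long-range dependencies in a single step, since every token attends directly to every other token regardless of distance. All names, shapes, and weights below are illustrative assumptions, not taken from any specific model mentioned here.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q/w_k/w_v: (d_model, d_k) projection matrices."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Pairwise token affinities, scaled by sqrt(d_k) as in standard attention.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Row-wise softmax turns affinities into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of ALL value vectors,
    # so distant positions interact directly (no recurrence needed).
    return weights @ v

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))                      # 5 tokens, d_model = 8 (toy sizes)
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)        # (5, 8)
```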