Longformer's attention mechanism is a drop-in replacement for standard self-attention and combines a local
windowed attention with a task-motivated global attention.
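
Below is a minimal, illustrative sketch of the combined attention pattern: each token attends to a fixed window of neighbors, while a few designated tokens (e.g. `[CLS]` for classification tasks) attend to, and are attended by, every position. The function name, `window` parameter (one-sided window size), and `global_idx` argument are hypothetical names chosen for this example, not part of the Longformer API.

```python
import numpy as np

def longformer_attention_mask(seq_len: int, window: int, global_idx: list[int]) -> np.ndarray:
    """Build a boolean mask combining local windowed attention with
    full (global) attention for the positions listed in global_idx."""
    mask = np.zeros((seq_len, seq_len), dtype=bool)
    # Local windowed attention: token i attends to positions i-window .. i+window.
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        mask[i, lo:hi] = True
    # Global attention: selected tokens attend everywhere and are
    # attended to by every token, making the pattern symmetric there.
    for g in global_idx:
        mask[g, :] = True
        mask[:, g] = True
    return mask

# Example: 16 tokens, one-sided window of 2, position 0 (e.g. [CLS]) global.
print(longformer_attention_mask(16, 2, [0]).astype(int))
```

Because the local window is fixed, the number of attended positions per token stays constant as the sequence grows, which is what makes the pattern scale linearly with sequence length rather than quadratically.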