Meanwhile, it also integrates a spatial-aware self-attention
mechanism into the Transformer architecture, so that the model can fully capture the relative positional
relationships among different text blocks.
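
Below is a minimal, illustrative sketch of what such a spatial-aware self-attention layer can look like: attention scores are augmented with learnable biases indexed by the bucketed relative 1D (sequence) distance and relative 2D (x, y) distance between text blocks. This is an assumption-laden simplification for intuition, not the library's actual implementation; the class name, parameter names (`rel_pos_bins`, `rel_2d_pos_bins`), and input layout are all hypothetical.

```python
# Illustrative sketch only -- not the exact implementation in the library.
import torch
import torch.nn as nn

class SpatialAwareSelfAttention(nn.Module):
    def __init__(self, hidden_size=768, num_heads=12,
                 rel_pos_bins=32, rel_2d_pos_bins=64):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.qkv = nn.Linear(hidden_size, 3 * hidden_size)
        # Bias tables: one scalar per head, indexed by bucketed relative distances.
        self.rel_pos_bias = nn.Embedding(rel_pos_bins, num_heads)    # 1D sequence distance
        self.rel_x_bias = nn.Embedding(rel_2d_pos_bins, num_heads)   # horizontal distance
        self.rel_y_bias = nn.Embedding(rel_2d_pos_bins, num_heads)   # vertical distance

    def forward(self, hidden_states, rel_pos_ids, rel_x_ids, rel_y_ids):
        # hidden_states: (batch, seq, hidden); *_ids: (batch, seq, seq) bucketed distances
        b, s, _ = hidden_states.shape
        q, k, v = self.qkv(hidden_states).chunk(3, dim=-1)
        q = q.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)

        scores = torch.matmul(q, k.transpose(-1, -2)) / self.head_dim ** 0.5
        # Spatial-aware part: add relative 1D and 2D position biases to the scores,
        # so attention weights depend on where text blocks sit relative to each other.
        bias = (self.rel_pos_bias(rel_pos_ids)
                + self.rel_x_bias(rel_x_ids)
                + self.rel_y_bias(rel_y_ids)).permute(0, 3, 1, 2)
        scores = scores + bias

        probs = scores.softmax(dim=-1)
        context = torch.matmul(probs, v).transpose(1, 2).reshape(b, s, -1)
        return context
```

The key design choice shown here is that layout information enters the attention computation as an additive bias on the score matrix rather than only through the input embeddings, which lets every head weigh tokens by both content similarity and spatial proximity.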