This is required because Flash Attention only supports the fp16 and bf16 data types.
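
A minimal sketch of what this looks like in practice, assuming a Hugging Face Transformers setup (the model checkpoint name here is illustrative, not from the original text):

```python
import torch
from transformers import AutoModelForCausalLM

# Flash Attention kernels only operate on half-precision tensors,
# so the model must be loaded in fp16 or bf16 rather than fp32.
model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",                      # illustrative checkpoint
    torch_dtype=torch.bfloat16,              # or torch.float16
    attn_implementation="flash_attention_2",
)
```

Loading the model in fp32 (the default) with `attn_implementation="flash_attention_2"` would fail or fall back, since the kernels have no fp32 code path; passing `torch_dtype` at load time keeps the weights in a supported precision from the start.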