blanchon committed
Commit 8084da7 · 1 parent: 5888f20

update deps

Files changed (1): requirements.txt (+2 −1)
requirements.txt CHANGED
@@ -4,7 +4,8 @@ diffusers
 transformers
 accelerate
 xformers
-flash-attn
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.0.post2/flash_attn-2.7.0.post2+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 einops
 gradio
 spaces
+
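The change pins `flash-attn` to a prebuilt wheel URL, so pip downloads a binary instead of compiling the CUDA kernels from source. The trade-off is that the wheel filename's compatibility tags now constrain the environment: per the wheel naming convention (PEP 427, `name-version-pythontag-abitag-platformtag.whl`), this file only installs on CPython 3.10 on Linux x86_64, built against CUDA 12 and PyTorch 2.4. A minimal sketch of reading those tags out of the filename (`wheel_tags` is a hypothetical helper for illustration, not part of pip):

```python
# Pinned wheel from the diff above.
WHEEL = ("flash_attn-2.7.0.post2+cu12torch2.4cxx11abiFALSE"
         "-cp310-cp310-linux_x86_64.whl")

def wheel_tags(filename: str) -> tuple:
    """Return (python_tag, abi_tag, platform_tag) from a wheel filename."""
    stem = filename[: -len(".whl")]
    # The last three dash-separated fields are the compatibility tags;
    # the local-version segment (+cu12torch2.4...) contains no dashes,
    # so a simple split is enough here.
    return tuple(stem.split("-")[-3:])

print(wheel_tags(WHEEL))  # ('cp310', 'cp310', 'linux_x86_64')
```

If the Space's runtime moves to a different Python or PyTorch version, this pinned URL would need to be updated to a matching wheel.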