Tags: Text Generation · Transformers · Safetensors · Czech · mpt · custom_code · text-generation-inference · Inference Endpoints
mfajcik committed
Commit
6f6b819
1 Parent(s): 941acb1

Update README.md

Files changed (1): README.md (+1, -2)
README.md CHANGED
@@ -57,8 +57,7 @@ tbd.
 pip install transformers==4.37.2 torch==2.1.2 einops==0.7.0
 
 # be sure to install right flash-attn, we use torch compiled with CUDA 12.1, no ABI, python 3.9, Linux x86_64 architecture
-pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.3/flash_attn-2.5.3+cu122torch2.
-1cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.3/flash_attn-2.5.3+cu122torch2.1cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
 
 ```
 
 ## Running the Code
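The fix above joins the flash-attn wheel URL onto one line: the filename encodes the CUDA build (`cu122`), torch version (`torch2.1`), C++11 ABI flag (`cxx11abiFALSE`), CPython tag (`cp39`), and platform (`linux_x86_64`), and all of these must match the local environment or `pip` will refuse the wheel (or the import will fail at runtime). As a minimal sketch (not part of the commit), the following prints the local values to compare against the wheel filename before installing; the `torch` check assumes torch is already installed per the preceding `pip install` line:

```python
import platform
import sys

# CPython tag, e.g. "cp39" for Python 3.9 -- must match the cp39-cp39 part of the wheel name.
print("python tag:", "cp%d%d" % (sys.version_info.major, sys.version_info.minor))

# Platform tag -- the wheel targets Linux on x86_64.
print("platform:", platform.system().lower(), platform.machine())

try:
    import torch
    # Should start with "2.1" to match torch2.1 in the wheel name.
    print("torch:", torch.__version__)
    # Should be a 12.x CUDA build to match the cu122 wheel tag.
    print("torch CUDA:", torch.version.cuda)
except ImportError:
    print("torch not installed yet -- install it first, then pick the matching wheel")
```

If any value disagrees (for example a different Python minor version or a CPU-only torch build), pick the corresponding wheel from the flash-attention release page instead of the one pinned in the README.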