feihu.hf committed
Commit 09b3178
1 Parent(s): bce2c65

update README.md

Files changed (1)
  1. README.md +7 -1
README.md CHANGED
@@ -79,7 +79,13 @@ To handle extensive inputs exceeding 32,768 tokens, we utilize [YARN](https://ar
 
 For deployment, we recommend using vLLM. You can enable the long-context capabilities by following these steps:
 
-1. **Install vLLM**: Ensure you have the latest version from the main branch of [vLLM](https://github.com/vllm-project/vllm).
+1. **Install vLLM**: You can install vLLM by running the following command.
+
+```bash
+pip install "vllm>=0.4.3"
+```
+
+Or you can install vLLM from [source](https://github.com/vllm-project/vllm/).
 
 2. **Configure Model Settings**: After downloading the model weights, modify the `config.json` file by including the below snippet:
 ```json
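
The hunk ends at the opening fence of the JSON snippet that step 2 refers to. For Qwen-style long-context deployment, that snippet is typically a YARN `rope_scaling` block added to `config.json`. The block below is a minimal sketch only: the `factor` of 4.0 and `original_max_position_embeddings` of 32768 are illustrative values (a 32,768-token native window extended roughly fourfold) and should be matched to the model's actual native and target context lengths.

```json
{
  "rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn"
  }
}
```

Because this scaling is static, it is applied to all inputs regardless of length, so the block is usually added only when long-context processing is actually needed.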