BAAI

BoyaWu10 committed
Commit 44fff10
1 Parent(s): 415e3c0

Update README.md

Files changed (1):
  1. README.md (+2, -0)
README.md CHANGED
@@ -35,6 +35,8 @@ pip install torch transformers accelerate pillow
  ```
  If the CUDA memory is enough, it would be faster to execute this snippet by setting `CUDA_VISIBLE_DEVICES=0`.

+ Users, especially those in mainland China, may want to refer to a HuggingFace [mirror site](https://hf-mirror.com).
+
  ```python
  import torch
  import transformers
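
The new note, together with the `CUDA_VISIBLE_DEVICES` hint above it, amounts to setting two environment variables before the imports. Below is a minimal sketch of how that could look in Python, assuming the mirror is selected through the `HF_ENDPOINT` variable honored by `huggingface_hub`; the snippet is illustrative and not part of this commit.

```python
# Illustrative sketch (not part of this commit): set both environment variables
# before importing torch/transformers, since they are only read when the
# libraries initialize.
import os

# Expose only GPU 0, as suggested when CUDA memory is sufficient.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# Assumption: huggingface_hub honors HF_ENDPOINT, so model downloads are
# routed through the mirror site instead of huggingface.co.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

import torch
import transformers
```

Equivalently, the same variables can be exported in the shell before launching Python.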