updated to use huggingface-cli
README.md CHANGED

````diff
@@ -56,12 +56,14 @@ If you want to run a quick test or if the exact model you want to use is [Huggin
 
 First, you will need a local copy of the library. This is because one of the nice things that the Hugging Face optimum library does is abstract local loads from repository loads. However, Mistral inference isn't supported yet.
 
-From python:
 
 ```
-#
-
-
+# To speed up downloads we can use hf_transfer
+pip install hf_transfer
+HF_HUB_ENABLE_HF_TRANSFER=1
+
+# use huggingface-cli to download model to local dir
+huggingface-cli download ritikk/zephyr-7b-beta-neuron --local-dir zephyr-7b-beta-neuron
 
 ```
 
````
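One caveat with the added snippet: `HF_HUB_ENABLE_HF_TRANSFER=1` on a line by itself only sets the variable for that (empty) command, so the download that follows will not actually use `hf_transfer` unless the variable is exported or prefixed onto the `huggingface-cli` invocation. A minimal sketch of how the two added commands can be combined (repo id and local directory taken from the diff; adjust to your model):

```shell
# install the optional accelerated-transfer backend
pip install -U hf_transfer

# prefix the variable so it applies to this single command,
# then download the model snapshot into a local directory
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download \
  ritikk/zephyr-7b-beta-neuron --local-dir zephyr-7b-beta-neuron
```

Alternatively, `export HF_HUB_ENABLE_HF_TRANSFER=1` makes the setting apply to every later command in the same shell session.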