Emphasis required on "these weights cannot be used by themselves"
#1
by TearGosling - opened
README.md CHANGED
@@ -21,7 +21,7 @@ It was trained by doing supervised fine-tuning over a mixture of regular instruc
 
 ## Applying the XORs
 
-The model weights in this repository cannot be used as-is
+**The model weights in this repository cannot be used as-is.** The files here are XORs due to licensing concerns. To obtain proper, usable model weights you need to:
 
 - Request access to the original LLaMA weights from Meta [through this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form)
 - Convert them to the HuggingFace Transformers format by using the [convert_llama_weights_to_hf.py](https://github.com/huggingface/transformers/blob/849367ccf741d8c58aa88ccfe1d52d8636eaf2b7/src/transformers/models/llama/convert_llama_weights_to_hf.py) script **for your version of the `transformers` library**
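For the conversion step in the changed section above, the linked script is normally run from the command line, e.g. `python convert_llama_weights_to_hf.py --input_dir <downloaded-llama-dir> --model_size <model-size> --output_dir <hf-output-dir>`. The paths and model size are placeholders rather than anything stated in this repository, and the accepted arguments have changed across `transformers` releases, which is exactly why the README stresses using the script that matches your installed version; check the script's own docstring for the flags it expects.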
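As background on why the bolded sentence matters: distributing XORs means each published file is the byte-wise XOR of a fine-tuned weight file with the corresponding converted LLaMA file, so XOR-ing the two together again recovers usable weights. The sketch below only illustrates that principle; it is not this repository's actual decoding tool, and the file names, chunk size, and byte-level granularity are assumptions.

```python
# Minimal sketch of the XOR idea described above -- NOT the repository's
# actual decoding script. File names and chunk size are assumptions.
def apply_xor(xor_path: str, base_path: str, out_path: str, chunk: int = 1 << 20) -> None:
    """XOR two equally sized files byte-by-byte and write the result."""
    with open(xor_path, "rb") as fx, open(base_path, "rb") as fb, open(out_path, "wb") as fo:
        while True:
            a = fx.read(chunk)
            b = fb.read(chunk)
            if not a and not b:
                break
            # XOR the two buffers pairwise; zip stops at the shorter one.
            fo.write(bytes(x ^ y for x, y in zip(a, b)))

# Hypothetical usage: decode one shard against the converted LLaMA shard.
# apply_xor("xor_encoded/pytorch_model-00001-of-00003.bin",
#           "llama-hf/pytorch_model-00001-of-00003.bin",
#           "decoded/pytorch_model-00001-of-00003.bin")
```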