totally-not-an-llm committed
Commit: 64dd787 · Parent(s): 8456a85
Update README.md
README.md CHANGED
@@ -15,7 +15,8 @@ This model is an early test of the EverythingLM dataset and some new experimenta
 ### GGML quants:
 https://huggingface.co/TheBloke/EverythingLM-13B-16K-GGML
 
-
+Make sure to use correct rope scaling settings:
+`-c 16384 --rope-freq-base 10000 --rope-freq-scale 0.25`
 ### GPTQ quants:
 https://huggingface.co/TheBloke/EverythingLM-13B-16K-GPTQ
 
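As a usage note, the added line is a set of llama.cpp runtime flags. A minimal sketch of an invocation using these rope scaling settings might look like the following; the `./main` binary name and the model filename are assumptions for illustration, not part of this commit:

```
# Hypothetical llama.cpp run using the rope scaling settings added in this commit.
# The model filename is a placeholder for whichever GGML quant file you downloaded.
./main -m everythinglm-13b-16k.ggmlv3.q4_K_M.bin \
  -c 16384 --rope-freq-base 10000 --rope-freq-scale 0.25 \
  -p "Your prompt here"
```

The `--rope-freq-scale 0.25` value matches the 4x context extension of this 16K model relative to the 4096-token Llama 2 base context (16384 / 4096 = 4).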