Update README.md
README.md CHANGED

@@ -1,5 +1,5 @@
 ---
-base_model: checkpoints/Mistral-7B-Instruct-EI-
+base_model: checkpoints/Mistral-7B-Instruct-EI-Iter3
 datasets:
 - synthetic_data_mistral-7b-instruct-expert-iteration-iter3_score
 tags:
@@ -17,13 +17,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 # Mistral-7B-Instruct-EI-Iter3
 
-This model is a GPTQ version of [checkpoints/Mistral-7B-Instruct-EI-
+This model is a GPTQ version of [checkpoints/Mistral-7B-Instruct-EI-Iter3](https://huggingface.co/checkpoints/Mistral-7B-Instruct-EI-Iter3)
 
 Created with [AutoQuant](https://colab.research.google.com/drive/1b6nqC7UZVt8bx4MksX7s656GXPM-eWw4?usp=sharing)
 
 ## Model description
 
-
+I like the GPTQ format, this is 8bit, GROUP_SIZE 32.
 
 ## Intended uses & limitations
 
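The updated card describes the quantization only in prose: GPTQ, 8-bit, group size 32, produced from the base checkpoint via the linked AutoQuant notebook. Below is a minimal sketch of how a checkpoint with those settings is typically produced and reloaded through the Hugging Face `transformers` GPTQ integration; it is an illustration under that assumption, not the notebook's actual code, and the output directory name is a placeholder.

```python
# Sketch only (not the AutoQuant notebook): GPTQ quantization with the
# 8-bit / group_size=32 settings the README mentions, via transformers.
# Requires `optimum` and an auto-gptq backend installed.
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_id = "checkpoints/Mistral-7B-Instruct-EI-Iter3"  # base model from the card metadata

tokenizer = AutoTokenizer.from_pretrained(base_id)

# 8-bit GPTQ with group size 32, calibrated on a built-in dataset.
gptq_config = GPTQConfig(bits=8, group_size=32, dataset="c4", tokenizer=tokenizer)

# Quantize while loading, then save the quantized weights.
quantized = AutoModelForCausalLM.from_pretrained(
    base_id,
    device_map="auto",
    quantization_config=gptq_config,
)
quantized.save_pretrained("Mistral-7B-Instruct-EI-Iter3-GPTQ")  # placeholder output dir
tokenizer.save_pretrained("Mistral-7B-Instruct-EI-Iter3-GPTQ")

# The saved checkpoint loads directly; transformers picks up the GPTQ
# quantization config stored alongside the weights.
model = AutoModelForCausalLM.from_pretrained(
    "Mistral-7B-Instruct-EI-Iter3-GPTQ",
    device_map="auto",
)
```

Group size 32 keeps a separate scale and zero-point for every 32 weights along a quantized row, so it tracks the original weights more closely than the common default of 128 at the cost of slightly larger files.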