bartowski committed on
Commit 7c7c221
1 Parent(s): e035576

Update README.md

Files changed (1): README.md (+28 −27)

README.md CHANGED
@@ -2,13 +2,14 @@
 license: cc-by-nc-nd-3.0
 quantized_by: bartowski
 pipeline_tag: text-generation
+base_model: RLHFlow/LLaMA3-iterative-DPO-final
 ---
 
-## Llamacpp imatrix Quantizations of SFR-Iterative-DPO-LLaMA-3-8B-R
+## Llamacpp imatrix Quantizations of LLaMA3-iterative-DPO-final
 
 Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b2854">b2854</a> for quantization.
 
-Original model: https://huggingface.co/Salesforce/SFR-Iterative-DPO-LLaMA-3-8B-R
+Original model: https://huggingface.co/RLHFlow/LLaMA3-iterative-DPO-final
 
 All quants made using imatrix option with dataset from [here](https://gist.github.com/bartowski1182/b6ac44691e994344625687afe3263b3a)
 
@@ -27,28 +28,28 @@ All quants made using imatrix option with dataset from [here](https://gist.githu
 
 | Filename | Quant type | File Size | Description |
 | -------- | ---------- | --------- | ----------- |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0.gguf) | Q8_0 | 8.54GB | Extremely high quality, generally unneeded but max available quant. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q6_K.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q6_K.gguf) | Q6_K | 6.59GB | Very high quality, near perfect, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q5_K_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q5_K_M.gguf) | Q5_K_M | 5.73GB | High quality, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q5_K_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q5_K_S.gguf) | Q5_K_S | 5.59GB | High quality, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q4_K_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q4_K_M.gguf) | Q4_K_M | 4.92GB | Good quality, uses about 4.83 bits per weight, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q4_K_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q4_K_S.gguf) | Q4_K_S | 4.69GB | Slightly lower quality with more space savings, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ4_NL.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ4_NL.gguf) | IQ4_NL | 4.67GB | Decent quality, slightly smaller than Q4_K_S with similar performance *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ4_XS.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ4_XS.gguf) | IQ4_XS | 4.44GB | Decent quality, smaller than Q4_K_S with similar performance, *recommended*. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_L.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_L.gguf) | Q3_K_L | 4.32GB | Lower quality but usable, good for low RAM availability. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_M.gguf) | Q3_K_M | 4.01GB | Even lower quality. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_M.gguf) | IQ3_M | 3.78GB | Medium-low quality, new method with decent performance comparable to Q3_K_M. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_S.gguf) | IQ3_S | 3.68GB | Lower quality, new method with decent performance, recommended over Q3_K_S quant, same size with better performance. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q3_K_S.gguf) | Q3_K_S | 3.66GB | Low quality, not recommended. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_XS.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_XS.gguf) | IQ3_XS | 3.51GB | Lower quality, new method with decent performance, slightly better than Q3_K_S. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_XXS.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ3_XXS.gguf) | IQ3_XXS | 3.27GB | Lower quality, new method with decent performance, comparable to Q3 quants. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-Q2_K.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-Q2_K.gguf) | Q2_K | 3.17GB | Very low quality but surprisingly usable. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_M.gguf) | IQ2_M | 2.94GB | Very low quality, uses SOTA techniques to also be surprisingly usable. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_S.gguf) | IQ2_S | 2.75GB | Very low quality, uses SOTA techniques to be usable. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_XS.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_XS.gguf) | IQ2_XS | 2.60GB | Very low quality, uses SOTA techniques to be usable. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_XXS.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ2_XXS.gguf) | IQ2_XXS | 2.39GB | Lower quality, uses SOTA techniques to be usable. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ1_M.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ1_M.gguf) | IQ1_M | 2.16GB | Extremely low quality, *not* recommended. |
-| [SFR-Iterative-DPO-LLaMA-3-8B-R-IQ1_S.gguf](https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF/blob/main/SFR-Iterative-DPO-LLaMA-3-8B-R-IQ1_S.gguf) | IQ1_S | 2.01GB | Extremely low quality, *not* recommended. |
+| [LLaMA3-iterative-DPO-final-Q8_0.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q8_0.gguf) | Q8_0 | 8.54GB | Extremely high quality, generally unneeded but max available quant. |
+| [LLaMA3-iterative-DPO-final-Q6_K.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q6_K.gguf) | Q6_K | 6.59GB | Very high quality, near perfect, *recommended*. |
+| [LLaMA3-iterative-DPO-final-Q5_K_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q5_K_M.gguf) | Q5_K_M | 5.73GB | High quality, *recommended*. |
+| [LLaMA3-iterative-DPO-final-Q5_K_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q5_K_S.gguf) | Q5_K_S | 5.59GB | High quality, *recommended*. |
+| [LLaMA3-iterative-DPO-final-Q4_K_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q4_K_M.gguf) | Q4_K_M | 4.92GB | Good quality, uses about 4.83 bits per weight, *recommended*. |
+| [LLaMA3-iterative-DPO-final-Q4_K_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q4_K_S.gguf) | Q4_K_S | 4.69GB | Slightly lower quality with more space savings, *recommended*. |
+| [LLaMA3-iterative-DPO-final-IQ4_NL.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ4_NL.gguf) | IQ4_NL | 4.67GB | Decent quality, slightly smaller than Q4_K_S with similar performance *recommended*. |
+| [LLaMA3-iterative-DPO-final-IQ4_XS.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ4_XS.gguf) | IQ4_XS | 4.44GB | Decent quality, smaller than Q4_K_S with similar performance, *recommended*. |
+| [LLaMA3-iterative-DPO-final-Q3_K_L.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q3_K_L.gguf) | Q3_K_L | 4.32GB | Lower quality but usable, good for low RAM availability. |
+| [LLaMA3-iterative-DPO-final-Q3_K_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q3_K_M.gguf) | Q3_K_M | 4.01GB | Even lower quality. |
+| [LLaMA3-iterative-DPO-final-IQ3_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ3_M.gguf) | IQ3_M | 3.78GB | Medium-low quality, new method with decent performance comparable to Q3_K_M. |
+| [LLaMA3-iterative-DPO-final-IQ3_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ3_S.gguf) | IQ3_S | 3.68GB | Lower quality, new method with decent performance, recommended over Q3_K_S quant, same size with better performance. |
+| [LLaMA3-iterative-DPO-final-Q3_K_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q3_K_S.gguf) | Q3_K_S | 3.66GB | Low quality, not recommended. |
+| [LLaMA3-iterative-DPO-final-IQ3_XS.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ3_XS.gguf) | IQ3_XS | 3.51GB | Lower quality, new method with decent performance, slightly better than Q3_K_S. |
+| [LLaMA3-iterative-DPO-final-IQ3_XXS.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ3_XXS.gguf) | IQ3_XXS | 3.27GB | Lower quality, new method with decent performance, comparable to Q3 quants. |
+| [LLaMA3-iterative-DPO-final-Q2_K.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-Q2_K.gguf) | Q2_K | 3.17GB | Very low quality but surprisingly usable. |
+| [LLaMA3-iterative-DPO-final-IQ2_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ2_M.gguf) | IQ2_M | 2.94GB | Very low quality, uses SOTA techniques to also be surprisingly usable. |
+| [LLaMA3-iterative-DPO-final-IQ2_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ2_S.gguf) | IQ2_S | 2.75GB | Very low quality, uses SOTA techniques to be usable. |
+| [LLaMA3-iterative-DPO-final-IQ2_XS.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ2_XS.gguf) | IQ2_XS | 2.60GB | Very low quality, uses SOTA techniques to be usable. |
+| [LLaMA3-iterative-DPO-final-IQ2_XXS.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ2_XXS.gguf) | IQ2_XXS | 2.39GB | Lower quality, uses SOTA techniques to be usable. |
+| [LLaMA3-iterative-DPO-final-IQ1_M.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ1_M.gguf) | IQ1_M | 2.16GB | Extremely low quality, *not* recommended. |
+| [LLaMA3-iterative-DPO-final-IQ1_S.gguf](https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF/blob/main/LLaMA3-iterative-DPO-final-IQ1_S.gguf) | IQ1_S | 2.01GB | Extremely low quality, *not* recommended. |
 
 ## Downloading using huggingface-cli
 
@@ -61,16 +62,16 @@ pip install -U "huggingface_hub[cli]"
 Then, you can target the specific file you want:
 
 ```
-huggingface-cli download bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF --include "SFR-Iterative-DPO-LLaMA-3-8B-R-Q4_K_M.gguf" --local-dir ./ --local-dir-use-symlinks False
+huggingface-cli download bartowski/LLaMA3-iterative-DPO-final-GGUF --include "LLaMA3-iterative-DPO-final-Q4_K_M.gguf" --local-dir ./ --local-dir-use-symlinks False
 ```
 
 If the model is bigger than 50GB, it will have been split into multiple files. In order to download them all to a local folder, run:
 
 ```
-huggingface-cli download bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF --include "SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0.gguf/*" --local-dir SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0 --local-dir-use-symlinks False
+huggingface-cli download bartowski/LLaMA3-iterative-DPO-final-GGUF --include "LLaMA3-iterative-DPO-final-Q8_0.gguf/*" --local-dir LLaMA3-iterative-DPO-final-Q8_0 --local-dir-use-symlinks False
 ```
 
-You can either specify a new local-dir (SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0) or download them all in place (./)
+You can either specify a new local-dir (LLaMA3-iterative-DPO-final-Q8_0) or download them all in place (./)
 
 ## Which file should I choose?
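As a rough aid for the "Which file should I choose?" question, the file sizes in the table above can be turned into a simple picker. The helper below is illustrative and not part of the README or the commit; the quant names and sizes are copied from the table, while the function name and the 1 GB headroom default are assumptions.

```python
# Sizes in GB, copied from the quant table in the README above.
QUANT_SIZES_GB = {
    "Q8_0": 8.54, "Q6_K": 6.59, "Q5_K_M": 5.73, "Q5_K_S": 5.59,
    "Q4_K_M": 4.92, "Q4_K_S": 4.69, "IQ4_NL": 4.67, "IQ4_XS": 4.44,
    "Q3_K_L": 4.32, "Q3_K_M": 4.01, "IQ3_M": 3.78, "IQ3_S": 3.68,
    "Q3_K_S": 3.66, "IQ3_XS": 3.51, "IQ3_XXS": 3.27, "Q2_K": 3.17,
    "IQ2_M": 2.94, "IQ2_S": 2.75, "IQ2_XS": 2.60, "IQ2_XXS": 2.39,
    "IQ1_M": 2.16, "IQ1_S": 2.01,
}

def largest_fitting_quant(budget_gb: float, headroom_gb: float = 1.0):
    """Return the largest quant whose file fits the memory budget,
    leaving some headroom for KV cache and runtime overhead;
    None if nothing fits."""
    usable = budget_gb - headroom_gb
    fitting = {q: s for q, s in QUANT_SIZES_GB.items() if s <= usable}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)

print(largest_fitting_quant(8.0))  # -> Q6_K (6.59GB fits in 8 GB minus 1 GB headroom)
```

The headroom default is a conservative guess; in practice the right margin depends on context length and offload settings.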