maddes8cht committed
Commit 145730e
1 Parent(s): 954d585

"Update README.md"

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -53,7 +53,7 @@ The core project making use of the ggml library is the [llama.cpp](https://githu
 
 There are a bunch of quantized files available. How to choose the best one for you:
 
-# legacy quants
+# Legacy quants
 
 Q4_0, Q4_1, Q5_0, Q5_1 and Q8 are `legacy` quantization types.
 Nevertheless, they are fully supported, as there are several circumstances that cause certain models not to be compatible with the modern K-quants.
@@ -67,6 +67,7 @@ With a Q6_K you should find it really hard to find a quality difference to the o
 
 
 
+---
 # Original Model Card:
 <p align="center" width="100%">
 <img src="https://huggingface.co/bofenghuang/vigogne-falcon-7b-instruct/resolve/main/vigogne_logo.png" alt="Vigogne" style="width: 40%; min-width: 300px; display: block; margin: auto;">
@@ -128,6 +129,7 @@ You can also infer this model by using the following Google Colab Notebook.
 Vigogne is still under development, and there are many limitations that have to be addressed. Please note that it is possible that the model generates harmful or biased content, incorrect information or generally unhelpful answers.
 
 ***End of original Model File***
+---
 
 
 ## Please consider supporting my work
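
The README text touched by this commit explains how to pick between the legacy quants and the K-quants but does not show an invocation. As a minimal sketch only, assuming the llama-cpp-python bindings and a hypothetical quantized filename, neither of which appears in this commit, loading one of the files could look like this:

```python
# Minimal sketch, not part of the commit: load a quantized file with the
# llama-cpp-python bindings for llama.cpp. The filename and parameters below
# are illustrative assumptions; substitute the quant file you actually downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="vigogne-falcon-7b-instruct.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=2048,  # context window size
)

# Run a short completion to sanity-check the file.
result = llm("Explique brièvement la photosynthèse.", max_tokens=128)
print(result["choices"][0]["text"])
```

Any of the quant files should load the same way; per the README above, a Q6_K file would be the closest to the unquantized model if download size is not a concern.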