Update README.md
README.md CHANGED
@@ -13,13 +13,15 @@ base_model: []
This repo contains the full precision source code, in "safe tensors" format, to generate GGUFs, GPTQ, EXL2, AWQ, HQQ and other formats.

The source code can also be used directly.

+ This source is in float 32 precision. If you are going to quant it as a GGUF, make sure "--outtype f32" is set during the "convert..." step, so the GGUFs benefit from the f32 source.
+

NOTE: Links to GGUFs below.

<B>IMPORTANT: Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

If you are going to use this model (source, GGUF or a different quant), please review this document for critical parameter, sampler and advanced sampler settings (for multiple AI/LLM apps).

- This a "Class
+ This is a "Class 2" (settings will enhance operation) model:

For all settings used for this model (including specifics for its "class"), example generations, and the advanced settings guide (which often addresses model issues and covers methods to improve performance for all use cases, including chat, roleplay and other uses, especially those beyond the model's design), please see:
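As a concrete illustration of the float-32 conversion note added above, here is a minimal sketch of a typical llama.cpp workflow. It assumes llama.cpp's `convert_hf_to_gguf.py` script and the `llama-quantize` binary; script names, paths and flags vary between llama.cpp versions, so check your own checkout before running.

```bash
# Sketch only: assumes a recent llama.cpp checkout; script/binary names differ across versions.

# 1. Convert the float-32 safetensors source to a full-precision GGUF.
#    "--outtype f32" keeps the weights in float 32, so later quants start from the f32 source.
python convert_hf_to_gguf.py /path/to/model-source \
    --outtype f32 \
    --outfile model-f32.gguf

# 2. Quantize the f32 GGUF to the size/quality trade-off you want (Q4_K_M shown as an example).
./llama-quantize model-f32.gguf model-Q4_K_M.gguf Q4_K_M
```

Quantizing from the f32 GGUF, rather than letting the converter drop to a lower-precision intermediate, is the point of the "--outtype f32" note in the README change.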