sometimesanotion
committed on
Update README.md
README.md CHANGED
@@ -24,6 +24,8 @@ Lamarck-14B is a carefully designed merge which emphasizes [arcee-ai/Virtuoso-Sm
 
 Its reasoning and prose skills are quite strong. Version 0.3 is the product of a carefully planned and tested sequence of templated merges, produced by a toolchain which wraps around Arcee's mergekit.
 
+For GGUFs, [mradermacher/Lamarck-14B-v0.3-i1-GGUF](https://huggingface.co/mradermacher/Lamarck-14B-v0.3-i1-GGUF) has you covered. Thank you @mradermacher!
+
 **The merge strategy of Lamarck 0.3 can be summarized as:**
 
 - Two model_stocks commence specialized branches for reasoning and prose quality.
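For readers unfamiliar with the model_stock method named in the strategy above: mergekit merges are declared as YAML recipes and run with the `mergekit-yaml` command. The sketch below is only an illustration of what one such branch could look like; apart from arcee-ai/Virtuoso-Small, the model names are placeholders, the Qwen2.5-14B base is assumed, and none of this is Lamarck's actual recipe.

```yaml
# Hypothetical model_stock branch for a reasoning-focused merge.
# Placeholder models and assumed base; not the actual Lamarck 0.3 config.
# Run with: mergekit-yaml reasoning-branch.yaml ./reasoning-branch
merge_method: model_stock
base_model: Qwen/Qwen2.5-14B               # assumed common base of the fine-tunes
models:
  - model: arcee-ai/Virtuoso-Small         # emphasized contributor per the model card
  - model: placeholder/reasoning-finetune-a
  - model: placeholder/reasoning-finetune-b
dtype: bfloat16
```

A prose-quality branch would follow the same pattern with a different model list, and later templated merge steps can then combine the two branches.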