sometimesanotion committed: Update README.md
### Overview:

Lamarck-14B version 0.3 is the product of a carefully planned sequence of templated merges. It is broadly based on [arcee-ai/Virtuoso-Small](https://huggingface.co/arcee-ai/Virtuoso-Small), with contributions from highly-ranked portions of other models' prose and reasoning.

It benefits from @CultriX's use of evolutionary merge processes, which its toolchain is being designed to expand on; hence it is named after the early biologist Jean-Baptiste Lamarck.

**The merge strategy of Lamarck 0.3 can be summarized as:**
**Model stock:** Two model_stock merges, each specialized for a specific aspect of performance, are used to mildly influence a large range of the model.

- **[sometimesanotion/lamarck-14b-reason-model_stock](https://huggingface.co/sometimesanotion/lamarck-14b-reason-model_stock)** - This means [VAGOsolutions/SauerkrautLM-v2-14b-DPO](https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-DPO) has a contribution, most likely noticeable in BBH.

- **[sometimesanotion/lamarck-14b-prose-model_stock](https://huggingface.co/sometimesanotion/lamarck-14b-prose-model_stock)** - This brings in a little influence from [EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2), [oxyapi/oxy-1-small](https://huggingface.co/oxyapi/oxy-1-small), and [allura-org/TQ2.5-14B-Sugarquill-v1](https://huggingface.co/allura-org/TQ2.5-14B-Sugarquill-v1).
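As a rough illustration, a model_stock merge like the reasoning one above could be expressed in a mergekit YAML along these lines. This is a hypothetical sketch, not the published recipe: the exact model list, base, and dtype are assumptions, and the actual configurations live in the linked model_stock repositories.

```yaml
# Hypothetical sketch of a model_stock merge in the style of
# lamarck-14b-reason-model_stock. Model list and dtype are assumptions;
# see the linked repositories for the real configurations.
merge_method: model_stock
base_model: arcee-ai/Virtuoso-Small
models:
  - model: arcee-ai/Virtuoso-Small
  - model: VAGOsolutions/SauerkrautLM-v2-14b-DPO
dtype: bfloat16
```

model_stock averages the listed models' weights relative to the base, which is why its influence on the final merge is mild rather than dominant.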
### Configuration: