Llama3-8B-abliterated-Spectrum-slerp is a merge of the following models using LazyMergekit:

* [yuvraj17/Llama-3-8B-spectrum-25](https://huggingface.co/yuvraj17/Llama-3-8B-spectrum-25)
* [mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated](https://huggingface.co/mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated)
## Introduction to Model Merging

**Model Merging**, also known as model fusion, is an effective technique that merges the parameters of multiple separate models with different capabilities to build a universal model, without needing access to the original training data or expensive computation.

There are a number of methods we can use to merge the capabilities of different models (all supported by [mergekit](https://github.com/arcee-ai/mergekit)), including:

<figure>
<img src="https://cdn-uploads.huggingface.co/production/uploads/66137d95e8d2cda230ddcea6/HUflk1elPEom3Pe_vU_Ku.png" width="768" height="768">
<figcaption>Merge methods supported by MergeKit (<a href="https://github.com/arcee-ai/mergekit?tab=readme-ov-file#merge-methods">reference</a>)</figcaption>
</figure>

For a deeper dive into the different merging techniques, see [Merge Large Language Models with mergekit](https://towardsdatascience.com/merge-large-language-models-with-mergekit-2118fb392b54).
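To make this concrete, a mergekit merge is described in a small YAML recipe that lists the source models and the chosen `merge_method`. The following is only an illustrative sketch with placeholder model names and parameters; it is **not** the recipe used for this repository (that one is shown in the Configuration section below):

```yaml
# Illustrative mergekit recipe: a simple weighted-average ("linear") merge
# of two hypothetical Llama-3-8B fine-tunes. Model ids and weights are placeholders.
models:
  - model: org-a/llama-3-8b-finetune-one
    parameters:
      weight: 0.5
  - model: org-b/llama-3-8b-finetune-two
    parameters:
      weight: 0.5
merge_method: linear
dtype: bfloat16
```

With mergekit installed, a recipe like this is applied with its `mergekit-yaml` command, which writes the merged checkpoint to an output directory.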
### Introduction to SLERP Merging

**Spherical Linear Interpolation (SLERP)** is a method used to smoothly interpolate between two vectors. It maintains a constant rate of change and preserves the geometric properties of the spherical space in which the vectors reside.

SLERP is currently the *most popular* merging method. It is preferred over traditional methods because, instead of interpolating along a straight line, the interpolation occurs on the surface of a sphere, and it has achieved improved performance across very diverse tasks.

> But SLERP is limited to combining only **two models at a time**, although it is possible to combine multiple models hierarchically, as shown in [Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1).
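To make the geometry concrete, here is a small, self-contained sketch of the SLERP formula applied to two flattened weight vectors. It is illustrative only; mergekit's own implementation additionally handles per-layer interpolation factors, dtypes, and other edge cases:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two (flattened) weight vectors."""
    # Measure the angle between the two directions using unit-length copies.
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)

    # Nearly parallel vectors: fall back to plain linear interpolation.
    if abs(dot) > 0.9995:
        return (1.0 - t) * v0 + t * v1

    theta = np.arccos(dot)      # angle between the two vectors
    sin_theta = np.sin(theta)
    w0 = np.sin((1.0 - t) * theta) / sin_theta
    w1 = np.sin(t * theta) / sin_theta
    return w0 * v0 + w1 * v1    # interpolate along the arc rather than the chord

# Toy example: halfway (t = 0.5) between two orthogonal directions.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(slerp(0.5, a, b))  # ~[0.707, 0.707, 0.0]
```

In a real merge, the interpolation factor `t` is what the configuration controls, and it can be varied across layers or parameter groups.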
## 🧩 Configuration

```yaml

```

The merged model can then be used with the `transformers` text-generation pipeline:

```python
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
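The generation call above assumes that a `transformers` `pipeline` and a chat-formatted `prompt` have already been created. A minimal setup might look like the following sketch; the model id is assumed to be this repository's name, and the example message is purely illustrative:

```python
import torch
import transformers
from transformers import AutoTokenizer

# Assumed repository id for this merged model.
model_id = "yuvraj17/Llama3-8B-abliterated-Spectrum-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Illustrative chat prompt, rendered with the model's chat template.
messages = [{"role": "user", "content": "What is spherical linear interpolation?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```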
## 🏆 Evaluation Results
Coming soon
## Special thanks & Reference
- Maxime Labonne for the easy-to-use Colab notebook [Merging LLMs with MergeKit](https://github.com/mlabonne/llm-course/blob/main/Mergekit.ipynb) and the accompanying [blog post](https://towardsdatascience.com/merge-large-language-models-with-mergekit-2118fb392b54)
- The authors of [mergekit](https://github.com/arcee-ai/mergekit)