Update README.md
README.md
CHANGED
@@ -7,9 +7,9 @@ license: apache-2.0

HelixNet-LMoE is a simple LoRA-based Mixture of Experts version of the [HelixNet](https://huggingface.co/migtissera/HelixNet) 3-model system by [Migel Tissera](https://huggingface.co/migtissera).

For each HelixNet model, a separate LoRA adapter was extracted:
-* [HelixNet-LMoE-Actor](rhysjones/HelixNet-LMoE-Actor)
-* [HelixNet-LMoE-Critic](rhysjones/HelixNet-LMoE-Critic)
-* [HelixNet-LMoE-Regenerator](rhysjones/HelixNet-LMoE-Regenerator)
+* [HelixNet-LMoE-Actor](https://huggingface.co/rhysjones/HelixNet-LMoE-Actor)
+* [HelixNet-LMoE-Critic](https://huggingface.co/rhysjones/HelixNet-LMoE-Critic)
+* [HelixNet-LMoE-Regenerator](https://huggingface.co/rhysjones/HelixNet-LMoE-Regenerator)

These are then loaded together with the base [Mistral 7b](https://huggingface.co/mistralai/Mistral-7B-v0.1) model to give the combined LMoE model.
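
For reference, below is a minimal sketch of how the three extracted adapters could be attached to the shared Mistral 7B base using the `peft` library. The adapter names and the explicit `set_adapter` call are illustrative assumptions, not code taken from this commit or repository.

```python
# Sketch only: assumes standard transformers/peft APIs; adapter names are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

# Attach each extracted LoRA adapter to the same base model.
model = PeftModel.from_pretrained(base, "rhysjones/HelixNet-LMoE-Actor", adapter_name="actor")
model.load_adapter("rhysjones/HelixNet-LMoE-Critic", adapter_name="critic")
model.load_adapter("rhysjones/HelixNet-LMoE-Regenerator", adapter_name="regenerator")

# Activate whichever expert is needed for the current generation step.
model.set_adapter("actor")
```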