Update README.md
README.md
CHANGED
@@ -57,7 +57,10 @@ The image shows a summary of model merging approach, constructing larger models
 ### Mixture-of-Experts models
 
 - [lamm-mit/Cephalo-Phi-3-MoE-vision-128k-3x4b-beta](https://huggingface.co/lamm-mit/Cephalo-Phi-3-MoE-vision-128k-3x4b-beta)
-- Mixture-of-expert model based on several smaller Cephalo-Phi-3 models.
+- Mixture-of-experts model based on several smaller Cephalo-Phi-3 models. Provides a sample cookbook for building your own custom MoE vision models.
+
+- [lamm-mit/Cephalo-Idefics2-vision-3x8b-beta](https://huggingface.co/lamm-mit/Cephalo-Idefics2-vision-3x8b-beta)
+- Mixture-of-experts model based on several smaller Idefics-2 models. Provides a sample cookbook for building your own custom MoE vision models.
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/623ce1c6b66fedf374859fe7/NK9KNOxmnVtn_PzwJtKPR.png)
 
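A minimal usage sketch for the Phi-3-based MoE model added above. It assumes (not stated in this diff) that the repository ships its MoE wrapper as custom code, hence `trust_remote_code=True`, and follows the prompt and processor conventions of the base Phi-3-vision models; `image.png` is a placeholder path. The model card is the authoritative reference.

```python
# Hedged sketch: load and query a Cephalo MoE vision model from the Hub.
# Assumptions (not confirmed by this commit): the repo provides custom MoE
# modeling code (trust_remote_code=True) and uses Phi-3-vision-style prompts.
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "lamm-mit/Cephalo-Phi-3-MoE-vision-128k-3x4b-beta"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # pulls in the repo's custom MoE wrapper code
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)

# Phi-3-vision-style prompt: image placeholder token, then the question.
messages = [
    {"role": "user", "content": "<|image_1|>\nWhat does this micrograph show?"}
]
prompt = processor.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

image = Image.open("image.png")  # placeholder input image
inputs = processor(prompt, [image], return_tensors="pt").to(model.device)

with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=256)

# Strip the prompt tokens and keep only the newly generated answer.
answer = processor.batch_decode(
    out[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)[0]
print(answer)
```

The Idefics2-based variant should load the same way, but Idefics2 uses its own chat and image-input format, so the prompt construction differs; consult that model card for specifics.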