---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
- mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp
base_model:
- mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
- mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp
---
# Einstein-4D-MoE-2x7b-test

Einstein-4D-MoE-2x7b-test is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp](https://huggingface.co/mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp)
* [mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp](https://huggingface.co/mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp)
## 🧩 Configuration
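The original merge configuration was not included in this card. A typical LazyMergekit MoE config for these two experts looks like the sketch below; the `base_model` choice and the `positive_prompts` keywords are illustrative assumptions, not the actual settings used for this merge.

```yaml
# Hypothetical mergekit MoE config — base_model and positive_prompts are assumed
base_model: mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
experts:
  - source_model: mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
    positive_prompts:
      - "reason"
      - "explain"
  - source_model: mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp
    positive_prompts:
      - "chat"
      - "assist"
```

In a mergekit MoE config, each expert's `positive_prompts` seed the router so that prompts similar to those phrases are routed to that expert.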
## 💻 Usage
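A usage snippet was not included in this card. The notebook-style example below follows the standard LazyMergekit model-card template with `transformers`; the repo id `mvpmaster/Einstein-4D-MoE-2x7b-test` is an assumption based on the model name and author, so verify it before running.

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

# Assumed repo id — check the actual Hub path for this model
model = "mvpmaster/Einstein-4D-MoE-2x7b-test"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Build a chat prompt using the model's chat template
messages = [{"role": "user", "content": "Explain what a Mixture of Experts is."}]
prompt = pipeline.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
outputs = pipeline(
    prompt, max_new_tokens=256, do_sample=True,
    temperature=0.7, top_k=50, top_p=0.95,
)
print(outputs[0]["generated_text"])
```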