Crystalcareai committed: Update README.md

README.md CHANGED
@@ -4,6 +4,9 @@ license_name: gemma-terms-of-use
 license_link: https://ai.google.dev/gemma/terms
 ---
 
+Note: If you wish to use GemMoE, you must install transformers from my branch using `pip install git+https://github.com/Crystalcareai/transformers.git@GemMoE#egg=transformers`
+
+
 # GemMoE: An 8x8 Mixture Of Experts based on Gemma.
 
 I am delighted to finally be able to share the beta release of GemMoE, a project that has been a labor of love and a testament to the power of collaboration within the AI community. GemMoE is a Mixture of Experts (MoE) model that combines the strength of DeepMind's Gemma architecture with a custom-tailored approach to enable easy training and inference.
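
A minimal inference sketch may help readers verify the install. It assumes the custom transformers branch from the note above is installed and that the standard Hugging Face `AutoModelForCausalLM`/`AutoTokenizer` loading path applies; the model ID below is a placeholder, not a confirmed checkpoint name.

```python
# Minimal GemMoE inference sketch.
# Assumption: transformers was installed from the Crystalcareai GemMoE branch,
# e.g. pip install git+https://github.com/Crystalcareai/transformers.git@GemMoE#egg=transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Crystalcareai/GemMoE"  # hypothetical repo ID; replace with the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # an 8x8 MoE is large; bf16 halves memory vs fp32
    device_map="auto",           # spread layers across available GPUs/CPU
)

prompt = "Explain what a Mixture of Experts model is:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```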