Crystalcareai committed on
Commit
3511011
1 Parent(s): 4cd5fac

Update README.md

Files changed (1): README.md (+3, -0)
README.md CHANGED

````diff
@@ -4,6 +4,9 @@ license_name: gemma-terms-of-use
 license_link: https://ai.google.dev/gemma/terms
 ---
 
+Note: If you wish to use GemMoE, you must install transformers from my branch using ```pip install git+https://github.com/Crystalcareai/transformers.git@GemMoE#egg=transformers```
+
+
 # GemMoE: An 8x8 Mixture Of Experts based on Gemma.
 
 I am delighted to finally be able to share the beta release of GemMoE, a project that has been a labor of love and a testament to the power of collaboration within the AI community. GemMoE is a Mixture of Experts (MoE) model that combines the strength of Deepmind's Gemma architecture with a custom-tailored approach to enable easy training and inference.
````