mlabonne committed
Commit 9a67aa9
Parent: 83d8a3c

Update README.md

Files changed (1):
  README.md +4 -0
README.md CHANGED
@@ -21,6 +21,10 @@ tags:
 
 phixtral-4x2_8 is the first Mixture of Experts (MoE) made with four [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) models, inspired by the [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) architecture. It performs better than each individual expert.
 
+## ⚡ Quantized models
+
+* **GPTQ**: https://huggingface.co/TheBloke/phixtral-4x2_8-GPTQ
+
 ## 🏆 Evaluation
 
 | Model |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
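A minimal sketch of loading the GPTQ quant linked in the new section, assuming `transformers` with a GPTQ backend (e.g. `auto-gptq` or `optimum`) is installed; `trust_remote_code=True` is assumed to be required since phixtral ships its MoE routing as custom model code. TheBloke's repo remains the authoritative source for exact setup instructions.

```python
# Sketch: load the GPTQ-quantized phixtral and run a short generation.
# Assumes a CUDA GPU and a GPTQ-capable transformers install.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/phixtral-4x2_8-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place quantized weights on available devices
    trust_remote_code=True,  # assumed: phixtral's MoE layers are custom code
)

inputs = tokenizer("Explain what a Mixture of Experts is.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```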