liminerity/Memgpt-3x7b-MOE AWQ

Model Summary

Memgpt-3x7b-MOE is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:

Model size: 2.7B params (Safetensors)
Tensor types: I32 · FP16
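Usage

A minimal loading and generation sketch with the Hugging Face transformers library is shown below. It assumes the checkpoint loads through the standard transformers AWQ path (the autoawq package must be installed); the prompt string and generation settings are placeholders, and the expected chat format depends on the underlying MemGPT base models.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Memgpt-3x7b-MOE-AWQ"

# Load the AWQ-quantized checkpoint; transformers dispatches to AWQ kernels
# when the autoawq package is installed.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt; the real chat template depends on the base models.
prompt = "You are a helpful assistant.\nUser: What is AWQ quantization?\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```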
