---
license: bigcode-openrail-m
---

# OpenMoE

A family of open-source Mixture-of-Experts (MoE) large language models.

Please see the [OpenMoE GitHub repository](https://github.com/XueFuzhao/OpenMoE/tree/main) for detailed information.
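
As a rough orientation, a checkpoint hosted here would typically be loaded with the Hugging Face `transformers` library. The snippet below is a minimal sketch only: the repository id is a placeholder (not confirmed by this card), and it assumes the checkpoint ships custom modeling code loaded via `trust_remote_code`. Consult the linked GitHub repository for the actual checkpoints and usage instructions.

```python
# Minimal sketch (unverified): loading an OpenMoE checkpoint with transformers.
# The repository id below is a placeholder -- substitute the checkpoint name
# listed in the linked GitHub repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/openmoe-checkpoint"  # hypothetical id, not from this card

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Simple greedy generation as a smoke test.
inputs = tokenizer("OpenMoE is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```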