---
license: bigcode-openrail-m
---
# OpenMoE
A family of open-source Mixture-of-Experts (MoE) large language models.

Please see the [OpenMoE repository](https://github.com/XueFuzhao/OpenMoE/tree/main) for detailed information.
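For readers unfamiliar with the Mixture-of-Experts idea the model name refers to: each token is routed by a small gating network to a subset (top-k) of expert sub-networks, and their outputs are combined weighted by the gate's scores. The sketch below is a minimal, illustrative NumPy version of top-k routing, not OpenMoE's actual implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Minimal top-k MoE layer sketch (illustrative, not OpenMoE's code).

    x:         (tokens, d)       token representations
    gate_w:    (d, n_experts)    gating network weights
    expert_ws: list of (d, d)    one weight matrix per expert
    """
    logits = x @ gate_w                       # (tokens, n_experts) gate scores
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(logits[t])[-top_k:]  # indices of the top-k experts
        scores = np.exp(logits[t][top])
        scores /= scores.sum()                # softmax over the selected experts
        for w, e in zip(scores, top):
            out[t] += w * (x[t] @ expert_ws[e])   # weighted expert outputs
    return out
```

Because only `top_k` of the experts run per token, an MoE model can have far more total parameters than it activates for any single forward pass, which is the trade-off OpenMoE explores.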