---
title: README
emoji: 🐎
colorFrom: gray
colorTo: yellow
sdk: static
pinned: true
license: apache-2.0
short_description: SMoE language models trained in Brazilian Portuguese.
---
<div align="center">

# Mula: a Sparse Mixture of Experts Language Model trained in Brazilian Portuguese

</div>
<p align="center">
<img src="./logo-no-bg.png" alt="Mula" height="400">
</p>
Mula is a series of Sparse Mixture of Experts (SMoE) language models, all trained natively in Brazilian Portuguese, designed to help democratize LLMs for low-resource languages.
Models and datasets are coming soon...
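To illustrate the core idea behind a Sparse Mixture of Experts layer, here is a minimal, hypothetical sketch of top-k expert routing in plain NumPy. This is a toy example for readers unfamiliar with SMoE, not Mula's actual implementation: the function name, shapes, and `k=2` choice are assumptions for illustration only.

```python
# Toy sketch of top-k expert routing, the mechanism that makes a
# Mixture of Experts "sparse": only k of n experts run per input.
# Hypothetical example code, not Mula's implementation.
import numpy as np

def smoe_layer(x, expert_weights, gate_weights, k=2):
    """Route input x through the top-k of n experts.

    x: (d,) input vector
    expert_weights: (n, d, d) one weight matrix per expert
    gate_weights: (n, d) router that scores each expert
    """
    logits = gate_weights @ x            # (n,) one score per expert
    top = np.argsort(logits)[-k:]        # indices of the k best-scoring experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                 # softmax over the selected experts only
    # Only the k selected experts compute; the remaining n - k stay idle.
    return sum(g * (expert_weights[i] @ x) for g, i in zip(gates, top))

rng = np.random.default_rng(0)
d, n = 4, 8
y = smoe_layer(rng.standard_normal(d),
               rng.standard_normal((n, d, d)),
               rng.standard_normal((n, d)))
print(y.shape)  # (4,)
```

The sparsity is what lets SMoE models grow total parameter count (many experts) while keeping the per-token compute cost close to that of a single dense expert.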