|
--- |
|
title: README |
|
emoji: π |
|
colorFrom: gray |
|
colorTo: yellow |
|
sdk: static |
|
pinned: true |
|
license: apache-2.0 |
|
short_description: SMoE language models for Brazilian Portuguese.
|
--- |
|
<div align="center"> |
|
|
|
# Mula: a Sparse Mixture of Experts Language Model trained in Brazilian Portuguese |
|
|
|
</div> |
|
<p align="center"> |
|
<img src="./logo-no-bg.png" alt="Mula" height="400"> |
|
</p> |
|
|
|
Mula is a series of Sparse Mixture of Experts (SMoE) language models trained natively in Brazilian Portuguese, aimed at democratizing LLMs for low-resource languages.
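To make the "Sparse Mixture of Experts" idea concrete: in an SMoE layer, a small gating network scores a set of expert sub-networks for each token, only the top-k experts are actually run, and their outputs are mixed by the renormalized gate scores. The sketch below is a minimal, generic illustration of top-k routing with toy linear experts; it is not Mula's actual architecture, and all names and shapes are illustrative assumptions.

```python
import numpy as np

def smoe_layer(x, experts, gate_w, k=2):
    """Route a token vector x to its top-k experts and mix their
    outputs by the renormalized softmax gate scores (toy sketch)."""
    logits = x @ gate_w                       # (d,) @ (d, n_experts) -> (n_experts,)
    topk = np.argsort(logits)[-k:]            # indices of the k highest-scoring experts
    scores = np.exp(logits[topk] - logits[topk].max())
    scores /= scores.sum()                    # renormalize over the selected experts only
    # only the k selected experts are evaluated -- this is the "sparse" part
    return sum(s * experts[i](x) for s, i in zip(scores, topk))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
# toy experts: independent linear maps standing in for FFN sub-networks
weights = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda x, W=W: x @ W for W in weights]
gate_w = rng.normal(size=(d, n_experts))

x = rng.normal(size=d)
y = smoe_layer(x, experts, gate_w, k=2)
print(y.shape)
```

Because only k of the n experts run per token, an SMoE model can hold far more total parameters than it activates on any single forward pass, which is what makes the approach attractive for training capable models on a constrained budget.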
|
|
|
Models and datasets are coming soon... |