---
language: en
---
This is a Hugging Face transformers-compatible conversion of the original dense 1.3B-parameter model from the paper "[Efficient Large Scale Language Modeling with Mixtures of Experts](https://arxiv.org/abs/2112.10684)" by Artetxe et al. For further details, please refer to the original model card at https://github.com/facebookresearch/fairseq/blob/main/examples/moe_lm/model_card.md.
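
Since this is a transformers-compatible conversion, it can be loaded with the standard `AutoTokenizer`/`AutoModelForCausalLM` API. A minimal usage sketch follows; the Hub repository id shown is an assumption and should be replaced with the actual id of this conversion.

```python
# Minimal usage sketch. The repo id below is an assumption --
# substitute the actual Hugging Face Hub id of this conversion.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fairseq-dense-1.3B"  # assumed Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Efficient language modeling", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this loads the full 1.3B-parameter model into memory, so a GPU (or patience on CPU) is advisable.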