jubueche committed on
Commit 570a46c
1 Parent(s): 3dfe707

Update README.md

Small Mixture of Experts (MoE) trained on a pure language modelling task (WikiText-103, document level). This MoE was presented in [this paper](https://aclanthology.org/2023.findings-emnlp.49/).

Files changed (1)
  1. README.md +7 -1
README.md CHANGED
@@ -1,3 +1,9 @@
  ---
  license: apache-2.0
- ---
+ datasets:
+ - EleutherAI/wikitext_document_level
+ language:
+ - en
+ metrics:
+ - perplexity
+ ---
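
For reference, the new metadata fields map onto the usual Hugging Face evaluation flow. The sketch below is illustrative only and not part of this commit: the model id is a hypothetical placeholder, the checkpoint is assumed to load as a standard causal LM, and the dataset config name and `page` column are assumptions based on the `EleutherAI/wikitext_document_level` dataset card.

```python
# Hedged sketch: perplexity on document-level WikiText-103, per the new metadata.
import math
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jubueche/moe-wikitext103"  # hypothetical id; replace with the actual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

# Config name "wikitext-103-v1" and column "page" are assumptions from the dataset card.
test = load_dataset("EleutherAI/wikitext_document_level", "wikitext-103-v1", split="test")

nll, n_tokens = 0.0, 0
with torch.no_grad():
    for doc in test["page"][:10]:  # small subset, for illustration only
        enc = tokenizer(doc, return_tensors="pt", truncation=True, max_length=1024)
        out = model(**enc, labels=enc["input_ids"])
        n = enc["input_ids"].numel()
        nll += out.loss.item() * n  # approximate token-weighted average of the mean loss
        n_tokens += n

print("perplexity:", math.exp(nll / n_tokens))
```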