---
license: apache-2.0
language:
  - en
---

# TinyMix-8x1b

This model is a Mixture of Experts (MoE) built from 8 copies of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T serving as the experts.
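
Merges like this are often produced with mergekit's `mergekit-moe` tool. The exact recipe for this model isn't published here, so the config below is only a hypothetical sketch; the gate mode and dtype are assumptions:

```yaml
# Hypothetical mergekit-moe config: 8 identical TinyLlama experts with a
# randomly initialized router. Not the author's published recipe.
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
gate_mode: random   # assumed: router weights are randomly initialized
dtype: bfloat16     # assumed output dtype
experts:
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
```

With a config like this, `mergekit-moe config.yml ./tinymix-8x1b` would write the merged weights to `./tinymix-8x1b`.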

This merge is untrained and will likely perform worse than the dense base model.
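
A minimal inference sketch with transformers; the repo id is assumed from this page, and the prompt and generation settings are purely illustrative:

```python
# Load the merged MoE and generate a short completion.
# Assumes transformers >= 4.36 (Mixtral support) and accelerate installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mhenrichsen/tinymix-8x1b"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```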

Training will start very soon.

Idea by eastwind, who applied the same approach to the chat version of the model.