---
license: apache-2.0
language:
- en
---
<div align="center">

# TinyMix-8x1b
</div>

This model is a Mixture-of-Experts (MoE) model whose 8 experts are each copies of [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T).
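
This card doesn't publish the build script, but merges like this typically follow a standard recipe: clone the dense model's MLP into every expert of a Mixtral-style config and leave the router randomly initialized. The sketch below illustrates that general recipe; the Mixtral-style target and all weight mappings are assumptions, not this repo's actual construction.

```python
# Sketch only: assumes a Mixtral-style MoE target and the common
# "clone a dense Llama into N experts" recipe; not this repo's build script.
import torch
from transformers import AutoModelForCausalLM, MixtralConfig, MixtralForCausalLM

dense = AutoModelForCausalLM.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"
)
cfg = dense.config

moe_config = MixtralConfig(
    vocab_size=cfg.vocab_size,
    hidden_size=cfg.hidden_size,
    intermediate_size=cfg.intermediate_size,
    num_hidden_layers=cfg.num_hidden_layers,
    num_attention_heads=cfg.num_attention_heads,
    num_key_value_heads=cfg.num_key_value_heads,
    max_position_embeddings=cfg.max_position_embeddings,
    rms_norm_eps=cfg.rms_norm_eps,
    rope_theta=cfg.rope_theta,
    num_local_experts=8,    # eight TinyLlama experts
    num_experts_per_tok=2,  # assumed top-2 routing, as in Mixtral
)
moe = MixtralForCausalLM(moe_config)

with torch.no_grad():
    # Shared (non-expert) weights copy over one-to-one.
    moe.model.embed_tokens.weight.copy_(dense.model.embed_tokens.weight)
    moe.model.norm.weight.copy_(dense.model.norm.weight)
    moe.lm_head.weight.copy_(dense.lm_head.weight)
    for moe_layer, dense_layer in zip(moe.model.layers, dense.model.layers):
        moe_layer.self_attn.load_state_dict(dense_layer.self_attn.state_dict())
        moe_layer.input_layernorm.weight.copy_(dense_layer.input_layernorm.weight)
        moe_layer.post_attention_layernorm.weight.copy_(
            dense_layer.post_attention_layernorm.weight
        )
        # Every expert gets an identical copy of the dense MLP; the router
        # (block_sparse_moe.gate) stays randomly initialized, which is why
        # the merged model needs training before it can match the dense one.
        for expert in moe_layer.block_sparse_moe.experts:
            expert.w1.weight.copy_(dense_layer.mlp.gate_proj.weight)
            expert.w2.weight.copy_(dense_layer.mlp.down_proj.weight)
            expert.w3.weight.copy_(dense_layer.mlp.up_proj.weight)

moe.save_pretrained("tinymix-8x1b")  # illustrative output path
```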

This model is untrained and will likely perform worse than the dense version until the routing weights are trained.

Training will start very soon.

The idea comes from eastwind, who applied it to the [chat version of the model](https://huggingface.co/eastwind/tinymix-8x1b-chat).
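
For reference, a minimal loading sketch with `transformers`, assuming the checkpoint ships with a Mixtral-style config that `AutoModelForCausalLM` can read. The repo id below is a placeholder since this card doesn't state it, and output quality will reflect the untrained router.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tinymix-8x1b"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```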