---
tags:
- moe
- llama
- '3'
- llama 3
- 2x8b
---
<img src="https://i.imgur.com/eFrFD6v.jpeg" alt="drawing" width="640"/>

# Llama-3-Teal-Instruct-2x8B-MoE
This is an experimental MoE created from [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) and [nvidia/Llama3-ChatQA-1.5-8B](https://huggingface.co/nvidia/Llama3-ChatQA-1.5-8B) using Mergekit.

Green + Blue = Teal.

Mergekit yaml file:
```yaml
base_model: Meta-Llama-3-8B-Instruct
experts:
  - source_model: Meta-Llama-3-8B-Instruct
    positive_prompts:
    - "explain"
    - "chat"
    - "assistant"
  - source_model: Llama3-ChatQA-1.5-8B
    positive_prompts:
    - "python"
    - "math"
    - "solve"
    - "code"
gate_mode: hidden
dtype: float16
```
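To reproduce the merge, a minimal invocation might look like the following sketch. It assumes mergekit is installed via pip and that the yaml above has been saved as `config.yaml` (both the filename and the output directory are placeholders, not part of the original recipe):

```shell
# Install mergekit, which provides the mergekit-moe entry point
pip install mergekit

# Build the 2x8B MoE from the yaml recipe above (saved as config.yaml);
# the output directory name here is just an example
mergekit-moe config.yaml ./Llama-3-Teal-Instruct-2x8B-MoE
```

With `gate_mode: hidden`, mergekit derives the router gates from the base model's hidden states for each expert's `positive_prompts`, so the listed prompts steer which expert a token is routed to.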