# L3-MOE-4x8B-Dark-Planet-Rising-25B / mergekit_moe_config.yml
base_model: E:/L3-Dark-Planet-8B-wordstorm-cr2   # supplies the shared (non-expert) layers: attention, embeddings, etc.
gate_mode: random        # router weights are randomly initialized (no hidden-state calibration prompts)
dtype: bfloat16          # precision of the merged output tensors
experts_per_token: 2     # top-k routing: two experts are activated per token
experts:
- source_model: E:/L3-Dark-Planet-8B-wordstorm-cr1
- source_model: E:/L3-Dark-Planet-8B-wordstorm-r7
- source_model: E:/L3-Dark-Planet-8B-wordstorm-r7   # r7 is listed twice, so it occupies two of the four expert slots
- source_model: E:/L3-Dark-Planet-8B-wordstorm-cr2
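# A minimal usage sketch for building the merge from this config, assuming
# mergekit is installed and the E:/ model paths above exist locally; the
# output directory name here is illustrative, not part of the config:
#
#   mergekit-moe mergekit_moe_config.yml ./L3-MOE-4x8B-Dark-Planet-Rising-25B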