---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- openaccess-ai-collective/tiny-mistral
base_model:
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
---

# test_tiny_mixtral_only_router
test_tiny_mixtral_only_router is a Mixture of Experts (MoE) made with the following models using a modified version of [mergekit](https://github.com/arcee-ai/mergekit):
- [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
- [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
- [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
- [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
## 🧩 Configuration
```yaml
base_model: openaccess-ai-collective/tiny-mistral
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "math"
    # You can add negative_prompts if needed
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "science"
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "writing"
    # You can add negative_prompts if needed
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "general"
```
This is a test artifact for arcee-ai's hidden-state gate mode: it contains only the router for a frankenMoE, not the entire MoE itself.
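## 💻 Usage

A minimal sketch of loading the merged model with 🤗 Transformers. The hub repo id below is a placeholder assumption; substitute the actual repository name once the model is published.

```python
# Load the merged MoE with a standard text-generation pipeline.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "your-username/test_tiny_mixtral_only_router"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

output = generator("Mixture of Experts models work by", max_new_tokens=64, do_sample=True)
print(output[0]["generated_text"])
```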