# RP-Llama-4x8B-MoE

This is my first Llama 3 MoE model, built with the following config:

base_model: Llama-3-RPMerge-8B-SLERP

experts:

- source_model: Llama-3-RPMerge-8B-SLERP
- source_model: WesPro_Daring_Llama
- source_model: Chaos_RP_l3_8B
- source_model: llama-3-stinky-8B

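For reference, the list above corresponds roughly to a mergekit-moe layout like the sketch below. The `gate_mode`, `dtype`, and `positive_prompts` fields are assumed placeholders for illustration and are not part of the original merge recipe.

```yaml
# Sketch of a mergekit-moe config matching the experts listed above.
# gate_mode, dtype, and the positive_prompts entries are assumed
# placeholders, not values taken from the original merge.
base_model: Llama-3-RPMerge-8B-SLERP
gate_mode: hidden        # assumed; mergekit-moe also supports cheap_embed / random
dtype: bfloat16          # assumed
experts:
  - source_model: Llama-3-RPMerge-8B-SLERP
    positive_prompts:
      - "placeholder prompt for this expert"
  - source_model: WesPro_Daring_Llama
    positive_prompts:
      - "placeholder prompt for this expert"
  - source_model: Chaos_RP_l3_8B
    positive_prompts:
      - "placeholder prompt for this expert"
  - source_model: llama-3-stinky-8B
    positive_prompts:
      - "placeholder prompt for this expert"
```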
It's meant for RP and does pretty well at it, but I haven't tested it extensively yet.