---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Dazzling-Star-Aurora-32b-v0.0

*If somewhere amid that aimlessly drifting sky, there was a planet where our wishes could flow free... would we try to make it there? I wonder what we'd wish for if we did...~*

Listen to the song on YouTube: https://www.youtube.com/watch?v=e1EExQiRhC0

Story behind it: I was bored at midnight and decided to create a merge, I guess. This model is the result. I like it, so try it out?

Models:
- EVA-UNIT-01/EVA-Qwen2.5-32B-v0.2
- ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3
- Qwen/Qwen2.5-32B

# Instruct Format: ChatML

Thank you to AuriAetherwiing for providing compute and helping merge the models.

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method with Qwen/Qwen2.5-32B as the base.

### Models Merged

The following models were included in the merge:
* ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3
* EVA-UNIT-01/EVA-Qwen2.5-32B-v0.2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: EVA-UNIT-01/EVA-Qwen2.5-32B-v0.2
    parameters:
      weight: 0.3
      density: 0.7
  - model: ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3
    parameters:
      weight: 0.4
      density: 0.8
base_model: Qwen/Qwen2.5-32B
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
  normalize: true
merge_method: ties
dtype: bfloat16
```
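To reproduce the merge, this configuration can in principle be saved as `config.yaml` and passed to mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./Dazzling-Star-Aurora-32b-v0.0`); exact flags depend on your mergekit version and hardware.

### Usage Sketch

The merged model should behave as a standard `transformers` causal LM prompted in ChatML (`<|im_start|>role ... <|im_end|>`). Below is a minimal sketch, not a confirmed recipe: `model_id` is a placeholder for wherever the merged weights end up (the local merge output or a Hub repo), and it assumes the tokenizer inherits a ChatML chat template from its Qwen2.5-based parents.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: point this at the local merge output or the published repo.
model_id = "Dazzling-Star-Aurora-32b-v0.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

# ChatML-style messages; apply_chat_template assumes the tokenizer ships a
# ChatML template, as the Qwen2.5-based parent models do.
messages = [
    {"role": "system", "content": "You are a creative roleplay assistant."},
    {"role": "user", "content": "Describe a planet where wishes can flow free."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```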