---
base_model:
- SteelSkull/MSM-MS-Cydrion-22B
- knifeayumu/Cydonia-v1.3-Magnum-v4-22B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:
* [SteelSkull/MSM-MS-Cydrion-22B](https://huggingface.co/SteelSkull/MSM-MS-Cydrion-22B)
* [knifeayumu/Cydonia-v1.3-Magnum-v4-22B](https://huggingface.co/knifeayumu/Cydonia-v1.3-Magnum-v4-22B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: knifeayumu/Cydonia-v1.3-Magnum-v4-22B
  - model: SteelSkull/MSM-MS-Cydrion-22B
merge_method: slerp
base_model: SteelSkull/MSM-MS-Cydrion-22B
dtype: bfloat16
parameters:
  t: [0.030, 0.056, 0.093, 0.144, 0.224, 0.312, 0.409, 0.525, 0.614, 0.589, 0.559, 0.521, 0.437, 0.339, 0.230]
  # Interpolation weight rises toward the middle layers and falls off again:
  # the base model (MSM-MS-Cydrion-22B) dominates the input and output layers,
  # while Cydonia-v1.3-Magnum-v4-22B is weighted most heavily in the middle layers.
```
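
### Usage

A minimal inference sketch with the standard `transformers` API. The repository id below is a placeholder (this card does not state the final upload name), and the generation settings are illustrative, not recommendations from the model author:

```python
# Usage sketch: load the SLERP-merged 22B model and generate text.
# "your-namespace/your-merge-22B" is a placeholder; replace it with the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/your-merge-22B"  # placeholder repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 dtype used for the merge
    device_map="auto",
)

prompt = "Write a short scene set in a rain-soaked city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```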