---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

**B-NIMITA** is an AI model designed to bring role-playing scenarios to life with emotional depth and rich storytelling. At its core is NIHAPPY, which provides a solid narrative foundation and contextual consistency. This is enhanced by Mythorica, which adds vivid emotional arcs and expressive dialogue, and by V-Blackroot, which ensures character consistency and subtle adaptability. Together, these components allow B-NIMITA to deliver dynamic, engaging interactions that feel natural and immersive.

* Recommended ST presets: **[Domain Fusion Presets](https://huggingface.co/ChaoticNeutrals/Domain-Fusion-L3-8B/tree/main/Domain_Fusion-Presets)**

---
# output-model-directory

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with **[Arkana08/NIHAPPY-L3.1-8B-v0.09](https://huggingface.co/Arkana08/NIHAPPY-L3.1-8B-v0.09)** as the base.

### Models Merged

The following models were included in the merge:

* **[Hastagaras/Jamet-8B-L3-MK.V-Blackroot](https://huggingface.co/Hastagaras/Jamet-8B-L3-MK.V-Blackroot)**
* **[Arkana08/NIHAPPY-L3.1-8B-v0.09](https://huggingface.co/Arkana08/NIHAPPY-L3.1-8B-v0.09)**
* **[Arkana08/Mythorica-L3-8B](https://huggingface.co/Arkana08/Mythorica-L3-8B)**

### Configuration

- **Primary model**: **NIHAPPY** (base) - balances core narrative flow and contextual awareness.

Additional models:

- **Mythorica** - adds expressive flair, strong emotional arcs, and detailed dialogue.
- **V-Blackroot** - maintains character consistency, subtle emotional undertones, and adaptability in scene development.
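As a rough intuition for what DARE-TIES does per parameter, here is a toy Python sketch. This is *not* mergekit's actual implementation (all function names and values below are illustrative): DARE randomly drops a fraction `1 - density` of each model's task vector (its weights minus the base model's) and rescales the survivors by `1 / density` to preserve the expected delta; TIES then elects a majority sign per parameter and sums only the agreeing weighted contributions.

```python
import random

def dare_sparsify(delta, density, rng):
    """DARE step (toy version): keep each entry with probability
    `density`, zero the rest, and rescale survivors by 1/density
    so the expected value of the delta is unchanged."""
    return [d / density if rng.random() < density else 0.0 for d in delta]

def ties_merge(deltas, weights):
    """TIES step (toy version): per parameter, elect the sign of the
    weighted sum, then sum only the contributions agreeing with it."""
    merged = []
    for entries in zip(*deltas):
        total = sum(w * d for w, d in zip(weights, entries))
        sign = 1.0 if total >= 0 else -1.0
        merged.append(sum(w * d for w, d in zip(weights, entries)
                          if d * sign > 0))
    return merged

rng = random.Random(0)
# Hypothetical 4-entry task vectors standing in for real model deltas,
# sparsified with the densities from the config below.
mytho = dare_sparsify([0.2, -0.1, 0.3, 0.05], density=0.6, rng=rng)
nihappy = dare_sparsify([0.1, 0.2, -0.2, 0.1], density=0.7, rng=rng)
blackroot = dare_sparsify([-0.3, 0.1, 0.1, 0.0], density=0.55, rng=rng)

# Combine with the per-model weights from the config below.
merged_delta = ties_merge([mytho, nihappy, blackroot],
                          weights=[0.4, 0.35, 0.25])
```

In the real merge this runs over every weight tensor of the three Llama-3(.1) 8B models, and the merged delta is added back onto the base model's weights.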
The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Arkana08/Mythorica-L3-8B
    parameters:
      weight: 0.4
      density: 0.6
  - model: Arkana08/NIHAPPY-L3.1-8B-v0.09
    parameters:
      weight: 0.35
      density: 0.7
  - model: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
    parameters:
      weight: 0.25
      density: 0.55
merge_method: dare_ties
base_model: Arkana08/NIHAPPY-L3.1-8B-v0.09
parameters:
  int8_mask: true
dtype: bfloat16
```

## Credits

Thanks to the creators of the models:

* **[Hastagaras/Jamet-8B-L3-MK.V-Blackroot](https://huggingface.co/Hastagaras/Jamet-8B-L3-MK.V-Blackroot)**