---
language:
- en
base_model:
- Krisbiantoro/mistral7b_dpo_en
library_name: transformers
tags:
- mergekit
- merge
- quantized
- 4-bit
- AWQ
- transformers
- pytorch
- mistral
- text-generation
- conversational
- autotrain_compatible
- endpoints_compatible
- text-generation-inference
- chatml
license: other
model_creator: jeiku
model_name: Mewthree-7B
model_type: mistral
pipeline_tag: text-generation
inference: false
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: Suparious
---

# jeiku/Mewthree-7B AWQ

- Model creator: [jeiku](https://huggingface.co/jeiku)
- Original model: [Mewthree-7B](https://huggingface.co/jeiku/Mewthree_7B)

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/mfFtubKCh143741_enqN8.jpeg)

## Model Summary

Draws upon the Prodigy lineage, with some No Robots tossed in for good measure. Dipped its toes in some memerboard essence and added a kiss of BioMistral for anatomy, then applied a DPO LoRA over the top. It seems to handle markdown well and is an overall balanced model with a focus on RP.
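
## How to Use

The card does not ship usage instructions, so below is a minimal sketch of loading this AWQ quant with `transformers` and prompting it with the ChatML template from the metadata above. The repository id `solidrust/Mewthree-7B-AWQ` is an assumption (point it at the actual quantized repo), and it presumes `autoawq`, a recent `transformers`, and a CUDA GPU are available.

```python
# Minimal sketch: load the AWQ-quantized model and generate with a ChatML prompt.
# The repo id below is an assumption; replace it with the actual AWQ repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Mewthree-7B-AWQ"  # hypothetical location of this quant

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a prompt following the ChatML template declared in the card metadata.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful roleplay assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Describe your character in a short markdown profile.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Strip the prompt tokens and print only the generated completion.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

The same weights should also work with vLLM or text-generation-inference, since both support AWQ checkpoints; the snippet above only illustrates the plain `transformers` path.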