---
base_model:
- microsoft/phi-3.5-mini-instruct
- AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common
library_name: transformers
tags:
- mergekit
- merge
license: mit
license_link: https://huggingface.co/microsoft/Phi-3.5-mini-instruct/resolve/main/LICENSE
language:
- en
- ja
inference: true
pipeline_tag: text-generation
widget:
- messages:
  - role: user
    content: こんにちは!
- messages:
  - role: user
    content: 魚を捌くのは難しいですか?
- messages:
  - role: user
    content: ナイジェリアの首都はどこですか?
- messages:
  - role: user
    content: hello!
- messages:
  - role: user
    content: 貝は砂浜に落ちてるものですか?
- messages:
  - role: user
    content: おはようございます。
- messages:
  - role: user
    content: 錫はどういうものに使われますか?
- messages:
  - role: user
    content: 露骨とあからさまが違う言葉であることを証明してください。
- messages:
  - role: user
    content: 你好
- messages:
  - role: user
    content: Où se trouve Shinjuku ?
- messages:
  - role: user
    content: Bonjour!
---

This model builds on HODACHI/Borea-Phi-3.5-mini-Instruct-Common, a fine-tune of Phi-3.5-mini-Instruct by Axcxept Co., Ltd.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [microsoft/phi-3.5-mini-instruct](https://huggingface.co/microsoft/phi-3.5-mini-instruct) as the base model.

### Models Merged

The following models were included in the merge:

* [AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common](https://huggingface.co/AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: microsoft/phi-3.5-mini-instruct
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: float16
```
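For intuition, the TIES method used above works in three steps per parameter tensor: trim each task vector (fine-tuned weights minus base weights) to its largest-magnitude `density` fraction, elect a per-parameter sign by majority, then average only the values that agree with the elected sign. Below is a minimal sketch on flat NumPy arrays; it is an illustration of the algorithm from the paper, not mergekit's actual implementation, and the function name and signature are hypothetical:

```python
import numpy as np

def ties_merge(base, finetuned_list, density=0.5, weights=None):
    """Toy TIES merge over flat parameter vectors (illustration only)."""
    if weights is None:
        weights = [1.0] * len(finetuned_list)
    # 1. Task vectors: each fine-tune's difference from the base, scaled by its weight.
    deltas = [w * (ft - base) for w, ft in zip(weights, finetuned_list)]
    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # 3. Elect sign: per-parameter majority sign, weighted by summed magnitude.
    sign = np.where(np.sum(trimmed, axis=0) >= 0, 1.0, -1.0)
    # 4. Merge: average only the entries whose sign agrees with the elected one.
    agree = [np.where(np.sign(t) == sign, t, 0.0) for t in trimmed]
    num = np.sum(agree, axis=0)
    den = np.sum([a != 0 for a in agree], axis=0)
    merged_delta = np.where(den > 0, num / np.maximum(den, 1), 0.0)
    return base + merged_delta
```

Note that with a single donor model at `weight: 1` and `density: 1`, as in the configuration above, the merge reduces to the fine-tuned weights themselves; the TIES machinery matters when combining several fine-tunes or when `density < 1` sparsifies the task vector.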