---
base_model:
- deepseek-ai/deepseek-coder-6.7b-instruct
- m-a-p/OpenCodeInterpreter-DS-6.7B
- deepseek-ai/deepseek-coder-6.7b-base
library_name: transformers
tags:
- mergekit
- merge
---

# output-model-directory
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
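Since the card declares `library_name: transformers`, the merged checkpoint can be loaded like any other causal LM. Below is a minimal sketch, assuming the merge output lives in the local `output-model-directory` named above (swap in the Hugging Face repo id if the model has been uploaded):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# assumption: the merge output directory from this card; use the hub repo id if uploaded
model_path = "./output-model-directory"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype="auto",   # the config below sets dtype: float16
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```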
## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [deepseek-ai/deepseek-coder-6.7b-base](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base) as the base model.
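For intuition, the sketch below outlines the TIES procedure on a single parameter tensor: compute task vectors against the base, trim each to a target density by magnitude, elect a sign per parameter, and average only the entries that agree with the elected sign. It is an illustrative simplification, not mergekit's implementation; the function name and tensor-level interface are invented for the example, and the per-layer density/weight gradients in the config below are reduced to plain scalars here.

```python
import torch

def ties_merge_tensor(base, finetuned, densities, weights):
    """Illustrative TIES merge of one parameter tensor (not mergekit's code)."""
    trimmed = []
    for ft, density, weight in zip(finetuned, densities, weights):
        tv = ft - base                                   # task vector
        k = max(1, int(density * tv.numel()))            # entries to keep
        # trim: zero out all but the top-k entries by magnitude
        threshold = tv.abs().flatten().kthvalue(tv.numel() - k + 1).values
        tv = torch.where(tv.abs() >= threshold, tv, torch.zeros_like(tv))
        trimmed.append(weight * tv)

    stacked = torch.stack(trimmed)
    elected = torch.sign(stacked.sum(dim=0))             # elect a sign per parameter
    agree = (torch.sign(stacked) == elected) & (stacked != 0)
    count = agree.sum(dim=0).clamp(min=1)
    merged_tv = (stacked * agree).sum(dim=0) / count     # mean of agreeing entries only
    return base + merged_tv

# toy usage with random tensors standing in for real model weights
base_w = torch.randn(1024, 1024)
merged = ties_merge_tensor(
    base_w,
    finetuned=[base_w + 0.01 * torch.randn_like(base_w),
               base_w + 0.01 * torch.randn_like(base_w)],
    densities=[0.7, 0.5],
    weights=[1.0, 0.7],
)
```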
### Models Merged

The following models were included in the merge:

* [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
* [m-a-p/OpenCodeInterpreter-DS-6.7B](https://huggingface.co/m-a-p/OpenCodeInterpreter-DS-6.7B)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: deepseek-ai/deepseek-coder-6.7b-instruct
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: m-a-p/OpenCodeInterpreter-DS-6.7B
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  # - model: NousResearch/Nous-Hermes-llama-2-7b
  #   parameters:
  #     density: 0.33
  #     weight:
  #       - filter: mlp
  #         value: 0.5
  #       - value: 0
merge_method: ties
base_model: deepseek-ai/deepseek-coder-6.7b-base
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
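To reproduce the merge from this configuration, mergekit can be driven from its CLI or from Python. The sketch below assumes mergekit's Python entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`) as documented in its README at the time of writing; option names may differ between versions, and it assumes the YAML above is saved as `config.yml`.

```python
# Sketch: run the merge above with mergekit's Python API.
# Assumes the YAML above is saved as config.yml; option names follow
# mergekit's README and may change between versions.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./output-model-directory",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```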