---
base_model:
  - uukuguy/speechless-code-mistral-7b-v1.0
  - upaya07/Arithmo2-Mistral-7B
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
---

# CodeCalc-Mistral-7B


This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit): uukuguy/speechless-code-mistral-7b-v1.0 and upaya07/Arithmo2-Mistral-7B, combined with the TIES merge method.

## Configuration

The following YAML configuration was used to produce this model:


```yaml
base_model: uukuguy/speechless-code-mistral-7b-v1.0
dtype: bfloat16
merge_method: ties
models:
- model: uukuguy/speechless-code-mistral-7b-v1.0
- model: upaya07/Arithmo2-Mistral-7B
  parameters:
    density: [0.25, 0.35, 0.45, 0.35, 0.25]
    weight: [0.1, 0.25, 0.5, 0.25, 0.1]
parameters:
  int8_mask: true
```
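
## Usage

Since the card lists `library_name: transformers` and `pipeline_tag: text-generation`, the merged checkpoint loads like any other Mistral-7B causal LM. Below is a minimal usage sketch; the repo id `sethuiyer/CodeCalc-Mistral-7B` is an assumption, so substitute the actual hub id or a local path. (The merge itself can be reproduced by saving the YAML above to a file and passing it to mergekit's `mergekit-yaml` entry point.)

```python
# Minimal inference sketch. "sethuiyer/CodeCalc-Mistral-7B" is an assumed
# repo id; replace it with the real hub id or a local directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sethuiyer/CodeCalc-Mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Write a Python function that computes compound interest."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```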