---
base_model:
  - monology/mixtral-expert7
  - monology/mixtral-expert5
  - monology/mixtral-expert6
  - monology/mixtral-expert0
  - monology/mixtral-expert4
  - monology/mixtral-expert1
  - monology/mixtral-expert3
  - monology/mixtral-expert2
library_name: transformers
tags:
  - mergekit
  - merge
---

# mixtral-soup

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
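Since the merge is a standard `transformers` checkpoint, it can be loaded in the usual way. A minimal sketch, assuming this repository's id is `monology/mixtral-soup`:

```python
# Minimal loading sketch; the repo id "monology/mixtral-soup" is assumed
# from this repository's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "monology/mixtral-soup"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
```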

## Merge Details

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method, which takes a weighted average of the source models' parameters.
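Conceptually, a linear merge is an element-wise weighted average of matching parameter tensors. The sketch below illustrates the arithmetic on plain state dicts (hypothetical `state_dicts` and `weights` inputs); it is not mergekit's implementation, which additionally handles sharded weights, dtypes, and tokenizers:

```python
# Illustrative only: the computation a linear merge performs.
import torch

def linear_merge(state_dicts: list[dict], weights: list[float]) -> dict:
    """Weighted element-wise average of matching tensors."""
    total = sum(weights)
    return {
        name: sum(w * sd[name] for w, sd in zip(weights, state_dicts)) / total
        for name in state_dicts[0]
    }
```

With every `weight` set to 1.0, as in the configuration below, this reduces to a uniform average of the eight expert models.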

### Models Merged

The following models were included in the merge:

* [monology/mixtral-expert0](https://huggingface.co/monology/mixtral-expert0)
* [monology/mixtral-expert1](https://huggingface.co/monology/mixtral-expert1)
* [monology/mixtral-expert2](https://huggingface.co/monology/mixtral-expert2)
* [monology/mixtral-expert3](https://huggingface.co/monology/mixtral-expert3)
* [monology/mixtral-expert4](https://huggingface.co/monology/mixtral-expert4)
* [monology/mixtral-expert5](https://huggingface.co/monology/mixtral-expert5)
* [monology/mixtral-expert6](https://huggingface.co/monology/mixtral-expert6)
* [monology/mixtral-expert7](https://huggingface.co/monology/mixtral-expert7)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: monology/mixtral-expert0
  - model: monology/mixtral-expert1
  - model: monology/mixtral-expert2
  - model: monology/mixtral-expert3
  - model: monology/mixtral-expert4
  - model: monology/mixtral-expert5
  - model: monology/mixtral-expert6
  - model: monology/mixtral-expert7
parameters:
    weight: 1.0
merge_method: linear
dtype: float16
```
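To reproduce the merge, the configuration above can be fed back to mergekit. A sketch using mergekit's Python entry points; `config.yml` and `./mixtral-soup` are placeholder paths:

```python
# Sketch of re-running the merge with mergekit's Python API; config.yml
# holds the YAML configuration above.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    "./mixtral-soup",
    options=MergeOptions(copy_tokenizer=True),
)
```

Equivalently, the `mergekit-yaml` command-line tool accepts the same configuration file.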