Tags: Text Generation · Transformers · Safetensors · mistral · Merge · mergekit · lazymergekit · liminerity/M7-7b · Kukedlc/NeuralSirKrishna-7b · Kukedlc/MyModelsMerge-7b · AurelPx/Percival_01-7b-slerp · MatthieuJ/Jason1903_SLERP · MTSAIR/multi_verse_model · Gille/StrangeMerges_30-7B-slerp · chihoonlee10/T3Q-Mistral-Orca-Math-DPO · yam-peleg/Experiment28-7B · mlabonne/UltraMerge-7B · text-generation-inference · Inference Endpoints
File size: 1,120 bytes · commit 6cef70c
models:
  - model: liminerity/M7-7b
    # no parameters necessary for base model
  - model: liminerity/M7-7b
    parameters:
      weight: 0.2
      density: 0.88
  - model: Kukedlc/NeuralSirKrishna-7b
    parameters:
      weight: 0.1
      density: 0.66
  - model: Kukedlc/MyModelsMerge-7b
    parameters:
      weight: 0.1
      density: 0.66
  - model: AurelPx/Percival_01-7b-slerp
    parameters:
      weight: 0.1
      density: 0.33
  - model: MatthieuJ/Jason1903_SLERP
    parameters:
      weight: 0.1
      density: 0.33
  - model: MTSAIR/multi_verse_model
    parameters:
      weight: 0.1
      density: 0.66
  - model: Gille/StrangeMerges_30-7B-slerp
    parameters:
      weight: 0.1
      density: 0.55
  - model: chihoonlee10/T3Q-Mistral-Orca-Math-DPO
    parameters:
      weight: 0.1
      density: 0.22
  - model: yam-peleg/Experiment28-7B
    parameters:
      weight: 0.1
      density: 0.44
  - model: mlabonne/UltraMerge-7B
    parameters:
      weight: 0.1
      density: 0.77
merge_method: dare_ties
base_model: liminerity/M7-7b
parameters:
  int8_mask: true
  normalize: true
dtype: bfloat16
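
The config above merges ten weighted Mistral-7B checkpoints onto liminerity/M7-7b with the dare_ties method: each entry's weight sets its contribution to the combined model, while density is the fraction of its delta parameters kept by DARE's random pruning before the TIES sign-election step. A config like this is normally applied with mergekit's mergekit-yaml CLI (e.g. mergekit-yaml config.yaml ./merged-model), after which the output loads like any other Transformers causal LM. Below is a minimal usage sketch; the repo id is hypothetical and stands in for wherever the merged weights are actually pushed.

import torch
from transformers import pipeline

# Hypothetical repo id -- replace with the real location of the merge output.
model_id = "your-username/M7-dare-ties-merge"

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # matches "dtype: bfloat16" in the config above
    device_map="auto",           # requires the accelerate package
)

out = pipe("What is a model merge?", max_new_tokens=128, do_sample=True, temperature=0.7)
print(out[0]["generated_text"])

Two details of the config worth noting: the listed weights sum to 1.1 (0.2 plus nine entries at 0.1), so normalize: true rescales them into a proper weighting, and int8_mask: true, as I understand mergekit's options, stores the merge masks in int8 to reduce memory use during the merge.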