---
base_model: []
tags:
- mergekit
- Etheria
license: apache-2.0
---
# Steelskull/Etheria-55b-v0.1

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/RAhrbktyyVQxOR1np-9L2.png)

## Merge Details

An attempt at a functional Goliath-style merge to create an [Etheria] 55b-200k from two Yi-34b-200k models.

Due to the merge it should 'theoretically' have a 200k context window, but I recommend starting at 32k and working upward,
as it is unknown (at this time) what the merge has done to the usable context length.

This is a merge of both VerA and VerB of Etheria-55b (their numbers were surprisingly good). I then created a sacrificial 55B out of the most performant Yi-34b-200k model,
performed a DARE-TIES merge, and equalized the model into its current state.

### Recommended Settings and Prompt Format

I've tested it up to 32k of context with exl2 using these settings:

```json
{
    "temp": 0.7,
    "temperature_last": true,
    "top_p": 1,
    "top_k": 0,
    "top_a": 0,
    "tfs": 1,
    "epsilon_cutoff": 0,
    "eta_cutoff": 0,
    "typical_p": 1,
    "min_p": 0.1,
    "rep_pen": 1.1,
    "rep_pen_range": 8192,
    "no_repeat_ngram_size": 0,
    "penalty_alpha": 0,
    "num_beams": 1,
    "length_penalty": 1,
    "min_length": 0,
    "encoder_rep_pen": 1,
    "freq_pen": 0,
    "presence_pen": 0,
    "do_sample": true,
    "early_stopping": false,
    "add_bos_token": false,
    "truncation_length": 2048,
    "ban_eos_token": true,
    "skip_special_tokens": true,
    "streaming": true,
    "mirostat_mode": 0,
    "mirostat_tau": 5,
    "mirostat_eta": 0.1
}
```
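
Many of these samplers have direct equivalents in Hugging Face `transformers`. Below is a sketch of how the overlapping subset could be expressed as a `GenerationConfig` (an assumption on my part, not the author's setup; `min_p` needs a recent `transformers` release, and exl2/SillyTavern-only knobs such as `tfs`, `top_a`, `rep_pen_range`, and the mirostat settings have no direct counterpart and are omitted):

```python
# Sketch: mapping the preset above onto a transformers GenerationConfig.
# Only parameters with direct transformers equivalents are carried over.
from transformers import GenerationConfig

generation_config = GenerationConfig(
    do_sample=True,
    temperature=0.7,
    top_p=1.0,
    top_k=0,                  # 0 disables top-k filtering in transformers
    typical_p=1.0,
    min_p=0.1,                # requires a recent transformers release
    repetition_penalty=1.1,   # "rep_pen" above
    no_repeat_ngram_size=0,
    num_beams=1,
    length_penalty=1.0,
    min_length=0,
)
```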

Prompt formats that work well:
```
ChatML & Alpaca
```
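
For illustration, a ChatML-style prompt for this model can be assembled as follows (a minimal sketch; the helper name and example messages are placeholders, not part of the model card):

```python
# Hypothetical helper that builds a ChatML prompt string.
def build_chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    system="You are a helpful assistant.",
    user="Summarize the DARE-TIES merge method in one sentence.",
)
```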

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with Merged-Etheria-55b as the base.

### Configuration

The following YAML configuration was used to produce this model:

```yaml

base_model: Merged-Etheria-55b
models:
  - model: Sacr-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
  - model: Merged-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
merge_method: dare_ties
tokenizer_source: union
parameters:
  int8_mask: true
dtype: bfloat16

```
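
For completeness, a minimal loading sketch with `transformers` (assuming the merged weights are published under the repo name in this card's title; the prompt and generation parameters below are illustrative, and a 55B bfloat16 model needs multiple GPUs or CPU offloading):

```python
# Usage sketch: the model ID comes from this card's title; everything else is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Steelskull/Etheria-55b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config above
    device_map="auto",           # spread layers across available GPUs / offload to CPU
)

prompt = (
    "<|im_start|>user\n"
    "Give a one-sentence summary of the DARE-TIES merge method.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    repetition_penalty=1.1,
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```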