---
license: cc-by-nc-4.0
base_model:
  - vicgalle/Roleplay-Llama-3-8B
  - Undi95/Llama-3-Unholy-8B-e4
  - Undi95/Llama-3-LewdPlay-8B
library_name: transformers
tags:
  - mergekit
  - merge
---

# LewdPlay-8B

May 1st, 2024: The GGUFs have been fixed with this PR of llama.cpp.

This is a merge of pre-trained language models created using mergekit.

The new EVOLVE merge method was used (on MMLU specifically); see below for more information!

Unholy was used for uncensoring, Roleplay Llama 3 for the DPO training it received on top, and LewdPlay for the... lewd side.

## Prompt template: Llama3

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{output}<|eot_id|>
```
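
If the tokenizer ships the Llama 3 chat template, the same prompt can be built programmatically with transformers. A minimal sketch; the model path is a placeholder:

```python
from transformers import AutoTokenizer

# Placeholder path; point this at the merged model's repo or local directory.
tokenizer = AutoTokenizer.from_pretrained("path/to/LewdPlay-8B")

messages = [
    {"role": "system", "content": "You are a roleplay assistant."},
    {"role": "user", "content": "Introduce yourself in character."},
]

# add_generation_prompt=True appends the assistant header, so the model's
# reply fills the {output} slot of the template above.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```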

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with ./mergekit/input_models/Roleplay-Llama-3-8B_213413727 as the base.
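
DARE TIES combines two ideas: DARE randomly drops a fraction of each task vector (the delta between a fine-tune and the base) and rescales the survivors to preserve its expected value, while TIES resolves sign conflicts between models before summing. Here is a toy NumPy sketch of the idea, not mergekit's implementation; the density and weight values are illustrative stand-ins for the per-slice parameters in the config below:

```python
import numpy as np

rng = np.random.default_rng(0)

def dare(delta, density):
    """DARE: randomly Drop parameters And REscale the survivors by
    1/density, keeping the task vector's expected value unchanged."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties(base, models, densities, weights):
    # Task vectors: what each fine-tune changed relative to the base.
    deltas = [w * dare(m - base, d)
              for m, w, d in zip(models, weights, densities)]
    # TIES sign election: keep only contributions whose sign agrees
    # with the dominant (summed) sign at each parameter.
    elected = np.sign(sum(deltas))
    agreed = [np.where(np.sign(d) == elected, d, 0.0) for d in deltas]
    return base + sum(agreed)

base = rng.normal(size=8)
models = [base + rng.normal(scale=0.1, size=8) for _ in range(3)]
merged = dare_ties(base, models,
                   densities=[1.0, 0.66, 1.0],  # illustrative values
                   weights=[0.69, 0.58, 0.51])  # illustrative values
print(merged)
```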

### Models Merged

The following models were included in the merge:

- ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
- ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 4]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 1.0
      weight: 0.6861808716092435
  - layer_range: [0, 4]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.6628290134113985
      weight: 0.5815923052193855
  - layer_range: [0, 4]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 1.0
      weight: 0.5113886163963061
- sources:
  - layer_range: [4, 8]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 0.892655547455918
      weight: 0.038732602391021484
  - layer_range: [4, 8]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 1.0
      weight: 0.1982145486303527
  - layer_range: [4, 8]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 1.0
      weight: 0.6843011350690802
- sources:
  - layer_range: [8, 12]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 0.7817511027396784
      weight: 0.13053333213489704
  - layer_range: [8, 12]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.6963703515864826
      weight: 0.20525481492667985
  - layer_range: [8, 12]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 0.6983086326765777
      weight: 0.5843953969574106
- sources:
  - layer_range: [12, 16]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 0.9632895768462915
      weight: 0.2101146706607748
  - layer_range: [12, 16]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.597557434542081
      weight: 0.6728172621848589
  - layer_range: [12, 16]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 0.756263557607837
      weight: 0.2581423726361908
- sources:
  - layer_range: [16, 20]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 1.0
      weight: 0.2116035543552448
  - layer_range: [16, 20]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 1.0
      weight: 0.22654226422958418
  - layer_range: [16, 20]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 0.8925914810507647
      weight: 0.42243766315440867
- sources:
  - layer_range: [20, 24]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 0.7697608089825734
      weight: 0.1535118632140203
  - layer_range: [20, 24]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.9886758076773643
      weight: 0.3305040603868546
  - layer_range: [20, 24]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 1.0
      weight: 0.40670083428654535
- sources:
  - layer_range: [24, 28]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 1.0
      weight: 0.4542810478500622
  - layer_range: [24, 28]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.8330662483310117
      weight: 0.2587495367324508
  - layer_range: [24, 28]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 0.9845313983551542
      weight: 0.40378452705975915
- sources:
  - layer_range: [28, 32]
    model: ./mergekit/input_models/Llama-3-LewdPlay-8B-e3_2981937066
    parameters:
      density: 1.0
      weight: 0.2951962192288415
  - layer_range: [28, 32]
    model: ./mergekit/input_models/Llama-3-Unholy-8B-e4_1440388923
    parameters:
      density: 0.960315594933433
      weight: 0.13142971773782525
  - layer_range: [28, 32]
    model: ./mergekit/input_models/Roleplay-Llama-3-8B_213413727
    parameters:
      density: 1.0
      weight: 0.30838472094518804
```
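
To reproduce a merge from a config like this, mergekit can be driven from Python as well as through its CLI. A sketch assuming mergekit is installed and the referenced model directories exist locally; the file and output paths are placeholders:

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration above from disk (placeholder path).
with open("lewdplay-merge.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./LewdPlay-8B-merged",  # placeholder output directory
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```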

## Support

If you want to support me, you can here.