
# Finch 7B Merge

A SLERP merge of my two current favorite 7B models:

- macadeliccc/WestLake-7B-v2-laser-truthy-dpo
- SanjiWatsuki/Kunoichi-DPO-v2-7B

This repository is a 6bpw EXL2 quant of Finch. I'm open to doing other quants; just ask.
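
For reference, here is a minimal sketch of loading the quant with the exllamav2 library. The local path is a placeholder, and the calls assume a recent exllamav2 release:

```python
# Minimal sketch: load the 6bpw EXL2 quant with exllamav2.
# Assumes exllamav2 is installed and the quant is downloaded locally;
# the model_dir path below is a placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./finch-6bpw-exl2"  # placeholder local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
```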

## Settings

I recommend using the ChatML prompt format (example below). As for samplers, I recommend the following:

- Temperature: 1.2
- Min P: 0.2
- Smoothing Factor: 0.2
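
For reference, a ChatML-formatted prompt looks like this (the `{placeholders}` are yours to fill in):

```
<|im_start|>system
{system prompt}<|im_end|>
<|im_start|>user
{user message}<|im_end|>
<|im_start|>assistant
```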

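Continuing the exllamav2 sketch above, here is a rough sketch of how these settings map onto its sampler. `smoothing_factor` is exllamav2's quadratic-sampling knob; treat the attribute names as assumptions against your installed version:

```python
# Sketch: apply the recommended sampler settings with exllamav2.
# Attribute names assume a recent exllamav2 release and may differ
# in older versions.
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 1.2       # Temperature
settings.min_p = 0.2             # Min P
settings.smoothing_factor = 0.2  # Smoothing Factor (quadratic sampling)

# Build a ChatML prompt and generate with the generator from the
# loading sketch above.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nHello!<|im_end|>\n"
    "<|im_start|>assistant\n"
)
output = generator.generate_simple(prompt, settings, num_tokens=200)
print(output)
```
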
## Mergekit Config

```yaml
base_model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
dtype: float16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.0, 0.5, 0.3, 0.7, 1.0]
  - filter: mlp
    value: [1.0, 0.5, 0.7, 0.3, 0.0]
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 32]
    model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
  - layer_range: [0, 32]
    model: SanjiWatsuki/Kunoichi-DPO-v2-7B
```
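
To reproduce the merge (assuming mergekit is installed), save the config above to a file and run mergekit's CLI on it, e.g. `mergekit-yaml config.yml ./finch`; the filename and output directory here are placeholders.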