FMixIA-FrankenMerge-9.5B-PT-9

A frankenmerge built with mergekit using the passthrough merge method, which concatenates layer slices from two source models rather than averaging their weights.

Model Details

Configuration

slices:
  - sources:
    - model: ZeroXClem/Qwen2.5-7B-HomerCreative-Mix
      layer_range: [0, 28]
  - sources:
    - model: ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix
      layer_range: [0, 28]
merge_method: passthrough
dtype: bfloat16
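
To reproduce the merge, the configuration above can be saved to a YAML file and passed to mergekit's command-line entry point. This is a minimal sketch; the config filename and output directory are illustrative, and the --cuda flag is optional:

# Run the merge defined above with mergekit
mergekit-yaml config.yml ./FMixIA-FrankenMerge-9.5B-PT-9 --cuda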

Usage

This model can be loaded with the Hugging Face transformers library:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged checkpoint and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9")
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9")
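
Below is a minimal generation sketch, assuming the merged model inherits the Qwen2.5 chat template from its source models; the prompt text and generation settings are illustrative:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9"

# bfloat16 matches the dtype declared in the merge config; device_map="auto" requires accelerate
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat-formatted prompt (assumes the tokenizer ships a chat template)
messages = [{"role": "user", "content": "Write a short story about a lighthouse keeper."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

# Generate and decode only the newly produced tokens
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))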