InstructEnVi_llama2-bkai-120GB_250kx2e_Frankenx3

This is a self-merge (frankenmerge) of a pre-trained language model, created using mergekit.

Merge Details

Merge Method

This model was merged using the passthrough merge method, which copies the selected layers verbatim (no weight interpolation) and stacks them into a deeper model.
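As a rough illustration, the passthrough stacking can be sketched with plain transformers code. This is an illustrative sketch only, not mergekit's implementation (mergekit operates on checkpoint tensors rather than on a live model); the slice ranges match the configuration below.

# Illustrative sketch of passthrough layer stacking, not mergekit's code.
import copy

import torch
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e",
    torch_dtype=torch.float16,
)

# mergekit layer_range values are end-exclusive: [0, 14] means layers 0-13.
slices = [(0, 14), (10, 20), (16, 32)]

# Passthrough copies each selected decoder layer verbatim; overlapping
# ranges duplicate layers, so each copy must be an independent object.
stacked = torch.nn.ModuleList(
    copy.deepcopy(base.model.layers[i])
    for start, end in slices
    for i in range(start, end)
)
base.model.layers = stacked
base.config.num_hidden_layers = len(stacked)  # 14 + 10 + 16 = 40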

Models Merged

The following model was included in the merge:

1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e

Configuration

The following YAML configuration was used to produce this model:

slices:
  - sources:
    - model: 1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e
      layer_range: [0, 14]
  - sources:
    - model: 1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e
      layer_range: [10, 20]
  - sources:
    - model: 1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e
      layer_range: [16, 32]
merge_method: passthrough
dtype: float16
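
The three slices take layers [0, 14), [10, 20), and [16, 32) from the same source model, so layers 10-13 and 16-19 each appear twice. The result has 40 decoder layers (14 + 10 + 16) instead of the base model's 32, which accounts for the merged model's roughly 8.47B parameters, stored as FP16 safetensors. The merge can be reproduced by passing this configuration to mergekit's CLI, e.g. mergekit-yaml config.yml ./output-model-directory.

A minimal usage sketch follows, assuming the repository id under which this merge is published (1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e_Frankenx3) and a standard transformers setup:

# Hedged usage sketch: load the merged model from the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "1TuanPham/InstructEnVi_llama2-bkai-120GB_250kx2e_Frankenx3"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",          # requires the accelerate package
)

prompt = "Xin chào! Hãy giới thiệu về bản thân bạn."  # "Hello! Please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))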