---
pipeline_tag: text-generation
base_model:
  - microsoft/Orca-2-13b
library_name: transformers
tags:
  - mergekit
  - merge
  - orca
  - orca2
  - microsoft
license_name: microsoft-research-license
license_link: LICENSE
license: other
---

Inspired by AbacusAI's BigYi-15b...

# BigOrca-2-XB

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer ranges from the source model without altering their weights, yielding a deeper model than the original Orca-2-13b.

### Models Merged

The following models were included in the merge:

* /Users/jsarnecki/opt/microsoft-Orca-2-13b (a local copy of [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b))

## Quantizations

GGUFs (thanks to [mradermacher](https://huggingface.co/mradermacher))
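
For full-precision inference with transformers (as opposed to the GGUF quants), a minimal sketch is below; the repo id `Joseph717171/BigOrca-2-XB` is an assumption, so substitute the actual repo id or a local path to the merged weights.

```python
# Minimal text-generation sketch; the repo id below is assumed, not confirmed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Joseph717171/BigOrca-2-XB"  # hypothetical repo id / local path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # the merge was produced in float16
    device_map="auto",    # requires the accelerate package
)

prompt = "Explain the difference between fission and fusion in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```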

## Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 10]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [5, 15]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [10, 20]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [15, 25]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [20, 30]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [25, 35]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
- sources:
  - layer_range: [30, 40]
    model: /Users/jsarnecki/opt/microsoft-Orca-2-13b
```
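
To reproduce the merge, save the configuration above (e.g. as `config.yaml`) and run it through mergekit, either with the `mergekit-yaml` CLI or via the Python API. A minimal sketch of the latter, with illustrative paths and options:

```python
# Illustrative sketch of running the merge from the YAML config above.
# Paths and options are placeholders; adjust for your environment.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./BigOrca-2-XB",   # directory to write the merged model into
    options=MergeOptions(
        copy_tokenizer=True,     # copy the tokenizer from the source model
        lazy_unpickle=True,      # lower peak memory while reading shards
    ),
)
```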