Quantization made by Richard Erkhov.
Phi-5B-Test - bnb 4bits
- Model creator: https://huggingface.co/Replete-AI/
- Original model: https://huggingface.co/Replete-AI/Phi-5B-Test/
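These weights are the original model quantized to 4-bit with bitsandbytes. A minimal sketch of loading them with transformers follows; the repository id is an assumption, and bitsandbytes plus accelerate must be installed:

# Minimal sketch: load the 4-bit bitsandbytes quantization with transformers.
# The repo id below is a placeholder; point it at the actual quantized repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo_id = "RichardErkhov/Replete-AI_-_Phi-5B-Test-4bits"  # hypothetical repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    quantization_config=bnb_config,
    device_map="auto",
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))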
Original model description:
---
base_model: []
tags:
- mergekit
- merge
license: mit
---
Untitled Model (1)
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the passthrough merge method.
Models Merged
The following models were included in the merge:
- liminerity/Phigments12
- l3utterfly/phi-2-layla-v1-chatml

The merged model was then depth upscaled.
Configuration
The following YAML configurations were used to produce this model. First, the dare_ties merge:

models:
  - model: liminerity/Phigments12
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: l3utterfly/phi-2-layla-v1-chatml
    parameters:
      density: 0.8
      weight: [0, 0.5, 0.7, 1] # weight gradient
merge_method: dare_ties
base_model: liminerity/Phigments12
parameters:
  normalize: true
  int8_mask: true
dtype: float16

Then the passthrough depth upscaling:

dtype: float16
merge_method: passthrough
slices:
  - sources:
      - model: phi/
        layer_range: [0, 32]
  - sources:
      - model: phi/
        layer_range: [0, 32]
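For reference, a minimal sketch of how a mergekit configuration like the ones above is typically run, assuming mergekit is installed (pip install mergekit); the file and output paths are placeholders, and each configuration would be saved to its own file and run in turn:

# Minimal sketch: invoke the mergekit CLI on a saved copy of a config above.
# "mergekit-yaml" is the command-line entry point installed with mergekit.
import subprocess

config_path = "merge-config.yml"  # one of the YAML configurations, saved to disk
output_dir = "./merged-model"     # directory where the merged weights are written

subprocess.run(["mergekit-yaml", config_path, output_dir], check=True)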
Join the Replete AI Discord here!