
# gemma-7b-alpaca-it-ties

gemma-7b-alpaca-it-ties is a TIES merge of the following models, created with mergekit:

* google/gemma-7b-it
* mlabonne/Gemmalpaca-7B

## 🧩 Configuration

```yaml
models:
  - model: google/gemma-7b-it
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/Gemmalpaca-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: google/gemma-7b
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
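
## 💻 Usage

To reproduce the merge locally, the configuration above can be saved to a YAML file and passed to mergekit's `mergekit-yaml` command (e.g. `mergekit-yaml config.yml ./merged`). The merged checkpoint itself loads like any other Gemma model. Below is a minimal sketch with 🤗 Transformers; it assumes the merge is published as `arcee-ai/gemma-7b-alpaca-it-ties` and that you have access to the gated Gemma weights.

```python
# Minimal usage sketch: load the merged model and generate a short completion.
# Assumes the repo id below is correct and that Gemma access has been granted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/gemma-7b-alpaca-it-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's float16 dtype
    device_map="auto",
)

prompt = "Explain the TIES merging method in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```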