
Model Card for Na0s/Llama-3.1-8B-Pruned-4-Layers_LoRA-PEFT

Model Details

Model Description

  • Finetuned from model: [Na0s/Llama-3.1-8b-Pruned-4-Layers]

Training Details

    LoRA (BF16)
    batch_size = 2
    gradient_accumulation_steps = 4
    warmup_steps = 5
    max_steps = 10000
    learning_rate = 2e-4
    fp16 = not is_bfloat16_supported()
    bf16 = is_bfloat16_supported()
    logging_steps = 1
    optim = "adamw_8bit"
    weight_decay = 0.01
    lr_scheduler_type = "linear"
    seed = 3407
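For reference, the hyperparameters above can be gathered into a plain dictionary, e.g. to pass to a trainer. This is only a sketch: the key names follow the common `transformers` TrainingArguments convention (an assumption, since the card lists bare names like `batch_size`), and bf16 support is hard-coded as available here, whereas the original run queried it via Unsloth's `is_bfloat16_supported()`.

```python
# Hyperparameters from the card, collected into one dict (sketch).
bf16_supported = True  # assumption; originally is_bfloat16_supported()

training_config = {
    "per_device_train_batch_size": 2,
    "gradient_accumulation_steps": 4,
    "warmup_steps": 5,
    "max_steps": 10000,
    "learning_rate": 2e-4,
    "fp16": not bf16_supported,
    "bf16": bf16_supported,
    "logging_steps": 1,
    "optim": "adamw_8bit",
    "weight_decay": 0.01,
    "lr_scheduler_type": "linear",
    "seed": 3407,
}

# Effective batch size seen by the optimizer per update:
effective_batch = (training_config["per_device_train_batch_size"]
                   * training_config["gradient_accumulation_steps"])
print(effective_batch)  # → 8
```

With batch size 2 and 4 accumulation steps, each optimizer update sees an effective batch of 8 examples.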

Training Data

[Open-Orca/SlimOrca]
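SlimOrca stores each example as a ShareGPT-style `conversations` list of `{"from": ..., "value": ...}` turns. A minimal sketch of flattening one such record into a single training prompt (the sample record below is invented for illustration, and the role labels are an arbitrary choice, not the template used for this model):

```python
def format_conversation(record):
    """Flatten a ShareGPT-style record into a single prompt string."""
    role_map = {"system": "SYSTEM", "human": "USER", "gpt": "ASSISTANT"}
    lines = []
    for turn in record["conversations"]:
        role = role_map.get(turn["from"], turn["from"].upper())
        lines.append(f"{role}: {turn['value']}")
    return "\n".join(lines)

# Invented sample mirroring SlimOrca's schema:
sample = {
    "conversations": [
        {"from": "system", "value": "You are a helpful assistant."},
        {"from": "human", "value": "What is 2 + 2?"},
        {"from": "gpt", "value": "2 + 2 = 4."},
    ]
}

print(format_conversation(sample))
```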

Evaluation

MMLU Pro 0-shot: 0.2937

Evaluation Data

[TIGER-AI-Lab/MMLU-Pro]
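MMLU-Pro is multiple-choice (up to ten options per question, A–J), so the 0-shot score above is plain accuracy over predicted option letters. A minimal sketch of that scoring, with invented predictions:

```python
def accuracy(predictions, answers):
    """Fraction of questions where the predicted letter matches the gold one."""
    correct = sum(p == a for p, a in zip(predictions, answers))
    return correct / len(answers)

# Invented example: 10 questions, gold answers vs. model picks.
gold  = ["A", "C", "B", "J", "D", "A", "E", "C", "B", "F"]
preds = ["A", "C", "D", "J", "B", "A", "E", "H", "B", "G"]
print(accuracy(preds, gold))  # → 0.6
```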

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

Model size: 6.94B params (Safetensors, BF16)
