---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
library_name: transformers
pipeline_tag: text-generation
license: llama3.2
base_model: NousResearch/Llama-3.2-1B
tags:
- generated_from_trainer
- facebook
- meta
- pytorch
- llama
- llama-3
model-index:
- name: llama3.2-1b-synthia-i
results: []
---
# Llama 3.2 1B - Synthia-v1.5-I - Redmond - Fine-tuned Model
This model is a fine-tuned version of [NousResearch/Llama-3.2-1B](https://huggingface.co/NousResearch/Llama-3.2-1B) on the [Synthia-v1.5-I](https://huggingface.co/datasets/migtissera/Synthia-v1.5-I) dataset.
Thanks to [RedmondAI](https://redmond.ai) for all the GPU support!
## Model Description
The base model is Llama 3.2 1B, a multilingual large language model developed by Meta. This version has been fine-tuned on the Synthia-v1.5-I instruction dataset to improve its instruction-following capabilities.
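For reference, a minimal generation sketch using the Transformers `pipeline` API is shown below. The repository id is a placeholder (this card does not state the final upload path); substitute the actual model id for this card.

```python
# Minimal sketch: load the fine-tuned model for text generation.
# NOTE: REPO_ID is a placeholder; substitute the actual Hugging Face
# repository id for this model card.
import torch
from transformers import pipeline

REPO_ID = "llama3.2-1b-synthia-i"  # placeholder repo id

generator = pipeline(
    "text-generation",
    model=REPO_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

output = generator(
    "Explain gradient accumulation in one paragraph.",
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(output[0]["generated_text"])
```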
### Training Data
The model was fine-tuned on Synthia-v1.5-I, a dataset of 20.7k training examples.
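The dataset can be loaded for inspection with the `datasets` library. The column layout is not documented here, so the sketch below only prints what is present:

```python
# Sketch: load the Synthia-v1.5-I dataset and inspect its size.
from datasets import load_dataset

ds = load_dataset("migtissera/Synthia-v1.5-I", split="train")
print(ds)     # features and number of rows (~20.7k)
print(ds[0])  # first training example; column names vary by dataset
```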
### Training Procedure
The model was trained with the following hyperparameters (an equivalent `TrainingArguments` sketch follows the list):
- Learning rate: 2e-05
- Train batch size: 1
- Eval batch size: 1
- Seed: 42
- Gradient accumulation steps: 8
- Total train batch size: 8
- Optimizer: Paged AdamW 8bit (betas=(0.9,0.999), epsilon=1e-08)
- LR scheduler: Cosine with 100 warmup steps
- Number of epochs: 3
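The run itself was driven by Axolotl (see Training Infrastructure below). For readers who prefer plain Transformers, the hyperparameters above correspond roughly to the following hypothetical `TrainingArguments` sketch; it is an approximation for illustration, not the actual Axolotl configuration used:

```python
# Hedged sketch: the hyperparameters above expressed as Transformers
# TrainingArguments. This approximates, but is not, the Axolotl config.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="llama3.2-1b-synthia-i",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=8,  # effective batch size 1 x 8 = 8
    optim="paged_adamw_8bit",       # Paged AdamW 8-bit
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=3,
)
```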
### Framework Versions
- Transformers 4.46.1
- PyTorch 2.3.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.3
## Intended Use
This model is intended for the following tasks (a prompting sketch follows the list):
- Instruction following tasks
- Conversational AI applications
- Research and development in natural language processing
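For instruction-following use, a hedged prompting sketch is shown below. The plain SYSTEM/USER/ASSISTANT layout is an assumption carried over from earlier Synthia releases, not a documented template for this checkpoint; verify the actual prompt format before relying on it.

```python
# Hedged sketch: instruction-style prompting. The SYSTEM/USER/ASSISTANT
# layout is an assumption based on earlier Synthia releases; verify the
# actual prompt format for this checkpoint.
import torch
from transformers import pipeline

REPO_ID = "llama3.2-1b-synthia-i"  # placeholder repo id, as above

generator = pipeline(
    "text-generation",
    model=REPO_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = (
    "SYSTEM: You are a helpful assistant.\n"
    "USER: Summarize the benefits of instruction tuning in two sentences.\n"
    "ASSISTANT:"
)
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```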
## Training Infrastructure
The model was trained using the Axolotl framework version 0.5.0.
## License
This model is subject to the Llama 3.2 Community License Agreement. Users must comply with all terms and conditions specified in the license.
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)