README.md CHANGED
@@ -19,4 +19,56 @@ The following `bitsandbytes` quantization config was used during training:
 ### Framework versions
 
 
- - PEFT 0.5.0
+ - PEFT 0.5.0
+
+ ---
+ library_name: peft
+ tags:
+ - code
+ - instruct
+ - gpt2
+ datasets:
+ - HuggingFaceH4/no_robots
+ base_model: gpt2
+ license: apache-2.0
+ ---
+
+ ### Finetuning Overview:
+
+ **Model Used:** gpt2
+
+ **Dataset:** HuggingFaceH4/no_robots
+
+ #### Dataset Insights:
+
+ [No Robots](https://huggingface.co/datasets/HuggingFaceH4/no_robots) is a high-quality dataset of 10,000 instructions and demonstrations created by skilled human annotators. This data can be used for supervised fine-tuning (SFT) to make language models follow instructions better.
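For reference, the dataset can be pulled directly with the Hugging Face `datasets` library. A minimal sketch (split names and field layout are not specified in the card and should be checked against the dataset page):

```python
from datasets import load_dataset

# Load the instruction dataset used for this finetune.
dataset = load_dataset("HuggingFaceH4/no_robots")

# Inspect the available splits, row counts, and one example record.
print(dataset)
```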
+
+ #### Finetuning Details:
+
+ Using [MonsterAPI](https://monsterapi.ai)'s [LLM finetuner](https://docs.monsterapi.ai/fine-tune-a-large-language-model-llm), this finetuning:
+
+ - Was achieved cost-effectively.
+ - Took a total of 3 minutes 40 seconds for 1 epoch on an A6000 48GB GPU.
+ - Cost `$0.101` for the entire epoch.
+
+ #### Hyperparameters & Additional Details:
+
+ - **Epochs:** 1
+ - **Cost Per Epoch:** $0.101
+ - **Total Finetuning Cost:** $0.101
+ - **Model Path:** gpt2
+ - **Learning Rate:** 0.0002
+ - **Data Split:** 100% train
+ - **Gradient Accumulation Steps:** 4
+ - **LoRA r:** 32
+ - **LoRA alpha:** 64
+
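To make the listed settings concrete, here is a hedged sketch of an equivalent `peft` LoRA setup. Only `r`, `lora_alpha`, and the base model come from the card; `target_modules` and `lora_dropout` are illustrative assumptions:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# r and lora_alpha are taken from the card; target_modules and
# lora_dropout are assumptions (c_attn is GPT-2's fused QKV projection).
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    target_modules=["c_attn"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

base_model = AutoModelForCausalLM.from_pretrained("gpt2")
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the LoRA matrices are trainable
```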
+ #### Prompt Structure
+
+ ```
+ <|system|> <|endoftext|> <|user|> [USER PROMPT]<|endoftext|> <|assistant|> [ASSISTANT ANSWER] <|endoftext|>
+ ```
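Applied at inference time, the template above can be assembled as in this sketch; `build_prompt` is a hypothetical helper, and the spacing should be matched to the training format exactly:

```python
# Hypothetical helper that follows the prompt template above;
# generation continues from the trailing <|assistant|> token.
def build_prompt(user_prompt: str, system: str = "") -> str:
    return (
        f"<|system|> {system}<|endoftext|> "
        f"<|user|> {user_prompt}<|endoftext|> "
        f"<|assistant|>"
    )

print(build_prompt("Write a one-line summary of LoRA."))
```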
+ #### Training Loss:
+
+ ![training loss](https://cdn-uploads.huggingface.co/production/uploads/63ba46aa0a9866b28cb19a14/9bgb518kFwtDsFtrHzmTu.png)