---
license: other
language:
- en
tags:
- llama
- fine tune
- light novel
- eminence in shadow
- konosuba
---

This repo contains my fine-tuned LoRA of LLaMA, trained on the first 4 volumes of *The Eminence in Shadow* and *KonoSuba*, to test the model's ability to retain new information.
Training used alpaca-lora on a single RTX 3090 for 10 hours with:
- micro batch size: 2
- batch size: 64
- epochs: 35
- learning rate: 3e-4
- LoRA rank: 256
- LoRA alpha: 512
- LoRA dropout: 0.05
- cutoff length: 352
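
For reference, the hyperparameters above can be collected into the kind of argument set alpaca-lora's `finetune.py` accepts (parameter names are an assumption based on the tloen/alpaca-lora script; verify against the version you use). Note that gradient accumulation steps follow from batch size divided by micro batch size:

```python
# Hypothetical sketch: the run's hyperparameters as alpaca-lora-style arguments.
hparams = {
    "micro_batch_size": 2,     # per-device batch size
    "batch_size": 64,          # effective batch size
    "num_epochs": 35,
    "learning_rate": 3e-4,
    "lora_r": 256,             # LoRA rank
    "lora_alpha": 512,
    "lora_dropout": 0.05,
    "cutoff_len": 352,         # max token length per example
}

# alpaca-lora derives gradient accumulation from the two batch sizes:
gradient_accumulation_steps = hparams["batch_size"] // hparams["micro_batch_size"]
print(gradient_accumulation_steps)  # 32
```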