---
license: openrail
---
Experimental Tagalog LoRAs: safe or accurate outputs are not guaranteed (not for production use)!

# lt2_08162023
* Fine-tuned on a small dataset of 14 items, manually edited
* 1 epoch (barely any noticeable results)
* From LLaMA-2-7b-chat
* LoRA of chat-tagalog v0.1

# lt2_08162023a
* Fine-tuned on a small dataset of 14 items, manually edited
* 20 epochs (more observable effects)
* From LLaMA-2-7b-chat
* LoRA of [chat-tagalog v0.1a](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.1a)

# lt2_08162023b
* Fine-tuned on a small dataset of 14 items, manually edited
* 10 epochs
* From LLaMA-2-7b-chat
* LoRA of chat-tagalog v0.1b

# lt2_08162023c
* Fine-tuned on a small dataset of 14 items, manually edited
* 50 epochs (overfitted)
* From LLaMA-2-7b-chat
* LoRA of chat-tagalog v0.1c

# lt2_08162023d
* Fine-tuned on a small dataset of 14 items, manually edited
* 30 epochs (v0.1a further trained and cut off before overfitting)
* From LLaMA-2-7b-chat
* LoRA of [chat-tagalog v0.1d](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.1d)