huseinzol05 committed
Commit: 8fc7650
Parent(s): 5d1e1b6
Update README.md

README.md CHANGED
@@ -5,6 +5,8 @@ language:
 
 # Full Parameter Finetuning 1B 32768 context length Llama2 on Malaysian text
 
-
+1B derived from first 4 layers 7B model.
 
-
+README at https://github.com/mesolitica/malaya/tree/5.1/session/llama2#1b-32768-context-length-flash-attention-2
+
+WandB, https://wandb.ai/mesolitica/fpf-Llama-2-1b-32k-hf
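Note: the committed README states that the 1B model was derived from the first 4 layers of the 7B model, with the full recipe documented in the linked malaya README. As a rough illustration only, not mesolitica's actual script, one way to initialise such a model with Hugging Face `transformers` might look like the sketch below; the model names, the 32768 context setting, and the layer-copy approach are assumptions.

```python
# Hypothetical sketch: build a ~1B-parameter Llama model by keeping only the
# first 4 decoder layers of Llama-2 7B. Assumed to approximate the derivation
# described in the commit; the actual procedure is in the linked repository.
from transformers import AutoTokenizer, LlamaConfig, LlamaForCausalLM

source = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# Same hidden size and vocabulary, but only 4 transformer layers and a
# 32768-token context window (assumed from the model name).
config = LlamaConfig.from_pretrained("meta-llama/Llama-2-7b-hf")
config.num_hidden_layers = 4
config.max_position_embeddings = 32768

small = LlamaForCausalLM(config)

# Copy embeddings, the first 4 decoder layers, the final norm, and the LM head.
small.model.embed_tokens.load_state_dict(source.model.embed_tokens.state_dict())
for i in range(4):
    small.model.layers[i].load_state_dict(source.model.layers[i].state_dict())
small.model.norm.load_state_dict(source.model.norm.state_dict())
small.lm_head.load_state_dict(source.lm_head.state_dict())

small.save_pretrained("llama-2-1b-init")
tokenizer.save_pretrained("llama-2-1b-init")
```

With embeddings, LM head, and 4 decoder layers at hidden size 4096, this comes to roughly 1B parameters, which is consistent with the model name; the truncated model would then be fully finetuned on Malaysian text as described in the linked README.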