---
datasets:
- Yukang/LongAlpaca-16k-length
tags:
- axolotl
---

This is an extended-context (16K) version of Llama 3 8B (base, not instruct). It was trained for five hours on 8x A6000 GPUs, using the `Yukang/LongAlpaca-16k-length` dataset.

`rope_theta` was set to `1000000.0`. Trained with Axolotl.
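For illustration only (this is not part of the training code), a minimal sketch of why raising `rope_theta` helps with longer contexts: rotary position embeddings rotate each query/key channel at a frequency of `theta ** (-2i / head_dim)`, so a larger theta lowers every non-zero frequency, the positional phases wrap more slowly, and positions further apart remain distinguishable. The `head_dim` of 128 matches Llama 3 8B; the baseline of `500000.0` is Llama 3's stock `rope_theta`.

```python
def rope_inv_freq(theta: float, head_dim: int = 128) -> list[float]:
    # Inverse frequencies used by rotary position embeddings (RoPE):
    # inv_freq[i] = theta ** (-2i / head_dim), one per channel pair.
    return [theta ** (-2 * i / head_dim) for i in range(head_dim // 2)]

base = rope_inv_freq(500000.0)    # Llama 3's default rope_theta
extended = rope_inv_freq(1000000.0)  # value used for this 16K fine-tune

# A larger theta lowers every frequency after the first, so the
# rotation period of each channel grows and the embedding can
# separate positions over a longer range before phases repeat.
assert all(e <= b for b, e in zip(base, extended))
```

The fine-tuning step then teaches the model to actually use the slower-rotating positions it now sees at long range.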