---
datasets:
- EleutherAI/wikitext_document_level
tags:
- llama
---
LLaMA 33B finetuned on `wikitext_document_level` with linear RoPE scaling of 8, for a 16k token context length.
This is a merged version of [llama33b-16k-qlora](https://huggingface.co/chargoddard/llama33b-16k-qlora).
Note that this is *not* an instruct model - this is base LLaMA with an extended sequence length.
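
As a rough usage sketch (not part of the original card): loading the merged weights with `transformers` and applying the linear RoPE scaling factor of 8 described above, in case it is not already present in the model config. The `model_id` below is a placeholder, and the `rope_scaling` argument assumes a `transformers` version that supports it (4.31 or later).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id - substitute the actual merged model repository.
model_id = "chargoddard/llama33b-16k"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Linear RoPE scaling with factor 8 extends the base 2048-token context
# toward the 16k tokens described above; redundant if already set in config.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    rope_scaling={"type": "linear", "factor": 8.0},
    device_map="auto",
)
```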