---
datasets:
- EleutherAI/pile
language:
- en
tags:
- t5x
- encoder-decoder
---

This is the T5x version of Pile-T5 Large. You can use these checkpoints to continue pretraining, or to finetune, with the [T5x](https://github.com/google-research/t5x) library.
The scripts used to train Pile-T5 are available in the [improved-t5 repository](https://github.com/EleutherAI/improved-t5) on GitHub.

For the Hugging Face version of this model, see [EleutherAI/pile-t5-large](https://huggingface.co/EleutherAI/pile-t5-large).
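The Hugging Face checkpoint linked above is the convenient option for quick inference. A minimal loading sketch, assuming the `transformers` library is installed and the `Auto*` classes dispatch to the correct architecture from the repo's config (the `load_pile_t5` helper name is hypothetical):

```python
# Sketch: load the Hugging Face-format Pile-T5 Large checkpoint.
# Assumes `transformers` is installed and the checkpoint can be downloaded.

REPO_ID = "EleutherAI/pile-t5-large"  # HF repo for the converted weights

def load_pile_t5(repo_id: str = REPO_ID):
    """Download and return (tokenizer, model) for the HF-format checkpoint."""
    # Deferred import so this module can be inspected without transformers installed.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)
    return tokenizer, model
```

Typical usage would be `tokenizer, model = load_pile_t5()` followed by `model.generate(**tokenizer("...", return_tensors="pt"))`. For continued pretraining or large-scale finetuning, use the T5x checkpoints in this repo with the T5x library instead.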
### BibTeX

```
@misc{2024PileT5,
  author = {Lintang Sutawika and Aran Komatsuzaki and Colin Raffel},
  title  = {Pile-T5},
  year   = {2024},
  url    = {https://blog.eleuther.ai/pile-t5/},
  note   = {Blog post},
}
```