Add large model link
README.md
CHANGED
@@ -10,7 +10,8 @@ datasets:
 
 # Twitter 2022 154M (RoBERTa-base, 154M - full update)
 
-This is a RoBERTa-base model trained on 154M tweets until the end of December 2022 (from original checkpoint, no incremental updates).
+This is a RoBERTa-base model trained on 154M tweets until the end of December 2022 (from original checkpoint, no incremental updates).
+A large model trained on the same data is available [here](https://huggingface.co/cardiffnlp/twitter-roberta-large-2022-154m).
 
 These 154M tweets result from filtering 220M tweets obtained exclusively from the Twitter Academic API, covering every month between 2018-01 and 2022-12.
 Filtering and preprocessing details are available in the [TimeLMs paper](https://arxiv.org/abs/2202.03829).