publichealthsurveillance committed
Commit: 6eff5e5
Parent(s): 7bc6bf0
Update README.md

README.md CHANGED
@@ -3,7 +3,7 @@
 We present and release [PHS-BERT](https://arxiv.org/abs/2204.04521), a transformer-based pretrained language model (PLM), to identify tasks related to public health surveillance (PHS) on social media. Compared with existing PLMs that are mainly evaluated on limited tasks, PHS-BERT achieved state-of-the-art performance on 25 tested datasets, showing that our PLM is robust and generalizable in common PHS tasks.
 
 ## Usage
-Load the model via [Huggingface's Transformers library](https://github.com/huggingface/transformers])
+Load the model via [Hugging Face's Transformers library](https://github.com/huggingface/transformers):
 ```
 from transformers import AutoTokenizer, AutoModel
 tokenizer = AutoTokenizer.from_pretrained("publichealthsurveillance/PHS-BERT")
@@ -26,6 +26,8 @@ We used the embedding of the special token [CLS] of the last hidden layer as the
 We train and release a PLM to accelerate the automatic identification of tasks related to PHS on social media. Our work aims to develop a new computational method for screening users in need of early intervention and is not intended for use in clinical settings or as a diagnostic tool.
 
 ## BibTeX entry and citation info
+For more details, refer to the paper [Benchmarking for Public Health Surveillance tasks on Social Media with a Domain-Specific Pretrained Language Model](https://arxiv.org/abs/2204.04521).
+
 ```
 @misc{https://doi.org/10.48550/arxiv.2204.04521,
 doi = {10.48550/ARXIV.2204.04521},
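As a worked example of the Usage snippet and of the note that the embedding of the special token [CLS] from the last hidden layer is used as the sentence representation, here is a minimal sketch of how such a representation might be obtained. The example sentence, the `torch.no_grad()` wrapper, and the variable names are illustrative assumptions and are not part of the original README.

```
# Minimal sketch: obtain a sentence representation from PHS-BERT.
# The example text and variable names are illustrative, not from the README.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("publichealthsurveillance/PHS-BERT")
model = AutoModel.from_pretrained("publichealthsurveillance/PHS-BERT")

text = "Flu cases are rising in our area this week."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Embedding of the [CLS] token (position 0) from the last hidden layer,
# used as the sentence-level representation.
cls_embedding = outputs.last_hidden_state[:, 0, :]  # shape: (1, hidden_size)
print(cls_embedding.shape)
```

Indexing position 0 of `last_hidden_state` selects the [CLS] token, whose last-layer hidden state serves as the sentence-level representation for downstream PHS tasks.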