publichealthsurveillance committed
Commit 31615ed · 1 Parent(s): d1a00fc
Update README.md

README.md CHANGED
@@ -23,7 +23,7 @@ Each sequence of BERT LM inputs is converted to 50,265 vocabulary tokens. Twitte
 We used the embedding of the special token [CLS] of the last hidden layer as the final feature of the input text. We adopted a multilayer perceptron (MLP) with the hyperbolic tangent activation function and used the Adam optimizer. The models are trained with a one-cycle policy at a maximum learning rate of 2e-05 with momentum cycled between 0.85 and 0.95.
 
 ## Societal Impact
-We train and release a PLM to accelerate the automatic identification of tasks
+We train and release a PLM to accelerate the automatic identification of tasks related to PHS on social media. Our work aims to develop a new computational method for screening users in need of early intervention and is not intended for use in clinical settings or as a diagnostic tool.
 
 ## BibTex entry and citation info
 ```
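The training setup described in the diff's context line (a tanh MLP over the last-layer [CLS] embedding, Adam, and a one-cycle policy at a maximum learning rate of 2e-05 with momentum cycled between 0.85 and 0.95) can be sketched in PyTorch. This is a minimal illustration, not the released model: the hidden size, class count, step count, and the random tensor standing in for the BERT [CLS] embedding are all assumptions.

```python
# Hedged sketch of the README's fine-tuning head: an MLP with tanh over the
# [CLS] embedding, trained with Adam under a one-cycle policy. Sizes and the
# synthetic input are illustrative assumptions, not the model's actual data.
import torch
import torch.nn as nn

HIDDEN = 768       # typical BERT hidden size (assumption)
NUM_CLASSES = 2    # e.g. PHS-related vs. not (assumption)

head = nn.Sequential(
    nn.Linear(HIDDEN, HIDDEN),
    nn.Tanh(),                        # hyperbolic tangent activation
    nn.Linear(HIDDEN, NUM_CLASSES),
)

optimizer = torch.optim.Adam(head.parameters(), lr=2e-5)
# One-cycle policy: lr peaks at 2e-05; for Adam, OneCycleLR cycles beta1
# (the "momentum" term) between 0.85 and 0.95, matching the README.
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=2e-5, total_steps=100,
    base_momentum=0.85, max_momentum=0.95,
)

# One illustrative training step on a stand-in [CLS] batch.
cls_embedding = torch.randn(8, HIDDEN)   # stand-in for BERT [CLS] output
labels = torch.randint(0, NUM_CLASSES, (8,))
loss = nn.functional.cross_entropy(head(cls_embedding), labels)
loss.backward()
optimizer.step()
scheduler.step()
```

In practice the `cls_embedding` would come from the pretrained LM's last hidden layer at the [CLS] position, and `total_steps` would be the full fine-tuning schedule length.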