Commit e2fd71b · 1 Parent(s): 2924fa9

Update README.md
README.md CHANGED
@@ -122,6 +122,16 @@ The following hyperparameters were used during training:
 - Datasets 1.18.4
 - Tokenizers 0.11.6
 
+## Environmental Impact
+
+Carbon emissions are estimated with the [Machine Learning Impact calculator](https://mlco2.github.io/impact/#compute) by [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The carbon impact is estimated by specifying the hardware, runtime, cloud provider, and compute region.
+
+- Hardware Type: 1× NVIDIA RTX 3090 GPU (24 GB)
+- Time used: 4 minutes (0.07 hours)
+- Compute Region: Spain, Europe
+- Carbon Emitted (power consumption × time × carbon intensity of the local power grid): 0.01 kg CO2 eq.
+(Carbon offset: 0)
+
 ## Funding
 
 This model was created with the annotated dataset from the [NLPMedTerm project](http://www.lllf.uam.es/ESP/nlpmedterm_en.html), funded by InterTalentum UAM, a Marie Skłodowska-Curie COFUND grant (2019-2021) (H2020 program, contract number 713366), and by the Computational Linguistics Chair of the Knowledge Engineering Institute (IIC-UAM).
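For reference, the emissions figure added above follows the calculator's energy × carbon-intensity arithmetic. The sketch below is only a rough sanity check of that number, not the calculator's own computation; the 350 W draw (the RTX 3090's TDP) and the ~0.4 kg CO2 eq./kWh grid intensity are illustrative assumptions rather than values taken from the ML Impact calculator.

```python
# Rough sanity check of the reported ~0.01 kg CO2 eq. estimate.
# Assumed values (not from the ML Impact calculator):
#   - GPU draw: 350 W (RTX 3090 TDP)
#   - Grid carbon intensity: 0.4 kg CO2 eq. per kWh (illustrative)

gpu_power_kw = 0.350     # assumed average draw of the single RTX 3090
runtime_hours = 0.07     # 4 minutes of training, as reported above
grid_kg_per_kwh = 0.4    # assumed carbon intensity of the local power grid

energy_kwh = gpu_power_kw * runtime_hours       # ~0.0245 kWh
emissions_kg = energy_kwh * grid_kg_per_kwh     # ~0.0098 kg CO2 eq., i.e. ~0.01 kg

print(f"Estimated emissions: {emissions_kg:.4f} kg CO2 eq.")
```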