---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: IKT_classifier_transport_ghg_best
  results: []
widget:
- text: >-
    Forestry, forestry and wildlife: "Unconditional Contribution In the
    unconditional scenario, GHG emissions would be reduced by 27.56 Mt CO2e
    (6.73%) below BAU in 2030 in the respective sectors. 26.3 Mt CO2e (95.4%)
    of this emission reduction will be from the Energy sector while 0.64
    (2.3%) and 0.6 (2.2%) Mt CO2e reduction will be from AFOLU (agriculture)
    and waste sector respectively. There will be no reduction in the IPPU
    sector. Conditional Contribution In the conditional scenario, GHG
    emissions would be reduced by 61.9 Mt CO2e (15.12%) below BAU in 2030 in
    the respective sectors."
  example_title: GHG
- text: >-
    "Key Long-Term Climate Actions Cleaner and greener vehicles on our roads
    Singapore is working to enhance the overall carbon efficiency of our land
    transport system through the large-scale adoption of green vehicles. By
    2040, we aim to phase out internal combustion engine vehicles and have all
    vehicles running on cleaner energy. We will introduce policies and
    initiatives to encourage the adoption of EVs. The public sector itself
    will take the lead and progressively procure and use cleaner vehicles."
  example_title: NOT_GHG
- text: >-
    "This includes installation of rooftop PV panels for electricity
    generation, 5,300 solar water heaters, and expand the use of LED lighting
    in residential sector by 2030. • Expanding on energy efficiency labels and
    specifications for appliances programme, elimination of non-energy
    efficient equipment, and raising awareness among consumers on purchasing
    alternative energy efficient home appliances."
  example_title: NEGATIVE
---
# IKT_classifier_transport_ghg_best
This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) on the GIZ/policy_qa_v0_1 dataset. It achieves the following results on the evaluation set:
- Loss: 0.5948
- Precision Macro: 0.8995
- Precision Weighted: 0.8712
- Recall Macro: 0.8177
- Recall Weighted: 0.8605
- F1-score: 0.8456
- Accuracy: 0.8605
## Model description
More information needed
## Intended uses & limitations
More information needed
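
In the absence of documented usage, below is a minimal inference sketch using the `transformers` text-classification pipeline. The bare model ID is a placeholder (substitute the full hub repository ID or a local checkpoint path), and the label names are inferred from the widget examples above rather than confirmed by this card.

```python
from transformers import pipeline

# Placeholder model ID: replace with the actual hub repo ID
# (e.g. "<org>/IKT_classifier_transport_ghg_best") or a local checkpoint path.
classifier = pipeline(
    "text-classification",
    model="IKT_classifier_transport_ghg_best",
)

text = (
    "In the unconditional scenario, GHG emissions would be reduced by "
    "27.56 Mt CO2e (6.73%) below BAU in 2030 in the respective sectors."
)

# Returns the top label and score, e.g. [{'label': 'GHG', 'score': 0.98}].
# Label names such as GHG / NOT_GHG / NEGATIVE are assumed from the widget examples.
print(classifier(text))
```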
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.900299287565753e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100.0
- num_epochs: 8
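
For reference, a minimal sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder, and everything else mirrors the list above (the Adam betas/epsilon and linear scheduler correspond to the reported optimizer settings):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="IKT_classifier_transport_ghg_best",  # hypothetical output path
    learning_rate=6.900299287565753e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=8,
)
```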
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision Macro | Precision Weighted | Recall Macro | Recall Weighted | F1-score | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:------------------:|:------------:|:---------------:|:--------:|:--------:|
| No log        | 1.0   | 52   | 0.9196          | 0.5132          | 0.6619             | 0.5936       | 0.7674          | 0.5493   | 0.7674   |
| No log        | 2.0   | 104  | 0.4997          | 0.9079          | 0.8830             | 0.7807       | 0.8605          | 0.8112   | 0.8605   |
| No log        | 3.0   | 156  | 0.4113          | 0.7992          | 0.8372             | 0.7992       | 0.8372          | 0.7992   | 0.8372   |
| No log        | 4.0   | 208  | 0.3726          | 0.9186          | 0.8935             | 0.8713       | 0.8837          | 0.8898   | 0.8837   |
| No log        | 5.0   | 260  | 0.5869          | 0.8687          | 0.8312             | 0.7446       | 0.8140          | 0.7758   | 0.8140   |
| No log        | 6.0   | 312  | 0.5321          | 0.8463          | 0.8593             | 0.8168       | 0.8605          | 0.8293   | 0.8605   |
| No log        | 7.0   | 364  | 0.5608          | 0.9149          | 0.8907             | 0.8353       | 0.8837          | 0.8632   | 0.8837   |
| No log        | 8.0   | 416  | 0.5948          | 0.8995          | 0.8712             | 0.8177       | 0.8605          | 0.8456   | 0.8605   |
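
The macro and weighted precision/recall columns above can be recomputed with scikit-learn; below is a minimal sketch of a `Trainer`-style metrics helper. The function name and the macro averaging assumed for the F1-score column are assumptions, not taken from this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    """Hypothetical helper producing the metric columns reported above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "precision_macro": precision_score(labels, preds, average="macro"),
        "precision_weighted": precision_score(labels, preds, average="weighted"),
        "recall_macro": recall_score(labels, preds, average="macro"),
        "recall_weighted": recall_score(labels, preds, average="weighted"),
        "f1-score": f1_score(labels, preds, average="macro"),  # averaging assumed
        "accuracy": accuracy_score(labels, preds),
    }
```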
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3