yoshitomo-matsubara committed on
Commit
fed1047
1 Parent(s): 6c2fde0

initial commit

README.md ADDED
@@ -0,0 +1,17 @@
+ ---
+ language: en
+ tags:
+ - bert
+ - wnli
+ - glue
+ - torchdistill
+ license: apache-2.0
+ datasets:
+ - wnli
+ metrics:
+ - accuracy
+ ---
+
+ `bert-base-uncased` fine-tuned on the WNLI dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+ The hyperparameters are the same as those in Hugging Face's example and/or the BERT paper, and the training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/wnli/ce/bert_base_uncased.yaml).
+ I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **77.9**.
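The model card above carries no usage snippet, so here is a minimal sketch of loading the fine-tuned checkpoint with `transformers`. The repo id is an assumption inferred from the file naming in this commit; substitute the actual Hub id.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id (not stated in this commit); substitute the real Hub id.
model_id = "yoshitomo-matsubara/bert-base-uncased-wnli"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# WNLI is a sentence-pair entailment task, so encode both sentences together.
inputs = tokenizer(
    "The trophy doesn't fit in the suitcase because it is too big.",
    "The trophy is too big.",
    return_tensors="pt",
)
pred = model(**inputs).logits.argmax(dim=-1).item()
print(pred)  # 0 = not_entailment, 1 = entailment (GLUE WNLI label order)
```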
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+ "_name_or_path": "bert-base-uncased",
+ "architectures": [
+ "BertForSequenceClassification"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "finetuning_task": "wnli",
+ "gradient_checkpointing": false,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 768,
+ "initializer_range": 0.02,
+ "intermediate_size": 3072,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 12,
+ "pad_token_id": 0,
+ "position_embedding_type": "absolute",
+ "problem_type": "single_label_classification",
+ "transformers_version": "4.6.1",
+ "type_vocab_size": 2,
+ "use_cache": true,
+ "vocab_size": 30522
+ }
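For reference, a sketch of rebuilding the architecture from this config alone, assuming the file is saved locally as `config.json` (the weights come separately from `pytorch_model.bin`):

```python
from transformers import BertConfig, BertForSequenceClassification

# Rebuild the architecture from the committed config (no weights loaded yet).
config = BertConfig.from_json_file("config.json")
model = BertForSequenceClassification(config)  # randomly initialized encoder + head
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)  # 12 768 12
```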
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:379139c76816a5099d83e7f12667fb0676a480fee8d5de12ca772907523dd082
+ size 438024457
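This is a Git LFS pointer, not the weights themselves; the `oid` is the SHA-256 digest of the real file. A small sketch for verifying a downloaded `pytorch_model.bin` against the pointer:

```python
import hashlib
import os

# Expected digest and size, taken from the LFS pointer above.
EXPECTED_SHA256 = "379139c76816a5099d83e7f12667fb0676a480fee8d5de12ca772907523dd082"
EXPECTED_SIZE = 438024457

h = hashlib.sha256()
with open("pytorch_model.bin", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert os.path.getsize("pytorch_model.bin") == EXPECTED_SIZE, "size mismatch"
assert h.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("pytorch_model.bin matches the LFS pointer")
```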
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
training.log ADDED
@@ -0,0 +1,50 @@
+ 2021-05-29 15:45:57,389 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/wnli/ce/bert_base_uncased.yaml', log='log/glue/wnli/ce/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='wnli', test_only=False, world_size=1)
+ 2021-05-29 15:45:57,468 INFO __main__ Distributed environment: NO
+ Num processes: 1
+ Process index: 0
+ Local process index: 0
+ Device: cuda
+ Use FP16 precision: True
+
+ 2021-05-29 15:45:57,839 INFO filelock Lock 140008728159440 acquired on /root/.cache/huggingface/transformers/3c61d016573b14f7f008c02c4e51a366c67ab274726fe2910691e2a761acf43e.37395cee442ab11005bcd270f3c34464dc1704b715b5d7d52b1a461abe3b9e4e.lock
+ 2021-05-29 15:45:58,198 INFO filelock Lock 140008728159440 released on /root/.cache/huggingface/transformers/3c61d016573b14f7f008c02c4e51a366c67ab274726fe2910691e2a761acf43e.37395cee442ab11005bcd270f3c34464dc1704b715b5d7d52b1a461abe3b9e4e.lock
+ 2021-05-29 15:45:58,909 INFO filelock Lock 140008728159440 acquired on /root/.cache/huggingface/transformers/45c3f7a79a80e1cf0a489e5c62b43f173c15db47864303a55d623bb3c96f72a5.d789d64ebfe299b0e416afc4a169632f903f693095b4629a7ea271d5a0cf2c99.lock
+ 2021-05-29 15:45:59,428 INFO filelock Lock 140008728159440 released on /root/.cache/huggingface/transformers/45c3f7a79a80e1cf0a489e5c62b43f173c15db47864303a55d623bb3c96f72a5.d789d64ebfe299b0e416afc4a169632f903f693095b4629a7ea271d5a0cf2c99.lock
+ 2021-05-29 15:45:59,782 INFO filelock Lock 140008689948496 acquired on /root/.cache/huggingface/transformers/534479488c54aeaf9c3406f647aa2ec13648c06771ffe269edabebd4c412da1d.7f2721073f19841be16f41b0a70b600ca6b880c8f3df6f3535cbc704371bdfa4.lock
+ 2021-05-29 15:46:00,315 INFO filelock Lock 140008689948496 released on /root/.cache/huggingface/transformers/534479488c54aeaf9c3406f647aa2ec13648c06771ffe269edabebd4c412da1d.7f2721073f19841be16f41b0a70b600ca6b880c8f3df6f3535cbc704371bdfa4.lock
+ 2021-05-29 15:46:01,369 INFO filelock Lock 140008689947152 acquired on /root/.cache/huggingface/transformers/c1d7f0a763fb63861cc08553866f1fc3e5a6f4f07621be277452d26d71303b7e.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
+ 2021-05-29 15:46:01,723 INFO filelock Lock 140008689947152 released on /root/.cache/huggingface/transformers/c1d7f0a763fb63861cc08553866f1fc3e5a6f4f07621be277452d26d71303b7e.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
+ 2021-05-29 15:46:02,102 INFO filelock Lock 140008689532752 acquired on /root/.cache/huggingface/transformers/a8041bf617d7f94ea26d15e218abd04afc2004805632abc0ed2066aa16d50d04.faf6ea826ae9c5867d12b22257f9877e6b8367890837bd60f7c54a29633f7f2f.lock
+ 2021-05-29 15:46:13,325 INFO filelock Lock 140008689532752 released on /root/.cache/huggingface/transformers/a8041bf617d7f94ea26d15e218abd04afc2004805632abc0ed2066aa16d50d04.faf6ea826ae9c5867d12b22257f9877e6b8367890837bd60f7c54a29633f7f2f.lock
+ 2021-05-29 15:46:19,041 INFO __main__ Start training
+ 2021-05-29 15:46:19,041 INFO torchdistill.models.util [student model]
+ 2021-05-29 15:46:19,041 INFO torchdistill.models.util Using the original student model
+ 2021-05-29 15:46:19,042 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+ 2021-05-29 15:46:25,885 INFO torchdistill.misc.log Epoch: [0] [ 0/20] eta: 0:00:07 lr: 2.97e-05 sample/s: 11.322545660444082 loss: 0.6760 (0.6760) time: 0.3781 data: 0.0248 max mem: 2057
+ 2021-05-29 15:46:31,412 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:05
+ 2021-05-29 15:46:31,652 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:46:31,652 INFO __main__ Validation: accuracy = 0.43661971830985913
+ 2021-05-29 15:46:31,652 INFO __main__ Updating ckpt at ./resource/ckpt/glue/wnli/ce/wnli-bert-base-uncased
+ 2021-05-29 15:46:33,223 INFO torchdistill.misc.log Epoch: [1] [ 0/20] eta: 0:00:05 lr: 2.37e-05 sample/s: 14.398164143995235 loss: 0.6930 (0.6930) time: 0.2852 data: 0.0074 max mem: 4061
+ 2021-05-29 15:46:38,800 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:05
+ 2021-05-29 15:46:39,034 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:46:39,034 INFO __main__ Validation: accuracy = 0.49295774647887325
+ 2021-05-29 15:46:39,034 INFO __main__ Updating ckpt at ./resource/ckpt/glue/wnli/ce/wnli-bert-base-uncased
+ 2021-05-29 15:46:40,617 INFO torchdistill.misc.log Epoch: [2] [ 0/20] eta: 0:00:06 lr: 1.77e-05 sample/s: 12.36997828627464 loss: 0.6935 (0.6935) time: 0.3296 data: 0.0062 max mem: 4062
+ 2021-05-29 15:46:46,241 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:05
+ 2021-05-29 15:46:46,476 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:46:46,476 INFO __main__ Validation: accuracy = 0.4225352112676056
+ 2021-05-29 15:46:46,756 INFO torchdistill.misc.log Epoch: [3] [ 0/20] eta: 0:00:05 lr: 1.1700000000000001e-05 sample/s: 14.60795002868962 loss: 0.6947 (0.6947) time: 0.2787 data: 0.0049 max mem: 4062
+ 2021-05-29 15:46:52,374 INFO torchdistill.misc.log Epoch: [3] Total time: 0:00:05
+ 2021-05-29 15:46:52,611 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:46:52,612 INFO __main__ Validation: accuracy = 0.5915492957746479
+ 2021-05-29 15:46:52,612 INFO __main__ Updating ckpt at ./resource/ckpt/glue/wnli/ce/wnli-bert-base-uncased
+ 2021-05-29 15:46:54,220 INFO torchdistill.misc.log Epoch: [4] [ 0/20] eta: 0:00:06 lr: 5.7000000000000005e-06 sample/s: 12.286671143721517 loss: 0.6878 (0.6878) time: 0.3314 data: 0.0058 max mem: 4062
+ 2021-05-29 15:46:59,858 INFO torchdistill.misc.log Epoch: [4] Total time: 0:00:05
+ 2021-05-29 15:47:00,093 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:47:00,094 INFO __main__ Validation: accuracy = 0.5915492957746479
+ 2021-05-29 15:47:04,186 INFO __main__ [Student: bert-base-uncased]
+ 2021-05-29 15:47:04,431 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
+ 2021-05-29 15:47:04,431 INFO __main__ Test: accuracy = 0.5915492957746479
+ 2021-05-29 15:47:04,431 INFO __main__ Start prediction for private dataset(s)
+ 2021-05-29 15:47:04,432 INFO __main__ wnli/test: 146 samples
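The log shows the checkpoint being updated only when validation accuracy improves, ending at 0.5915 on both validation and test. A small sketch for pulling those per-epoch accuracies out of `training.log`:

```python
import re

# Extract the per-epoch validation accuracies logged above.
pattern = re.compile(r"Validation: accuracy = ([0-9.]+)")

with open("training.log") as f:
    accuracies = [float(m.group(1)) for m in pattern.finditer(f.read())]

print(accuracies)       # [0.4366..., 0.4929..., 0.4225..., 0.5915..., 0.5915...]
print(max(accuracies))  # 0.5915492957746479 -- the checkpointed best epoch
```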
vocab.txt ADDED
The diff for this file is too large to render. See raw diff