# roberta-large-finetuned-steam-reviews
This model is a fine-tuned version of [roberta-large](https://huggingface.co/FacebookAI/roberta-large) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.2560
## Model description
More information needed
## Intended uses & limitations
More information needed
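Although the card does not state the training objective, the reported loss values are consistent with masked-language modeling, the pretraining task of RoBERTa. A minimal, hypothetical usage sketch under that assumption (the model id is taken from this card; the `fill-mask` task is an assumption):

```python
from transformers import pipeline

MODEL_ID = "xingchenc/roberta-large-finetuned-steam-reviews"


def top_mask_predictions(text: str, k: int = 5):
    """Return the top-k fill-mask predictions for `text`.

    Assumption: the model was fine-tuned with the masked-language-modeling
    objective; the card itself does not confirm this.
    """
    fill_mask = pipeline("fill-mask", model=MODEL_ID, top_k=k)
    return [(p["token_str"], p["score"]) for p in fill_mask(text)]


# RoBERTa-family tokenizers use "<mask>" as the mask token, e.g.:
# top_mask_predictions("This game is absolutely <mask>.")
```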
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
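The hyperparameters above map onto a `transformers.TrainingArguments` configuration roughly as follows. This is a reconstruction sketch, not the author's script: `output_dir` is assumed, and the dataset, tokenizer, and data collator are omitted because the card does not describe them (Adam betas of (0.9, 0.999) and epsilon of 1e-08 are the library defaults, so they need no explicit arguments):

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir and every
# unlisted setting are assumptions.
args = TrainingArguments(
    output_dir="roberta-large-finetuned-steam-reviews",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```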
### Training results
Training Loss | Epoch | Step | Validation Loss
---|---|---|---
1.7565 | 0.2379 | 800 | 1.5387 |
1.6034 | 0.4758 | 1600 | 1.5048 |
1.5793 | 0.7136 | 2400 | 1.4671 |
1.5377 | 0.9515 | 3200 | 1.4388 |
1.5018 | 1.1894 | 4000 | 1.4363 |
1.4893 | 1.4273 | 4800 | 1.4168 |
1.4774 | 1.6652 | 5600 | 1.4002 |
1.4705 | 1.9031 | 6400 | 1.3901 |
1.4409 | 2.1409 | 7200 | 1.3821 |
1.4352 | 2.3788 | 8000 | 1.3821 |
1.4314 | 2.6167 | 8800 | 1.3848 |
1.4178 | 2.8546 | 9600 | 1.3745 |
1.4118 | 3.0925 | 10400 | 1.3683 |
1.3876 | 3.3304 | 11200 | 1.3423 |
1.4027 | 3.5682 | 12000 | 1.3441 |
1.3969 | 3.8061 | 12800 | 1.3571 |
1.3801 | 4.0440 | 13600 | 1.3552 |
1.3702 | 4.2819 | 14400 | 1.3220 |
1.3568 | 4.5198 | 15200 | 1.3181 |
1.3628 | 4.7577 | 16000 | 1.3268 |
1.3586 | 4.9955 | 16800 | 1.3256 |
1.3384 | 5.2334 | 17600 | 1.3316 |
1.3389 | 5.4713 | 18400 | 1.3142 |
1.3245 | 5.7092 | 19200 | 1.3105 |
1.3306 | 5.9471 | 20000 | 1.3171 |
1.3201 | 6.1850 | 20800 | 1.3080 |
1.3118 | 6.4228 | 21600 | 1.3021 |
1.3218 | 6.6607 | 22400 | 1.2908 |
1.3181 | 6.8986 | 23200 | 1.2886 |
1.2915 | 7.1365 | 24000 | 1.2868 |
1.2893 | 7.3744 | 24800 | 1.2856 |
1.3001 | 7.6123 | 25600 | 1.2904 |
1.2823 | 7.8501 | 26400 | 1.2806 |
1.276 | 8.0880 | 27200 | 1.3013 |
1.2732 | 8.3259 | 28000 | 1.2950 |
1.2825 | 8.5638 | 28800 | 1.2767 |
1.2811 | 8.8017 | 29600 | 1.2781 |
1.2746 | 9.0395 | 30400 | 1.2740 |
1.2683 | 9.2774 | 31200 | 1.2721 |
1.2705 | 9.5153 | 32000 | 1.2616 |
1.2531 | 9.7532 | 32800 | 1.2598 |
1.2628 | 9.9911 | 33600 | 1.2614 |
### Framework versions
- Transformers 4.41.2
- PyTorch 2.1.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1
## Model tree for xingchenc/roberta-large-finetuned-steam-reviews

Base model: [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large)