
2020-Q2-75p-filtered

This model is a fine-tuned version of cardiffnlp/twitter-roberta-base-2019-90m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.0187
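
The checkpoint inherits the masked-language-modeling head of its base model, cardiffnlp/twitter-roberta-base-2019-90m, so it can be queried for masked-token prediction. The snippet below is a minimal usage sketch, assuming the repository ID DouglasPontes/2020-Q2-75p-filtered and a fill-mask setup; the example sentence is an illustrative placeholder.

```python
from transformers import pipeline

# Minimal sketch: load the checkpoint as a fill-mask pipeline.
# Assumes a RoBERTa-style masked LM, like its base model.
fill_mask = pipeline("fill-mask", model="DouglasPontes/2020-Q2-75p-filtered")

# RoBERTa tokenizers use "<mask>" as the mask token.
for prediction in fill_mask("So excited for the <mask> tonight!"):
    print(f'{prediction["token_str"]:>12}  {prediction["score"]:.3f}')
```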

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1400
  • training_steps: 2400000
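
These settings map directly onto a Hugging Face TrainingArguments object. The block below is a sketch under that assumption; the output directory and the rest of the training script (dataset loading, data collator, Trainer wiring) are not part of this card.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# output_dir is a placeholder, not taken from the original card.
training_args = TrainingArguments(
    output_dir="2020-Q2-75p-filtered",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1400,
    max_steps=2_400_000,
)
```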

Training results

Training Loss Epoch Step Validation Loss
No log 0.07 8000 3.1520
3.3704 0.13 16000 3.1286
3.3704 0.2 24000 3.1079
3.2908 0.27 32000 3.0845
3.2908 0.34 40000 3.0868
3.2742 0.4 48000 3.0768
3.2742 0.47 56000 3.0706
3.2579 0.54 64000 3.0621
3.2579 0.61 72000 3.0659
3.2448 0.67 80000 3.0457
3.2448 0.74 88000 3.0554
3.2416 0.81 96000 3.0335
3.2416 0.88 104000 3.0321
3.23 0.94 112000 3.0137
3.23 1.01 120000 3.0061
3.2084 1.08 128000 3.0251
3.2084 1.15 136000 3.0092
3.2055 1.21 144000 3.0043
3.2055 1.28 152000 3.0055
3.2026 1.35 160000 3.0066
3.2026 1.41 168000 3.0125
3.2069 1.48 176000 3.0032
3.2069 1.55 184000 2.9959
3.1904 1.62 192000 2.9960
3.1904 1.68 200000 3.0038
3.1989 1.75 208000 3.0016
3.1989 1.82 216000 3.0049
3.2113 1.89 224000 3.0086
3.2113 1.95 232000 3.0104
3.217 2.02 240000 3.0166
3.217 2.09 248000 3.0139
3.2029 2.16 256000 3.0217
3.2029 2.22 264000 3.0238
3.2226 2.29 272000 3.0234
3.2226 2.36 280000 3.0216
3.2199 2.43 288000 3.0175
3.2199 2.49 296000 3.0365
3.2254 2.56 304000 3.0282
3.2254 2.63 312000 3.0228
3.2349 2.7 320000 3.0205
3.2349 2.76 328000 3.0406
3.2424 2.83 336000 3.0307
3.2424 2.9 344000 3.0413
3.2347 2.96 352000 3.0401
3.2347 3.03 360000 3.0520
3.2476 3.1 368000 3.0489
3.2476 3.17 376000 3.0521
3.2506 3.23 384000 3.0685
3.2506 3.3 392000 3.0546
3.2547 3.37 400000 3.0542
3.2547 3.44 408000 3.0537
3.2519 3.5 416000 3.0588
3.2519 3.57 424000 3.0729
3.2679 3.64 432000 3.0842
3.2679 3.71 440000 3.0685
3.2656 3.77 448000 3.0942
3.2656 3.84 456000 3.0942
3.2908 3.91 464000 3.0918
3.2908 3.98 472000 3.0922
3.2944 4.04 480000 3.1093
3.2944 4.11 488000 3.1158
3.2917 4.18 496000 3.0997
3.2917 4.24 504000 3.1111
3.2916 4.31 512000 3.1133
3.2916 4.38 520000 3.1129
3.2836 4.45 528000 3.1134
3.2836 4.51 536000 3.1058
3.3068 4.58 544000 3.1211
3.3068 4.65 552000 3.0946
3.3026 4.72 560000 3.1079
3.3026 4.78 568000 3.1202
3.3078 4.85 576000 3.1155
3.3078 4.92 584000 3.1254
3.3168 4.99 592000 3.1279
3.3168 5.05 600000 3.1179
3.3113 5.12 608000 3.1277
3.3113 5.19 616000 3.1334
3.3102 5.26 624000 3.1233
3.3102 5.32 632000 3.1274
3.3235 5.39 640000 3.1434
3.3235 5.46 648000 3.1368
3.331 5.53 656000 3.1591
3.331 5.59 664000 3.1546
3.3308 5.66 672000 3.1663
3.3308 5.73 680000 3.1535
3.3396 5.79 688000 3.1558
3.3396 5.86 696000 3.1698
3.3558 5.93 704000 3.1651
3.3558 6.0 712000 3.1706
3.3474 6.06 720000 3.1942
3.3474 6.13 728000 3.1705
3.3513 6.2 736000 3.1834
3.3513 6.27 744000 3.1810
3.362 6.33 752000 3.1723
3.362 6.4 760000 3.1827
3.3694 6.47 768000 3.1937
3.3694 6.54 776000 3.2004
3.378 6.6 784000 3.2023
3.378 6.67 792000 3.1936
3.3703 6.74 800000 3.1948
3.3703 6.81 808000 3.2082
3.3838 6.87 816000 3.1974
3.3838 6.94 824000 3.2029
3.3871 7.01 832000 3.2160
3.3871 7.07 840000 3.2198
3.3839 7.14 848000 3.2190
3.3839 7.21 856000 3.2204
3.389 7.28 864000 3.2188
3.389 7.34 872000 3.2246
3.398 7.41 880000 3.2333
3.398 7.48 888000 3.2168
3.4001 7.55 896000 3.2311
3.4001 7.61 904000 3.2390
3.4255 7.68 912000 3.2447
3.4255 7.75 920000 3.2546
3.4218 7.82 928000 3.2510
3.4218 7.88 936000 3.2433
3.4326 7.95 944000 3.2509
3.4326 8.02 952000 3.2573
3.4268 8.09 960000 3.2499
3.4268 8.15 968000 3.2704
3.4165 8.22 976000 3.2579
3.4165 8.29 984000 3.2669
3.4425 8.36 992000 3.2723
3.4425 8.42 1000000 3.2718
3.4433 8.49 1008000 3.2655
3.4433 8.56 1016000 3.2794
3.4437 8.62 1024000 3.2808
3.4437 8.69 1032000 3.2731
3.4499 8.76 1040000 3.2785
3.4499 8.83 1048000 3.2823
3.4593 8.89 1056000 3.2844
3.4593 8.96 1064000 3.2877
3.4481 9.03 1072000 3.2969
3.4481 9.1 1080000 3.2870
3.4542 9.16 1088000 3.2946
3.4542 9.23 1096000 3.2901
3.4547 9.3 1104000 3.2813
3.4547 9.37 1112000 3.2910
3.4618 9.43 1120000 3.2978
3.4618 9.5 1128000 3.3055
3.46 9.57 1136000 3.2885
3.46 9.64 1144000 3.2871
3.4572 9.7 1152000 3.2905
3.4572 9.77 1160000 3.3006
3.4597 9.84 1168000 3.3081
3.4597 9.9 1176000 3.3031
3.4651 9.97 1184000 3.2883
3.4651 10.04 1192000 3.3189
3.4571 10.11 1200000 3.2978
3.4571 10.17 1208000 3.3091
3.4567 10.24 1216000 3.2755
3.4567 10.31 1224000 3.2968
3.4584 10.38 1232000 3.2991
3.4584 10.44 1240000 3.2818
3.4459 10.51 1248000 3.2823
3.4459 10.58 1256000 3.2800
3.4474 10.65 1264000 3.2856
3.4474 10.71 1272000 3.2845
3.4383 10.78 1280000 3.2804
3.4383 10.85 1288000 3.2707
3.4496 10.92 1296000 3.2824
3.4496 10.98 1304000 3.2765
3.4411 11.05 1312000 3.2838
3.4411 11.12 1320000 3.2839
3.4305 11.19 1328000 3.2748
3.4305 11.25 1336000 3.2821
3.4258 11.32 1344000 3.2746
3.4258 11.39 1352000 3.2861
3.4227 11.45 1360000 3.2710
3.4227 11.52 1368000 3.2788
3.4319 11.59 1376000 3.2794
3.4319 11.66 1384000 3.2766
3.436 11.72 1392000 3.2924
3.436 11.79 1400000 3.2812
3.4368 11.86 1408000 3.2851
3.4368 11.93 1416000 3.2822
3.4346 11.99 1424000 3.2657
3.4346 12.06 1432000 3.2748
3.4265 12.13 1440000 3.2685
3.4265 12.2 1448000 3.2947
3.4306 12.26 1456000 3.2841
3.4306 12.33 1464000 3.2748
3.4254 12.4 1472000 3.2794
3.4254 12.47 1480000 3.2774
3.4353 12.53 1488000 3.2726
3.4353 12.6 1496000 3.2763
3.4358 12.67 1504000 3.2659
3.4358 12.73 1512000 3.2710
3.4182 12.8 1520000 3.2777
3.4182 12.87 1528000 3.2824
3.4384 12.94 1536000 3.2887
3.4384 13.0 1544000 3.2667
3.4287 13.07 1552000 3.2713
3.4287 13.14 1560000 3.2640
3.4181 13.21 1568000 3.2607
3.4181 13.27 1576000 3.2643
3.4173 13.34 1584000 3.2630
3.4173 13.41 1592000 3.2572
3.4214 13.48 1600000 3.2728
3.4214 13.54 1608000 3.2822
3.4223 13.61 1616000 3.2704
3.4223 13.68 1624000 3.2634
3.417 13.75 1632000 3.2691
3.417 13.81 1640000 3.2550
3.4146 13.88 1648000 3.2529
3.4146 13.95 1656000 3.2713
3.4186 14.02 1664000 3.2672
3.4186 14.08 1672000 3.2542
3.4082 14.15 1680000 3.2576
3.4082 14.22 1688000 3.2680
3.4186 14.28 1696000 3.2667
3.4186 14.35 1704000 3.2694
3.4131 14.42 1712000 3.2606
3.4131 14.49 1720000 3.2622
3.4239 14.55 1728000 3.2678
3.4239 14.62 1736000 3.2708
3.4197 14.69 1744000 3.2622
3.4197 14.76 1752000 3.2605
3.4073 14.82 1760000 3.2647
3.4073 14.89 1768000 3.2619
3.4167 14.96 1776000 3.2816
3.4167 15.03 1784000 3.2603
3.413 15.09 1792000 3.2661
3.413 15.16 1800000 3.2589
3.4117 15.23 1808000 3.2688
3.4117 15.3 1816000 3.2678
3.4103 15.36 1824000 3.2661
3.4103 15.43 1832000 3.2705
3.4074 15.5 1840000 3.2670
3.4074 15.56 1848000 3.2619
3.4167 15.63 1856000 3.2624
3.4167 15.7 1864000 3.2552
3.4195 15.77 1872000 3.2503
3.4195 15.83 1880000 3.2606
3.4091 15.9 1888000 3.2812
3.4091 15.97 1896000 3.2837
3.4116 16.04 1904000 3.2658
3.4116 16.1 1912000 3.2676
3.4183 16.17 1920000 3.2770
3.4183 16.24 1928000 3.2756
3.4177 16.31 1936000 3.2876
3.4177 16.37 1944000 3.2612
3.4226 16.44 1952000 3.2748
3.4226 16.51 1960000 3.2679
3.4154 16.58 1968000 3.2659
3.4154 16.64 1976000 3.2689
3.4199 16.71 1984000 3.2701
3.4199 16.78 1992000 3.2564
3.4166 16.85 2000000 3.2714
3.4166 16.91 2008000 3.2738
3.4054 16.98 2016000 3.2633
3.4054 17.05 2024000 3.2574
3.4022 17.11 2032000 3.2637
3.4022 17.18 2040000 3.2688
3.408 17.25 2048000 3.2667
3.408 17.32 2056000 3.2578
3.4065 17.38 2064000 3.2605
3.4065 17.45 2072000 3.2768
3.4105 17.52 2080000 3.2569
3.4105 17.59 2088000 3.2519
3.4011 17.65 2096000 3.2555
3.4011 17.72 2104000 3.2488
3.4078 17.79 2112000 3.2516
3.4078 17.86 2120000 3.2527
3.4105 17.92 2128000 3.2561
3.4105 17.99 2136000 3.2580
3.4054 18.06 2144000 3.2453
3.4054 18.13 2152000 3.2426
3.3937 18.19 2160000 3.2517
3.3937 18.26 2168000 3.2446
3.4001 18.33 2176000 3.2449
3.4001 18.39 2184000 3.2527
3.413 18.46 2192000 3.2557
3.413 18.53 2200000 3.2483
3.3882 18.6 2208000 3.2520
3.3882 18.66 2216000 3.2354
3.3974 18.73 2224000 3.2540
3.3974 18.8 2232000 3.2426
3.3864 18.87 2240000 3.2341
3.3864 18.93 2248000 3.2408
3.3896 19.0 2256000 3.2342
3.3896 19.07 2264000 3.2415
3.3845 19.14 2272000 3.2445
3.3845 19.2 2280000 3.2422
3.3916 19.27 2288000 3.2379
3.3916 19.34 2296000 3.2411
3.3919 19.41 2304000 3.2429
3.3919 19.47 2312000 3.2372
3.39 19.54 2320000 3.2380
3.39 19.61 2328000 3.2353
3.3905 19.68 2336000 3.2327
3.3905 19.74 2344000 3.2494
3.3826 19.81 2352000 3.2369
3.3826 19.88 2360000 3.2390
3.3935 19.94 2368000 3.2415
3.3935 20.01 2376000 3.2486
3.3846 20.08 2384000 3.2354
3.3846 20.15 2392000 3.2466
3.3875 20.21 2400000 3.2425
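
To visualize how validation loss evolved over training, the sketch below plots a handful of checkpoints copied from the table above; matplotlib is an assumption here and is not listed among the framework versions.

```python
import matplotlib.pyplot as plt

# A few (step, validation loss) checkpoints copied from the table above.
steps = [8_000, 184_000, 592_000, 1_184_000, 2_400_000]
val_loss = [3.1520, 2.9959, 3.1279, 3.2883, 3.2425]

plt.plot(steps, val_loss, marker="o")
plt.xlabel("Training step")
plt.ylabel("Validation loss")
plt.title("Validation loss at selected checkpoints")
plt.show()
```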

Framework versions

  • Transformers 4.35.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0
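
To check that a local environment matches these versions before attempting to reproduce results, a quick sketch (not part of the original card; note that Transformers 4.35.0.dev0 was a development build installed from source rather than a PyPI release):

```python
import transformers, torch, datasets, tokenizers

# Versions reported in this model card.
expected = {
    "transformers": "4.35.0.dev0",
    "torch": "2.0.1+cu117",
    "datasets": "2.14.5",
    "tokenizers": "0.14.0",
}

for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    print(f"{name:12} installed={module.__version__:15} card={expected[name]}")
```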