
bert-base-cased-finetuned-Stromberg_NLP_Twitter-PoS

This model is a fine-tuned version of bert-base-cased on the twitter_pos_vcb dataset. It achieves the following results on the evaluation set (Number is each tag's support; a short note on how these per-tag metrics are produced follows the list):

  • Loss: 0.0533

  • '''

    • Precision: 0.9580645161290322
    • Recall: 0.9519230769230769
    • F1: 0.954983922829582
    • Number: 312
  • B

    • Precision: 0.9658270558694287
    • Recall: 0.9655240037652966
    • F1: 0.9656755060411109
    • Number: 25496
  • Bd

    • Precision: 0.9630099728014506
    • Recall: 0.9572819033886085
    • F1: 0.9601373949200036
    • Number: 5548
  • Bg

    • Precision: 0.9836065573770492
    • Recall: 0.9853434575313438
    • F1: 0.9844742413549753
    • Number: 5663
  • Bn

    • Precision: 0.9182209469153515
    • Recall: 0.9116809116809117
    • F1: 0.9149392423159399
    • Number: 2106
  • Bp

    • Precision: 0.9672037914691943
    • Recall: 0.9663488856619736
    • F1: 0.9667761495704902
    • Number: 15839
  • Br

    • Precision: 0.94
    • Recall: 0.8785046728971962
    • F1: 0.9082125603864735
    • Number: 107
  • Bs

    • Precision: 0.9848484848484849
    • Recall: 0.9701492537313433
    • F1: 0.9774436090225564
    • Number: 67
  • Bz

    • Precision: 0.9865819209039548
    • Recall: 0.9850167459897762
    • F1: 0.9857987121813531
    • Number: 5673
  • C

    • Precision: 0.9993461203138623
    • Recall: 0.9993461203138623
    • F1: 0.9993461203138623
    • Number: 4588
  • D

    • Precision: 0.9876836325864372
    • Recall: 0.9895926256318763
    • F1: 0.988637207575195
    • Number: 6726
  • Dt

    • Precision: 1.0
    • Recall: 0.8
    • F1: 0.888888888888889
    • Number: 15
  • H

    • Precision: 0.9487382595903587
    • Recall: 0.9305216426193119
    • F1: 0.9395416596626883
    • Number: 9010
  • J

    • Precision: 0.9803528468323978
    • Recall: 0.980588754311382
    • F1: 0.9804707863816818
    • Number: 12467
  • Jr

    • Precision: 0.9400386847195358
    • Recall: 0.9818181818181818
    • F1: 0.9604743083003953
    • Number: 495
  • Js

    • Precision: 0.9612141652613828
    • Recall: 0.991304347826087
    • F1: 0.9760273972602741
    • Number: 575
  • N

    • Precision: 0.9795543362923471
    • Recall: 0.9793769083475651
    • F1: 0.9794656142847902
    • Number: 38646
  • Np

    • Precision: 0.9330242966751918
    • Recall: 0.9278334128119536
    • F1: 0.9304216147286205
    • Number: 6291
  • Nps

    • Precision: 0.75
    • Recall: 0.23076923076923078
    • F1: 0.3529411764705882
    • Number: 26
  • Ns

    • Precision: 0.9691858990616282
    • Recall: 0.9773657289002557
    • F1: 0.9732586272762003
    • Number: 7820
  • O

    • Precision: 0.9984323288625675
    • Recall: 0.999302649930265
    • F1: 0.9988672998170254
    • Number: 5736
  • Os

    • Precision: 1.0
    • Recall: 0.9952267303102625
    • F1: 0.9976076555023923
    • Number: 419
  • P

    • Precision: 0.9887869520897044
    • Recall: 0.9918200408997955
    • F1: 0.9903011740684022
    • Number: 2934
  • Rb

    • Precision: 0.9971910112359551
    • Recall: 0.9983929288871033
    • F1: 0.9977916081108211
    • Number: 2489
  • Rl

    • Precision: 1.0
    • Recall: 0.9997228381374723
    • F1: 0.9998613998613999
    • Number: 3608
  • Rp

    • Precision: 0.9979960600502683
    • Recall: 0.9980638586956522
    • F1: 0.9980299582215278
    • Number: 29440
  • Rp$

    • Precision: 0.9975770162686051
    • Recall: 0.9972318339100346
    • F1: 0.9974043952240872
    • Number: 5780
  • Sr

    • Precision: 0.9998923110058152
    • Recall: 0.9998384752059442
    • F1: 0.9998653923812088
    • Number: 18573
  • T

    • Precision: 0.9987569919204475
    • Recall: 0.9984811874352779
    • F1: 0.9986190706345371
    • Number: 28970
  • W

    • Precision: 0.0
    • Recall: 0.0
    • F1: 0.0
    • Number: 1
  • X

    • Precision: 0.9466666666666667
    • Recall: 0.9594594594594594
    • F1: 0.9530201342281879
    • Number: 74
  • Ym

    • Precision: 0.0
    • Recall: 0.0
    • F1: 0.0
    • Number: 5
  • ' '

    • Precision: 0.9951481772882245
    • Recall: 0.9949524745984923
    • F1: 0.9950503163208444
    • Number: 15255
  • '`'

    • Precision: 0.9540229885057471
    • Recall: 0.9595375722543352
    • F1: 0.956772334293948
    • Number: 173
  • Overall

    • Precision: 0.9828
    • Recall: 0.9820
    • F1: 0.9824
    • Accuracy: 0.9860
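
These per-tag figures (Precision / Recall / F1 / Number) match the output format of the seqeval metric used in the standard Hugging Face token-classification recipes. The sketch below is a hedged assumption about how numbers of this form are produced, not a copy of the linked notebook; the label sequences are toy stand-ins.

```python
# Minimal sketch: per-tag and overall token-classification metrics via the
# `evaluate` library's seqeval wrapper (requires `pip install seqeval`).
import evaluate

seqeval = evaluate.load("seqeval")

# Toy stand-ins for the model's predictions and the gold tags of the
# twitter_pos_vcb validation split. seqeval may warn that these are not
# IOB-style tags, but it still computes scores.
predictions = [["vb", "nn", "prp", "nn"]]
references  = [["vb", "nn", "prp", "nns"]]

results = seqeval.compute(predictions=predictions, references=references)

# `results` holds one dict per tag (precision/recall/f1/number) plus
# overall_precision, overall_recall, overall_f1 and overall_accuracy.
print(results["overall_f1"], results["overall_accuracy"])
```

Note that seqeval treats the first character of each label as a chunking prefix, which appears to be why the tag names above show up with their leading character stripped (e.g. Bd, Np, Rp$).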

Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Token%20Classification/Monolingual/StrombergNLP-Twitter_pos_vcb/NER%20Project%20Using%20StrombergNLP%20Twitter_pos_vcb%20Dataset.ipynb

Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.
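
As a usage sketch (not taken from the linked notebook), the model can be loaded through the transformers token-classification pipeline; the model ID below is this repository.

```python
# Minimal inference sketch with the transformers pipeline API.
from transformers import pipeline

tagger = pipeline(
    "token-classification",
    model="DunnBC22/bert-base-cased-finetuned-Stromberg_NLP_Twitter-PoS",
)

# Each returned dict carries the token text, its predicted PoS label, and a score.
for token in tagger("just landed at the airport, so tired"):
    print(token["word"], token["entity"], round(token["score"], 3))
```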

Training and evaluation data

Dataset Source: https://huggingface.co/datasets/strombergnlp/twitter_pos_vcb
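
A minimal sketch (assuming the standard `datasets` API; column and split names should be checked against the dataset card) of pulling the same data:

```python
# Sketch: load the source dataset from the Hugging Face Hub.
# Depending on your `datasets` version, loading a script-based dataset may
# require passing trust_remote_code=True.
from datasets import load_dataset

dataset = load_dataset("strombergnlp/twitter_pos_vcb")

# Inspect the available splits and one example to see the token/tag columns.
print(dataset)
print(next(iter(dataset.values()))[0])
```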

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
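
As a hedged sketch, not the notebook's exact code (the output directory is a placeholder and the evaluation strategy is inferred from the per-epoch results below), this is how those values map onto transformers.TrainingArguments:

```python
# How the hyperparameters listed above map onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-finetuned-Stromberg_NLP_Twitter-PoS",  # placeholder
    learning_rate=2e-5,              # learning_rate: 2e-05
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=16,   # eval_batch_size: 16
    seed=42,                         # seed: 42
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    num_train_epochs=3,              # num_epochs: 3
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
    # optimizer configuration, matching the optimizer entry above.
    evaluation_strategy="epoch",     # assumed from the per-epoch validation results
)

# These arguments would then be passed to a transformers.Trainer together with
# the tokenized twitter_pos_vcb splits and a BertForTokenClassification model.
```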

Training results

The table below summarizes each epoch; per-tag validation metrics follow.

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.0617 | 1.0 | 7477 | 0.0595 | 0.9779 | 0.9772 | 0.9775 | 0.9821 |
| 0.0407 | 2.0 | 14954 | 0.0531 | 0.9817 | 0.9808 | 0.9813 | 0.9850 |
| 0.0246 | 3.0 | 22431 | 0.0533 | 0.9828 | 0.9820 | 0.9824 | 0.9860 |

Per-tag validation results, reported as Precision / Recall / F1; Number is the tag's support in the evaluation set and is the same for every epoch.

| Tag | Number | Epoch 1 (P / R / F1) | Epoch 2 (P / R / F1) | Epoch 3 (P / R / F1) |
|:---:|:---:|:---:|:---:|:---:|
| ''' | 312 | 0.9331 / 0.9391 / 0.9361 | 0.9605 / 0.9359 / 0.9481 | 0.9581 / 0.9519 / 0.9550 |
| B | 25496 | 0.9563 / 0.9536 / 0.9550 | 0.9599 / 0.9646 / 0.9622 | 0.9658 / 0.9655 / 0.9657 |
| Bd | 5548 | 0.9716 / 0.9322 / 0.9515 | 0.9674 / 0.9459 / 0.9565 | 0.9630 / 0.9573 / 0.9601 |
| Bg | 5663 | 0.9811 / 0.9786 / 0.9798 | 0.9834 / 0.9825 / 0.9830 | 0.9836 / 0.9853 / 0.9845 |
| Bn | 2106 | 0.8725 / 0.9231 / 0.8971 | 0.8920 / 0.9259 / 0.9087 | 0.9182 / 0.9117 / 0.9149 |
| Bp | 15839 | 0.9556 / 0.9586 / 0.9571 | 0.9728 / 0.9569 / 0.9648 | 0.9672 / 0.9663 / 0.9668 |
| Br | 107 | 0.8879 / 0.8879 / 0.8879 | 0.9592 / 0.8785 / 0.9171 | 0.94 / 0.8785 / 0.9082 |
| Bs | 67 | 0.8590 / 1.0 / 0.9241 | 0.9429 / 0.9851 / 0.9635 | 0.9848 / 0.9701 / 0.9774 |
| Bz | 5673 | 0.9793 / 0.9834 / 0.9814 | 0.9890 / 0.9825 / 0.9858 | 0.9866 / 0.9850 / 0.9858 |
| C | 4588 | 0.9985 / 0.9991 / 0.9988 | 0.9991 / 0.9993 / 0.9992 | 0.9993 / 0.9993 / 0.9993 |
| D | 6726 | 0.9818 / 0.9886 / 0.9852 | 0.9855 / 0.9896 / 0.9875 | 0.9877 / 0.9896 / 0.9886 |
| Dt | 15 | 1.0 / 0.8 / 0.8889 | 1.0 / 0.8 / 0.8889 | 1.0 / 0.8 / 0.8889 |
| H | 9010 | 0.9391 / 0.9105 / 0.9246 | 0.9498 / 0.9303 / 0.9399 | 0.9487 / 0.9305 / 0.9395 |
| J | 12467 | 0.9707 / 0.9766 / 0.9736 | 0.9776 / 0.9797 / 0.9786 | 0.9804 / 0.9806 / 0.9805 |
| Jr | 495 | 0.9212 / 0.9677 / 0.9438 | 0.9125 / 0.9899 / 0.9496 | 0.9400 / 0.9818 / 0.9605 |
| Js | 575 | 0.9227 / 0.9757 / 0.9484 | 0.9481 / 0.9843 / 0.9659 | 0.9612 / 0.9913 / 0.9760 |
| N | 38646 | 0.9754 / 0.9738 / 0.9746 | 0.9788 / 0.9771 / 0.9779 | 0.9796 / 0.9794 / 0.9795 |
| Np | 6291 | 0.9158 / 0.9200 / 0.9179 | 0.9252 / 0.9285 / 0.9268 | 0.9330 / 0.9278 / 0.9304 |
| Nps | 26 | 0.0 / 0.0 / 0.0 | 0.5 / 0.2308 / 0.3158 | 0.75 / 0.2308 / 0.3529 |
| Ns | 7820 | 0.9657 / 0.9688 / 0.9673 | 0.96534 / 0.9769 / 0.9711 | 0.9692 / 0.9774 / 0.9733 |
| O | 5736 | 0.9972 / 0.9990 / 0.9981 | 0.9976 / 0.9993 / 0.9984 | 0.9984 / 0.9993 / 0.9989 |
| Os | 419 | 1.0 / 0.9928 / 0.9964 | 0.9929 / 0.9952 / 0.9940 | 1.0 / 0.9952 / 0.9976 |
| P | 2934 | 0.9771 / 0.9908 / 0.9839 | 0.9861 / 0.9928 / 0.9895 | 0.9888 / 0.9918 / 0.9903 |
| Rb | 2489 | 0.9948 / 0.9968 / 0.9958 | 0.9972 / 0.9984 / 0.9978 | 0.9972 / 0.9984 / 0.9978 |
| Rl | 3608 | 1.0 / 0.9997 / 0.9999 | 1.0 / 0.9997 / 0.9999 | 1.0 / 0.9997 / 0.9999 |
| Rp | 29440 | 0.9970 / 0.9976 / 0.9973 | 0.9986 / 0.9982 / 0.9984 | 0.9980 / 0.9981 / 0.9981 |
| Rp$ | 5780 | 0.9974 / 0.9954 / 0.9964 | 0.9964 / 0.9978 / 0.9971 | 0.9976 / 0.9972 / 0.9974 |
| Sr | 18573 | 0.9998 / 0.9998 / 0.9998 | 0.9999 / 0.9999 / 0.9999 | 0.9999 / 0.9998 / 0.9999 |
| T | 28970 | 0.9977 / 0.9982 / 0.9979 | 0.9985 / 0.9983 / 0.9984 | 0.9988 / 0.9985 / 0.9986 |
| W | 1 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 |
| X | 74 | 0.8861 / 0.9459 / 0.9150 | 0.9114 / 0.9730 / 0.9412 | 0.9467 / 0.9595 / 0.9530 |
| Ym | 5 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 |
| ' ' | 15255 | 0.9936 / 0.9926 / 0.9931 | 0.9949 / 0.9961 / 0.9955 | 0.9951 / 0.9950 / 0.9951 |
| '`' | 173 | 0.9540 / 0.9595 / 0.9568 | 0.9651 / 0.9595 / 0.9623 | 0.9540 / 0.9595 / 0.9568 |

Framework versions

  • Transformers 4.28.1
  • Pytorch 2.0.0
  • Datasets 2.11.0
  • Tokenizers 0.13.3