---
tags:
  - spacy
  - token-classification
language:
  - en
license: mit
model-index:
  - name: en_core_web_sm
    results:
      - task:
          name: NER
          type: token-classification
        metrics:
          - name: NER Precision
            type: precision
            value: 0.8424355924
          - name: NER Recall
            type: recall
            value: 0.8335336538
          - name: NER F Score
            type: f_score
            value: 0.8379609817
      - task:
          name: POS
          type: token-classification
        metrics:
          - name: POS Accuracy
            type: accuracy
            value: 0.9720712187
      - task:
          name: SENTER
          type: token-classification
        metrics:
          - name: SENTER Precision
            type: precision
            value: 0.9074955788
          - name: SENTER Recall
            type: recall
            value: 0.8801372122
          - name: SENTER F Score
            type: f_score
            value: 0.893607046
      - task:
          name: UNLABELED_DEPENDENCIES
          type: token-classification
        metrics:
          - name: Unlabeled Dependencies Accuracy
            type: accuracy
            value: 0.9185392711
      - task:
          name: LABELED_DEPENDENCIES
          type: token-classification
        metrics:
          - name: Labeled Dependencies Accuracy
            type: accuracy
            value: 0.9185392711
---

Details: https://spacy.io/models/en#en_core_web_sm

English pipeline optimized for CPU. Components: tok2vec, tagger, parser, senter, ner, attribute_ruler, lemmatizer.
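
A minimal usage sketch, assuming the package is installed (e.g. via `python -m spacy download en_core_web_sm`); the example sentence is purely illustrative:

```python
import spacy

# Load the installed en_core_web_sm package as a Language object
nlp = spacy.load("en_core_web_sm")

doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

# Part-of-speech tags and dependency labels (tagger / parser components)
for token in doc:
    print(token.text, token.pos_, token.dep_)

# Named entities (ner component)
for ent in doc.ents:
    print(ent.text, ent.label_)
```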

| Feature | Description |
| --- | --- |
| Name | en_core_web_sm |
| Version | 3.1.0 |
| spaCy | >=3.1.0,<3.2.0 |
| Default Pipeline | tok2vec, tagger, parser, attribute_ruler, lemmatizer, ner |
| Components | tok2vec, tagger, parser, senter, attribute_ruler, lemmatizer, ner |
| Vectors | 0 keys, 0 unique vectors (0 dimensions) |
| Sources | OntoNotes 5 (Ralph Weischedel, Martha Palmer, Mitchell Marcus, Eduard Hovy, Sameer Pradhan, Lance Ramshaw, Nianwen Xue, Ann Taylor, Jeff Kaufman, Michelle Franchini, Mohammed El-Bachouti, Robert Belvin, Ann Houston)<br>ClearNLP Constituent-to-Dependency Conversion (Emory University)<br>WordNet 3.0 (Princeton University) |
| License | MIT |
| Author | Explosion |
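
Note that the default pipeline above does not include senter, even though it ships with the package. A short sketch of how the loaded components can be inspected and the disabled senter switched on; the exact contents of `nlp.disabled` are an assumption based on the table above:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

# Package metadata and pipeline components
print(nlp.meta["name"], nlp.meta["version"])
print(nlp.pipe_names)  # components enabled by default
print(nlp.disabled)    # assumed to contain 'senter', per the table above

# Enable the statistical sentence segmenter if you want it alongside the parser
nlp.enable_pipe("senter")
```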

### Label Scheme

114 labels for 4 components.

| Component | Labels |
| --- | --- |
| tagger | $, '', ,, -LRB-, -RRB-, ., :, ADD, AFX, CC, CD, DT, EX, FW, HYPH, IN, JJ, JJR, JJS, LS, MD, NFP, NN, NNP, NNPS, NNS, PDT, POS, PRP, PRP$, RB, RBR, RBS, RP, SYM, TO, UH, VB, VBD, VBG, VBN, VBP, VBZ, WDT, WP, WP$, WRB, XX, ```` |
| parser | ROOT, acl, acomp, advcl, advmod, agent, amod, appos, attr, aux, auxpass, case, cc, ccomp, compound, conj, csubj, csubjpass, dative, dep, det, dobj, expl, intj, mark, meta, neg, nmod, npadvmod, nsubj, nsubjpass, nummod, oprd, parataxis, pcomp, pobj, poss, preconj, predet, prep, prt, punct, quantmod, relcl, xcomp |
| senter | I, S |
| ner | CARDINAL, DATE, EVENT, FAC, GPE, LANGUAGE, LAW, LOC, MONEY, NORP, ORDINAL, ORG, PERCENT, PERSON, PRODUCT, QUANTITY, TIME, WORK_OF_ART |
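
The labels of each trained component can also be read off the loaded pipeline; a short sketch:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

# Labels predicted by each trained component, e.g. the 18 NER entity types
print(nlp.get_pipe("ner").labels)
print(nlp.get_pipe("tagger").labels)
print(nlp.get_pipe("parser").labels)

# Human-readable gloss for a label
print(spacy.explain("GPE"))  # "Countries, cities, states"
```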

### Accuracy

| Type | Score |
| --- | --- |
| TOKEN_ACC | 99.93 |
| TAG_ACC | 97.21 |
| DEP_UAS | 91.85 |
| DEP_LAS | 90.02 |
| ENTS_P | 84.24 |
| ENTS_R | 83.35 |
| ENTS_F | 83.80 |
| SENTS_P | 90.75 |
| SENTS_R | 88.01 |
| SENTS_F | 89.36 |
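
For reference, the F scores above are the harmonic mean of the corresponding precision and recall; a quick check against the NER and senter values from the metadata block:

```python
def f_score(precision, recall):
    """Harmonic mean of precision and recall (F1)."""
    return 2 * precision * recall / (precision + recall)

# NER: matches ENTS_F (83.80)
print(f_score(0.8424355924, 0.8335336538))  # ~0.8380
# Sentence segmentation: matches SENTS_F (89.36)
print(f_score(0.9074955788, 0.8801372122))  # ~0.8936
```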