Details: https://spacy.io/models/ja#ja_core_news_md
Japanese pipeline optimized for CPU. Components: tok2vec, morphologizer, parser, senter, ner, attribute_ruler.
| Feature | Description |
| --- | --- |
| Name | `ja_core_news_md` |
| Version | `3.7.0` |
| spaCy | `>=3.7.0,<3.8.0` |
| Default Pipeline | `tok2vec`, `morphologizer`, `parser`, `attribute_ruler`, `ner` |
| Components | `tok2vec`, `morphologizer`, `parser`, `senter`, `attribute_ruler`, `ner` |
| Vectors | 480,443 keys, 20,000 unique vectors (300 dimensions) |
| Sources | UD Japanese GSD v2.8 (Omura, Mai; Miyao, Yusuke; Kanayama, Hiroshi; Matsuda, Hiroshi; Wakasa, Aya; Yamashita, Kayo; Asahara, Masayuki; Tanaka, Takaaki; Murawaki, Yugo; Matsumoto, Yuji; Mori, Shinsuke; Uematsu, Sumire; McDonald, Ryan; Nivre, Joakim; Zeman, Daniel); UD Japanese GSD v2.8 NER (Megagon Labs Tokyo); chiVe: Japanese Word Embedding with Sudachi & NWJC (chive-1.1-mc90-500k) (Works Applications) |
| License | CC BY-SA 4.0 |
| Author | Explosion |
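If the package is installed (for example via `python -m spacy download ja_core_news_md`, which should also pull in the SudachiPy tokenizer dependencies declared by the package), the pipeline loads with `spacy.load`. A minimal sketch; the variable names are illustrative:

```python
import spacy

# Load the trained Japanese pipeline.
nlp = spacy.load("ja_core_news_md")

# Default pipeline: tok2vec, morphologizer, parser, attribute_ruler, ner
print(nlp.pipe_names)

# `senter` ships with the package but is disabled by default. If you only need
# sentence boundaries, you can drop the parser and enable it instead:
nlp_fast = spacy.load("ja_core_news_md", disable=["parser"])
nlp_fast.enable_pipe("senter")
```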
Label Scheme (65 labels for 3 components)
| Component | Labels |
| --- | --- |
| morphologizer | `POS=NOUN`, `POS=ADP`, `POS=VERB`, `POS=SCONJ`, `POS=AUX`, `POS=PUNCT`, `POS=PART`, `POS=DET`, `POS=NUM`, `POS=ADV`, `POS=PRON`, `POS=ADJ`, `POS=PROPN`, `POS=CCONJ`, `POS=SYM`, `POS=NOUN\|Polarity=Neg`, `POS=AUX\|Polarity=Neg`, `POS=SPACE`, `POS=INTJ`, `POS=SCONJ\|Polarity=Neg` |
| parser | `ROOT`, `acl`, `advcl`, `advmod`, `amod`, `aux`, `case`, `cc`, `ccomp`, `compound`, `cop`, `csubj`, `dep`, `det`, `dislocated`, `fixed`, `mark`, `nmod`, `nsubj`, `nummod`, `obj`, `obl`, `punct` |
| ner | `CARDINAL`, `DATE`, `EVENT`, `FAC`, `GPE`, `LANGUAGE`, `LAW`, `LOC`, `MONEY`, `MOVEMENT`, `NORP`, `ORDINAL`, `ORG`, `PERCENT`, `PERSON`, `PET_NAME`, `PHONE`, `PRODUCT`, `QUANTITY`, `TIME`, `TITLE_AFFIX`, `WORK_OF_ART` |
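A short usage sketch of the annotations these components produce; the example sentence below is illustrative and not part of the card:

```python
import spacy

nlp = spacy.load("ja_core_news_md")
doc = nlp("アマゾンは2024年に東京で新しいオフィスを開設した。")

# Coarse POS tags (morphologizer) and dependency labels (parser)
for token in doc:
    print(token.text, token.pos_, token.dep_, token.head.text)

# Named entities using the NER label scheme above (ORG, DATE, GPE, ...)
for ent in doc.ents:
    print(ent.text, ent.label_)
```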
Accuracy
| Type | Score |
| --- | --- |
| `TOKEN_ACC` | 99.37 |
| `TOKEN_P` | 97.64 |
| `TOKEN_R` | 97.88 |
| `TOKEN_F` | 97.76 |
| `POS_ACC` | 97.12 |
| `MORPH_ACC` | 0.00 |
| `MORPH_MICRO_P` | 34.01 |
| `MORPH_MICRO_R` | 98.04 |
| `MORPH_MICRO_F` | 50.51 |
| `SENTS_P` | 96.30 |
| `SENTS_R` | 97.63 |
| `SENTS_F` | 96.96 |
| `DEP_UAS` | 92.21 |
| `DEP_LAS` | 90.92 |
| `TAG_ACC` | 97.13 |
| `LEMMA_ACC` | 96.70 |
| `ENTS_P` | 72.67 |
| `ENTS_R` | 67.55 |
| `ENTS_F` | 70.01 |
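These scores are also stored in the installed package's metadata, so they can be read programmatically. A small sketch, assuming the scores sit under the `performance` key of `nlp.meta` as they do for current spaCy trained pipelines (values there are on a 0–1 scale):

```python
import spacy

nlp = spacy.load("ja_core_news_md")

# Trained pipelines expose their evaluation scores in meta.json.
performance = nlp.meta.get("performance", {})
for metric in ("ents_f", "dep_uas", "dep_las", "pos_acc"):
    if metric in performance:
        print(metric, round(performance[metric], 4))
```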