BERT fine-tuned for Named Entity Recognition in Danish

The model tags tokens in Danish sentences with named-entity tags in the BIO format, covering the classes PER, ORG, LOC, and MISC. The pretrained language model used for fine-tuning is the Danish BERT by BotXO.
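In the BIO scheme, B- marks the first token of an entity, I- marks a continuation of the same entity, and O marks tokens outside any entity. A minimal sketch of decoding a BIO tag sequence into entity spans (the sentence and tags below are illustrative, not actual model output):

```python
def bio_to_spans(tokens, tags):
    """Group a BIO tag sequence into (entity_type, text) spans."""
    spans = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always opens a new span, closing any open one
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the open span
            current_tokens.append(token)
        else:
            # "O", or an I- tag that does not continue the open span
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [], None
    if current_tokens:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

# Illustrative Danish sentence: "Jens Hansen arbejder i København"
tokens = ["Jens", "Hansen", "arbejder", "i", "København"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))  # [('PER', 'Jens Hansen'), ('LOC', 'København')]
```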

See the DaNLP documentation for more details.

Here is how to use the model:

from transformers import BertTokenizer, BertForTokenClassification

# Load the fine-tuned NER model and its matching tokenizer from the Hugging Face Hub
model = BertForTokenClassification.from_pretrained("alexandrainst/da-ner-base")
tokenizer = BertTokenizer.from_pretrained("alexandrainst/da-ner-base")
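Once loaded, the model can tag a sentence token by token. A sketch of a forward pass (the example sentence is illustrative; label names are read from the model's config rather than hard-coded):

```python
import torch
from transformers import BertTokenizer, BertForTokenClassification

model = BertForTokenClassification.from_pretrained("alexandrainst/da-ner-base")
tokenizer = BertTokenizer.from_pretrained("alexandrainst/da-ner-base")

sentence = "Jens Hansen arbejder i København."  # illustrative Danish sentence
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

# Map the highest-scoring label id of each token back to its tag name
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
predicted_labels = [model.config.id2label[i] for i in logits.argmax(dim=-1)[0].tolist()]
for token, label in zip(tokens, predicted_labels):
    print(token, label)
```

Note that the tokenizer may split words into subword pieces, so tags on continuation pieces are usually merged back onto the original word when post-processing.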

Training Data

The model has been trained on DaNE, a Danish dataset annotated for named entity recognition.
