
Model Description

mountain-ner-bert-base is a BERT base model fine-tuned for recognizing mountain names (named entity recognition). It is trained on a merge of three datasets: NERetrieve, Few-NERD, and Mountain-ner-dataset. The model distinguishes two token labels: LABEL_0 (other) and LABEL_1 (mountain name).

  • Model Architecture: BERT base
  • Task: mountain-name entity recognition
  • Training Data: merge of NERetrieve, Few-NERD, and Mountain-ner-dataset
  • Model size: ~108M parameters (float32, safetensors)
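
The label names above can be verified directly from the model configuration. This is a small sketch using the standard transformers API; the expected mapping ({0: "LABEL_0", 1: "LABEL_1"}) is an assumption based on the description above.

from transformers import AutoConfig

# Inspect the id-to-label mapping stored in the model config.
config = AutoConfig.from_pretrained("Gepe55o/mountain-ner-bert-base")
print(config.id2label)  # expected: {0: "LABEL_0", 1: "LABEL_1"}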

Performance

Metrics:

| Epoch | Training Loss | Validation Loss | Accuracy | Precision | Recall   | F1       |
|-------|---------------|-----------------|----------|-----------|----------|----------|
| 1     | 0.027400      | 0.030793        | 0.988144 | 0.815692  | 0.924621 | 0.866748 |
| 2     | 0.020600      | 0.024568        | 0.991119 | 0.872988  | 0.921036 | 0.896369 |
| 3     | 0.012900      | 0.024072        | 0.991923 | 0.889878  | 0.920171 | 0.904771 |

The best performance was achieved at epoch 3:

  • F1 Score: 0.9048
  • Accuracy: 0.9919
  • Precision: 0.8899
  • Recall: 0.9202
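
Precision, recall, and F1 follow the usual definitions (precision = TP / (TP + FP), recall = TP / (TP + FN), F1 = harmonic mean of the two). The sketch below illustrates a token-level computation for the binary LABEL_0 / LABEL_1 scheme; it is not the card's original evaluation script, and the exact aggregation used for the numbers above (token- vs. entity-level) is not specified here.

# Illustrative token-level precision/recall/F1 for the LABEL_0 / LABEL_1 scheme.
def prf1(gold, pred, positive="LABEL_1"):
    tp = sum(g == positive and p == positive for g, p in zip(gold, pred))
    fp = sum(g != positive and p == positive for g, p in zip(gold, pred))
    fn = sum(g == positive and p != positive for g, p in zip(gold, pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical gold and predicted tags for a four-token sentence.
gold = ["LABEL_0", "LABEL_1", "LABEL_1", "LABEL_0"]
pred = ["LABEL_0", "LABEL_1", "LABEL_0", "LABEL_0"]
print(prf1(gold, pred))  # (1.0, 0.5, 0.666...)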

How to use

from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Load the fine-tuned token-classification model and its tokenizer
model = AutoModelForTokenClassification.from_pretrained("Gepe55o/mountain-ner-bert-base")
tokenizer = AutoTokenizer.from_pretrained("Gepe55o/mountain-ner-bert-base")

text = "Mount Everest is the highest mountain in the world."

# Build an NER pipeline; aggregation_strategy groups sub-word tokens into entity spans
nlp = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
result = nlp(text)
print(result)
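
With aggregation_strategy="simple", the pipeline returns one dictionary per detected span (keys such as entity_group, word, and score, following the standard transformers NER pipeline output). A small sketch of keeping only the mountain-name spans:

# Keep only spans tagged as mountain names (LABEL_1) and print them with their scores.
mountains = [ent for ent in result if ent["entity_group"] == "LABEL_1"]
for ent in mountains:
    print(ent["word"], round(float(ent["score"]), 3))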