---
language: de
license: mit
tags:
  - flair
  - token-classification
  - sequence-tagger-model
  - hetzner
  - hetzner-gex44
  - hetzner-gpu
base_model: dbmdz/bert-base-german-cased
datasets:
  - stefan-it/co-funer
widget:
  - text: >-
      Wesentliche Tätigkeiten der Compliance-Funktion wurden an die
      Mercurtainment AG , Düsseldorf , ausgelagert .
---

# Fine-tuned Flair Model on CO-Fun NER Dataset

This Flair model was fine-tuned on the CO-Fun NER dataset, using the German DBMDZ BERT model as its backbone language model.
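As a quick sketch of how the model can be used for inference (assuming Flair is installed; the model id below is a placeholder, since the exact Hub name depends on the chosen configuration):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Placeholder id — replace with the actual Hub id of this model
tagger = SequenceTagger.load("stefan-it/<model-name>")

# Example sentence taken from the widget above
sentence = Sentence(
    "Wesentliche Tätigkeiten der Compliance-Funktion wurden an die "
    "Mercurtainment AG , Düsseldorf , ausgelagert ."
)

# Run NER prediction and print all tagged spans
tagger.predict(sentence)
for entity in sentence.get_spans("ner"):
    print(entity)
```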

## Dataset

The Company Outsourcing in Fund Prospectuses (CO-Fun) dataset consists of 948 sentences with 5,969 named entity annotations, including 2,340 Outsourced Services, 2,024 Companies, 1,594 Locations and 11 Software annotations.

Overall, the following named entities are annotated:

- Auslagerung (English: outsourcing)
- Unternehmen (English: company)
- Ort (English: location)
- Software

## Fine-Tuning

The latest Flair version is used for fine-tuning.

A hyper-parameter search over the following parameters is performed, with 5 different seeds per configuration:

- Batch Sizes: `[8, 16]`
- Learning Rates: `[5e-05, 3e-05]`
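The resulting search grid can be sketched as follows (a minimal illustration; the dictionary keys are illustrative, not the actual training configuration format):

```python
from itertools import product

batch_sizes = [8, 16]
learning_rates = [5e-05, 3e-05]
seeds = [1, 2, 3, 4, 5]

# Cartesian product over all hyper-parameters and seeds
configurations = [
    {"batch_size": bs, "learning_rate": lr, "seed": seed}
    for bs, lr, seed in product(batch_sizes, learning_rates, seeds)
]

# 2 batch sizes x 2 learning rates x 5 seeds = 20 fine-tuning runs
print(len(configurations))  # → 20
```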

More details can be found in this repository. All models are fine-tuned on a Hetzner GEX44 with an NVIDIA RTX 4000.

## Results

For each configuration, the micro F1-score on the development set is reported, averaged over the 5 seeds:

| Configuration      | Seed 1 | Seed 2 | Seed 3 | Seed 4 | Seed 5 | Average         |
|--------------------|--------|--------|--------|--------|--------|-----------------|
| `bs8-e10-lr5e-05`  | 0.9378 | 0.9280 | 0.9383 | 0.9374 | 0.9364 | 0.9356 ± 0.0043 |
| `bs8-e10-lr3e-05`  | 0.9336 | 0.9366 | 0.9299 | 0.9417 | 0.9281 | 0.9340 ± 0.0054 |
| `bs16-e10-lr5e-05` | 0.9270 | 0.9341 | 0.9372 | 0.9283 | 0.9329 | 0.9319 ± 0.0042 |
| `bs16-e10-lr3e-05` | 0.9141 | 0.9321 | 0.9175 | 0.9391 | 0.9177 | 0.9241 ± 0.0109 |
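The averages in the table are the mean over the 5 seeds, and the ± value is consistent with the sample standard deviation, which can be verified with a short snippet (using the best configuration's scores as an example):

```python
from statistics import mean, stdev

# Dev-set micro F1-scores for bs8-e10-lr5e-05, one per seed (from the table above)
scores = [0.9378, 0.9280, 0.9383, 0.9374, 0.9364]

avg = mean(scores)
sd = stdev(scores)  # sample standard deviation across the 5 seeds

print(f"{avg:.4f} ± {sd:.4f}")  # → 0.9356 ± 0.0043
```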

The result in bold reflects the performance of the currently viewed model.

Additionally, the Flair training log and TensorBoard logs are uploaded to the Model Hub.