---
language:
- en
license: apache-2.0
library_name: span-marker
tags:
- span-marker
- token-classification
- ner
- named-entity-recognition
datasets:
- DFKI-SLT/few-nerd
metrics:
- f1
- recall
- precision
pipeline_tag: token-classification
widget:
- text: Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris.
  example_title: Amelia Earhart
- text: Leonardo di ser Piero da Vinci painted the Mona Lisa based on Italian noblewoman Lisa del Giocondo.
  example_title: Leonardo da Vinci
base_model: prajjwal1/bert-tiny
model-index:
- name: SpanMarker w. prajjwal1/bert-tiny on coarse-grained, supervised FewNERD by Tom Aarsen
  results:
  - task:
      type: token-classification
      name: Named Entity Recognition
    dataset:
      name: coarse-grained, supervised FewNERD
      type: DFKI-SLT/few-nerd
      config: supervised
      split: test
      revision: 2e3e727c63604fbfa2ff4cc5055359c84fe5ef2c
    metrics:
    - type: f1
      value: 0.7081
      name: F1
    - type: precision
      value: 0.7378
      name: Precision
    - type: recall
      value: 0.6808
      name: Recall
---

# SpanMarker for Named Entity Recognition

This is a [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) model that can be used for Named Entity Recognition. In particular, this SpanMarker model uses [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) as the underlying encoder.

## Note

This model is primarily intended for running fast tests in the [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) GitHub repository.

## Usage

To use this model for inference, first install the `span_marker` library:

```bash
pip install span_marker
```

You can then run inference with this model like so:

```python
from span_marker import SpanMarkerModel

# Download from the 🤗 Hub
model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-bert-tiny-fewnerd-coarse-super")
# Run inference
entities = model.predict("Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris.")
```

See the [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) repository for documentation and additional information on this library.
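As a quick illustration of working with the predictions, the minimal sketch below prints each detected entity. It assumes the standard `span_marker` prediction format, where each entity is a dictionary containing at least `span`, `label`, and `score` keys; consult the repository documentation above if your installed version differs.

```python
from span_marker import SpanMarkerModel

# Download from the 🤗 Hub
model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-bert-tiny-fewnerd-coarse-super")

# Run inference; `predict` returns one dictionary per detected entity
entities = model.predict("Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris.")

# Print each recognized span with its coarse-grained label and confidence.
# The "span", "label", and "score" keys reflect the span_marker output format
# at the time of writing.
for entity in entities:
    print(f"{entity['span']!r} -> {entity['label']} ({entity['score']:.2f})")
```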