BERT Named Entity Recognition (NER) Model

This model performs Named Entity Recognition (NER) using a fine-tuned version of BERT (Bidirectional Encoder Representations from Transformers) based on the bert-base-cased model from Google. It identifies entities in a given text, such as people, organizations, and locations. The model has been fine-tuned on a custom NER dataset.


Model Overview

  • Model: BERT-based NER
  • Base model: bert-base-cased
  • Language: English (en)
  • Task: Token Classification (NER)

Dataset

The dataset used for training is available here.


Metrics

The model has been evaluated using the following metrics:

  • Evaluation Loss: 0.176
  • Precision: 82.5%
  • Recall: 77.1%
  • F1 Score: 79.7%
  • Accuracy: Reported during evaluation.
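
For reference, sequence-labeling metrics like these are typically computed with the seqeval library over IOB-tagged label sequences. The snippet below is only a minimal sketch of that kind of evaluation; the label lists are made-up examples, not the actual evaluation data for this model.

from seqeval.metrics import precision_score, recall_score, f1_score, accuracy_score

# Hypothetical gold and predicted IOB tag sequences (one list per sentence)
y_true = [["B-PER", "I-PER", "O", "B-ORG"], ["B-LOC", "O"]]
y_pred = [["B-PER", "I-PER", "O", "O"], ["B-LOC", "O"]]

print(f"Precision: {precision_score(y_true, y_pred):.3f}")
print(f"Recall:    {recall_score(y_true, y_pred):.3f}")
print(f"F1 Score:  {f1_score(y_true, y_pred):.3f}")
print(f"Accuracy:  {accuracy_score(y_true, y_pred):.3f}")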

Usage

You can use this model in two ways: via direct code execution or by running a Docker container.

Option 1: Direct Code Execution

To use the model from code, you first need to log in with the Hugging Face CLI using your access token, since the repository is gated.

Step 1: Install Hugging Face Transformers and CLI

If you haven’t already, install the necessary dependencies:

pip install transformers huggingface_hub
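
The Transformers pipeline also needs a deep-learning backend to run the model. If you don't already have one installed, PyTorch is the usual choice:

pip install torch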

Step 2: Login to Hugging Face

Use the following command to log in to your Hugging Face account:

huggingface-cli login

You'll be prompted to enter your Hugging Face access token, which you can create or copy from your account settings at https://huggingface.co/settings/tokens.
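
If you prefer to authenticate from Python instead of the CLI (for example, inside a notebook), the huggingface_hub library offers an equivalent login call; the token string below is a placeholder:

from huggingface_hub import login

# Paste your personal access token here (placeholder value)
login(token="hf_xxx")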

Step 3: Use the Model

Once logged in, you can use the following Python code to load and run the model for NER tasks:

from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Load the model and tokenizer
model_name = "sriramrokkam/bert_ner"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Create a pipeline for NER
nlp = pipeline("ner", model=model, tokenizer=tokenizer)

# Interactive example
input_text = input("Enter a sentence for NER: ")
ner_results = nlp(input_text)

# Display results
for entity in ner_results:
    print(f"Entity: {entity['word']}, Label: {entity['entity']}, Confidence: {entity['score']:.2f}")

Option 2: Docker

You can also run this model using Docker for a simple and interactive NER chatbot interface.

Prerequisites

Make sure you have Docker installed on your system.

Step 1: Pull the Docker Image

Pull the Docker image from Docker Hub:

docker pull sriramrokkam/ner-chatbot-web

View on Docker Hub

Step 2: Run the Docker Container

Run the container to start the web-based NER chatbot interface:

docker run -p 8080:8080 --name BERT_NER_Chatbot -it sriramrokkam/ner-chatbot-web
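
If you would rather leave the container running in the background instead of attaching to it, the standard detached-mode flags work as well (the container name is arbitrary):

docker run -d --rm -p 8080:8080 --name BERT_NER_Chatbot sriramrokkam/ner-chatbot-web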

Step 3: Access the Web Interface

Once the container is running, open your browser and go to:

http://localhost:8080

This opens the chatbot interface, where you can enter text and the model performs Named Entity Recognition (NER) on your input.
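
To quickly confirm that the container is serving before opening the browser, a plain HTTP request to the root URL should return the chatbot page:

curl http://localhost:8080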


License

This project is licensed under the MIT License.


Contact Information

