BERT Classification

Model Overview

  • Model Name: BERT Classification
  • Model Type: Text Classification
  • Developer: Mansoor Hamidzadeh
  • Framework: Transformers
  • Language: English
  • License: Apache-2.0

Model Description

This model is a fine-tuned BERT (Bidirectional Encoder Representations from Transformers) designed for text classification tasks. It categorizes text into four labels:

  • Label 1: Household
  • Label 2: Books
  • Label 3: Clothing & Accessories
  • Label 4: Electronics
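Depending on how the model was exported, predictions may come back as generic ids such as `LABEL_0` through `LABEL_3` rather than the category names above. The mapping below is an assumption (zero-indexed in list order); verify it against the `id2label` field in the model's `config.json` before relying on it. A minimal sketch for translating ids into readable names:

```python
# Hypothetical id-to-category mapping; verify against the model's
# config.json (id2label) -- the order here is assumed, not confirmed.
ID2LABEL = {
    "LABEL_0": "Household",
    "LABEL_1": "Books",
    "LABEL_2": "Clothing & Accessories",
    "LABEL_3": "Electronics",
}

def readable_label(prediction: dict) -> str:
    """Map a pipeline-style prediction dict to a category name.

    Falls back to the raw label string if the id is unknown.
    """
    return ID2LABEL.get(prediction["label"], prediction["label"])

print(readable_label({"label": "LABEL_3", "score": 0.97}))
```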

Technical Details

  • Model Size: 109M parameters
  • Tensor Type: F32
  • File Format: Safetensors

How To Use

# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mansoorhamidzadeh/bert_classification")

text = "Wireless noise-cancelling headphones with 30-hour battery life"
result = pipe(text)
print(result)
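Under the hood, the pipeline tokenizes the text, runs a forward pass, and applies a softmax over the four class logits to produce the reported score. A minimal sketch of that post-processing step, using dummy logits in place of a real forward pass:

```python
import math

# Dummy logits standing in for the model's raw output over the 4 classes;
# in practice these come from the model's forward pass.
logits = [1.2, -0.3, 0.1, 2.4]

# Numerically stable softmax: subtract the max before exponentiating.
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The predicted class is the index of the highest probability.
predicted_id = max(range(len(probs)), key=probs.__getitem__)
print(predicted_id, round(probs[predicted_id], 3))
```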

Usage

The model is suited to categorizing e-commerce product descriptions or similar short texts into the four predefined labels above.
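For instance, a batch of product descriptions can be routed into per-category buckets from the pipeline's output. The sketch below mocks the predictions (a list of `{"label", "score"}` dicts, the shape the pipeline returns); the label strings shown are assumptions for illustration:

```python
from collections import defaultdict

def route_by_category(texts, predictions):
    """Group texts by their predicted category label."""
    buckets = defaultdict(list)
    for text, pred in zip(texts, predictions):
        buckets[pred["label"]].append(text)
    return dict(buckets)

# Mocked predictions; in practice: predictions = pipe(texts)
texts = ["Cotton crew-neck t-shirt", "Stainless steel saucepan"]
predictions = [
    {"label": "Clothing & Accessories", "score": 0.98},
    {"label": "Household", "score": 0.95},
]
print(route_by_category(texts, predictions))
```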

Citation

If you use this model in your research or applications, please cite it as follows:

@misc{mansoorhamidzadeh_bert_classification,
  author = {mansoorhamidzadeh},
  title = {BERT Classification},
  year = {2024},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/mansoorhamidzadeh/bert_classification}},
}