
Model Card: DistilBERT-based Joke Detection (needed this because I'm German)

Model Details

  • Model Type: Fine-tuned DistilBERT base model (uncased)
  • Task: Binary classification for joke detection
  • Output: Joke or no-joke label
  • Size: ~67M parameters (F32, stored as Safetensors)
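
To confirm which label index corresponds to a joke, the model's config can be inspected without downloading the weights (a quick sketch; the Usage section below treats LABEL_1 as the joke class, which matches the default LABEL_0/LABEL_1 naming):

from transformers import AutoConfig

# Fetches only config.json, not the model weights.
config = AutoConfig.from_pretrained("VitalContribution/JokeDetectBERT")
print(config.num_labels)  # expected: 2
print(config.id2label)    # expected: {0: 'LABEL_0', 1: 'LABEL_1'}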

Training Data

Base Model

DistilBERT base model (uncased), a distilled version of BERT that is roughly 40% smaller and about 60% faster while retaining around 97% of BERT's language-understanding performance, per the DistilBERT paper.
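
For a sense of scale, the encoder can be loaded and its parameters counted directly (a small sketch; the bare encoder has about 66M parameters, and the classification head added for this task brings the fine-tuned model to the ~67M reported for this checkpoint):

from transformers import AutoModel

encoder = AutoModel.from_pretrained("distilbert-base-uncased")
n_params = sum(p.numel() for p in encoder.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # ~66M for the bare encoder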

Usage

from transformers import pipeline

model_id = "VitalContribution/JokeDetectBERT"
pipe = pipeline("text-classification", model=model_id)

text = "What do elves learn in school? The elf-abet."

# The pipeline returns one dict per input: {'label': ..., 'score': ...}.
out = pipe(text)[0]
label = out["label"]
confidence = out["score"]
result = "JOKE" if label == "LABEL_1" else "NO JOKE"  # LABEL_1 = joke
print(f"Prediction: {result} ({confidence:.2f})")

Training Details

  • Model: DistilBERT (base-uncased)
  • Task: Sequence classification
  • Number of classes: 2
  • Batch size: 32 (per device)
  • Learning rate: 2e-4
  • Weight decay: 0.01
  • Epochs: 2
  • Warmup steps: 100
  • Best model selection: checkpoint with the lowest eval_loss
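
For reference, these hyperparameters map onto a standard Hugging Face Trainer setup roughly as follows. This is a minimal sketch under stated assumptions: the card does not publish its training script or dataset, so the toy examples, tokenization settings, and output directory below are illustrative placeholders.

from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

# Toy data for illustration only; the card does not name its training set.
data = Dataset.from_dict({
    "text": [
        "What do elves learn in school? The elf-abet.",
        "The meeting has been moved to 3 pm on Thursday.",
    ],
    "label": [1, 0],  # 1 = joke, 0 = no joke
})
data = data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)
split = data.train_test_split(test_size=0.5)

args = TrainingArguments(
    output_dir="joke-detect-bert",      # placeholder path
    per_device_train_batch_size=32,
    learning_rate=2e-4,
    weight_decay=0.01,
    num_train_epochs=2,
    warmup_steps=100,
    eval_strategy="epoch",              # "evaluation_strategy" in older transformers releases
    save_strategy="epoch",              # must match eval_strategy for best-model selection
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",  # best model selected by lowest eval_loss
    greater_is_better=False,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=split["train"],
    eval_dataset=split["test"],
)
trainer.train()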

Model Evaluation

[Two evaluation plots: Model Evaluation Image 1 and Model Evaluation Image 2]