
Elon Musk Detector (EMD)

Elon Musk Detector (EMD) uses DistilBERT to classify whether a tweet (or post) was written by Elon Musk or not.

Model Details

This model uses the transformers library and a fine-tuned DistilBERT to classify text, and was trained on two datasets.

Model Description

  • Developed by: Kokohachi
  • Model type: BERT
  • Language(s) (NLP): English
  • License: BigScience OpenRAIL-M
  • Finetuned from model: DistilBERT

Uses

This model is intended for academic use only and should not be used for commercial purposes.

Direct Use

git clone https://huggingface.co/kix-intl/elon-musk-detector

from transformers import DistilBertTokenizer, DistilBertModel

# Directory of the cloned repository (or use the Hub id "kix-intl/elon-musk-detector")
save_directory = "elon-musk-detector"

# Load the tokenizer
loaded_tokenizer = DistilBertTokenizer.from_pretrained(save_directory)

# Load the model
loaded_model = DistilBertModel.from_pretrained(save_directory)
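
To actually get a prediction, the checkpoint can be loaded with a sequence-classification head. The sketch below is an assumed usage pattern, not taken from the repository: it loads the model from the Hub and assumes label index 1 means "written by Elon Musk" (check id2label in the model's config.json for the real mapping).

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load tokenizer and classifier directly from the Hub
tokenizer = AutoTokenizer.from_pretrained("kix-intl/elon-musk-detector")
model = AutoModelForSequenceClassification.from_pretrained("kix-intl/elon-musk-detector")

# Score a single tweet
inputs = tokenizer("We're going to Mars!", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)

# Assumption: index 1 corresponds to "written by Elon Musk"
print(f"P(Elon Musk) = {probs[0, 1].item():.3f}")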

Training Details

Training Data

  • Elon Musk Tweets: https://www.kaggle.com/datasets/gpreda/elon-musk-tweets
  • Twitter Tweets Sentiment Dataset: https://www.kaggle.com/datasets/yasserh/twitter-tweets-sentiment-dataset
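
As a rough sketch of how these two datasets could be combined into a binary training set (Musk tweets as the positive class, generic tweets as the negative class): the file names, the text column, and the balanced 50/50 sampling below are assumptions, not details taken from the training notebook.

import pandas as pd

# Positive class: tweets by Elon Musk (Kaggle: gpreda/elon-musk-tweets; file name assumed)
elon = pd.read_csv("elon_musk_tweets.csv")
elon_df = pd.DataFrame({"text": elon["text"].astype(str), "label": 1})

# Negative class: generic tweets (Kaggle: yasserh/twitter-tweets-sentiment-dataset; file name assumed)
other = pd.read_csv("Tweets.csv")
other_df = pd.DataFrame({"text": other["text"].astype(str), "label": 0})

# Balance the two classes and shuffle
n = min(len(elon_df), len(other_df))
data = pd.concat([elon_df.sample(n, random_state=42),
                  other_df.sample(n, random_state=42)]).sample(frac=1, random_state=42)
data.to_csv("emd_train.csv", index=False)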

Training Procedure

The full training procedure is documented in the notebook in this repository: https://huggingface.co/kix-intl/elon-musk-detector/blob/main/EMD.ipynb
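
For orientation, the outline below is a generic DistilBERT fine-tuning setup with the transformers Trainer, assuming the combined "emd_train.csv" from the Training Data sketch above; it is not a reproduction of the notebook, and the hyperparameters are placeholders.

from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Assumed input: a CSV with "text" and "label" columns
dataset = load_dataset("csv", data_files="emd_train.csv")["train"].train_test_split(test_size=0.1)
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="emd", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)

Trainer(model=model, args=args,
        train_dataset=dataset["train"], eval_dataset=dataset["test"]).train()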

Model Card Contact

koko8.dev@gmail.com
