This Falcon 7B model was fine-tuned on nuclear-energy data from Twitter/X for a text classification task. The classification accuracy obtained is 96%.
There are 3 labels: {0: Negative, 1: Neutral, 2: Positive}.
Warning: you need enough GPU memory to run Falcon (see the half-precision loading sketch at the end of this example).

The following example worked on an NVIDIA RTX 4060 with 8 GB of VRAM:

from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

checkpoint = 'kumo24/falcon-sentiment-nuclear'
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

id2label = {0: "negative", 1: "neutral", 2: "positive"}
label2id = {"negative": 0, "neutral": 1, "positive": 2}

# Falcon's tokenizer has no pad token by default; add one if it is missing.
if tokenizer.pad_token is None:
    tokenizer.add_special_tokens({'pad_token': '[PAD]'})

model = AutoModelForSequenceClassification.from_pretrained(checkpoint,
                                                           num_labels=3,
                                                           id2label=id2label,
                                                           label2id=label2id,
                                                           device_map='auto')

# Make sure the model's config knows which token id is used for padding.
model.config.pad_token_id = tokenizer.pad_token_id

sentiment_task = pipeline("sentiment-analysis", 
                          model=model, 
                          tokenizer=tokenizer)

print(sentiment_task("Michigan Wolverines are Champions, Go Blue!"))
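
The pipeline returns a list with one dictionary per input, containing the predicted label and its score. The score shown here is illustrative; the exact value will vary:

[{'label': 'positive', 'score': 0.98}]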
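If the full-precision model does not fit on your GPU, one option is to load it in half precision, which roughly halves memory usage at a possible small cost in accuracy. This is a minimal sketch, assuming the checkpoint, id2label, and label2id variables defined in the example above and a transformers version that accepts the torch_dtype argument:

import torch
from transformers import AutoModelForSequenceClassification

# Load the checkpoint in float16 instead of float32 to roughly halve
# memory usage; device_map='auto' lets accelerate place layers on the
# GPU and offload any that do not fit to CPU RAM.
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=3,
    id2label=id2label,
    label2id=label2id,
    torch_dtype=torch.float16,
    device_map='auto',
)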