
Model Description

This model was created for a thesis. It was trained on a dataset collected from the social media platform X using the hashtag #IsraelPalestineWar between October and November 2023. The model classifies the sentiment of tweets and is intended for English-language text.

Model Labels (a quick way to verify this mapping is sketched after the list):

  • Label 0: Negative
  • Label 1: Neutral
  • Label 2: Positive
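One quick way to verify this mapping is to load the checkpoint through the generic transformers text-classification pipeline. A minimal sketch (the example sentence is made up; if the checkpoint's config does not define custom label names, the pipeline prints generic ids such as LABEL_0/LABEL_1/LABEL_2, which correspond to Negative/Neutral/Positive above):

from transformers import pipeline

# Sketch: load the checkpoint through the high-level pipeline API
classifier = pipeline(
    "text-classification",
    model="RappyProgramming/IPW-RoBERTa-uncased",
)

# Made-up example sentence; prints a label id and a confidence score
print(classifier("ceasefire talks resume today"))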

How to use the model

  • Loading the model
from transformers import RobertaForSequenceClassification, RobertaTokenizer
import torch

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
output_model_dir = 'RappyProgramming/IPW-RoBERTa-uncased'
model = RobertaForSequenceClassification.from_pretrained(output_model_dir)
tokenizer = RobertaTokenizer.from_pretrained(output_model_dir)
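The examples below run on CPU. If a GPU is available, the model and the tokenized inputs can be moved to it first; a minimal optional sketch, assuming the model and tokenizer loaded above:

# Optional GPU inference sketch; falls back to CPU if CUDA is unavailable
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

enc = tokenizer(["example tweet"], return_tensors="pt", padding=True, truncation=True)
enc = {k: v.to(device) for k, v in enc.items()}  # inputs must be on the same device as the model
with torch.no_grad():
    logits = model(**enc).logits
print(torch.softmax(logits, dim=1))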
  • Sample Input
input_texts = [
    "this meeting is scheduled for next week",
    "drop dead",
    "you're the best friend i could ever have in this whole wide world!!"
]
  • Running inference and printing predictions
inputs = tokenizer(input_texts, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Convert logits to predicted class indices and per-class probabilities
predicted_class_indices = torch.argmax(outputs.logits, dim=1).tolist()
probs = torch.softmax(outputs.logits, dim=1).tolist()
labels = ["Negative", "Neutral", "Positive"]

for i, input_text in enumerate(input_texts):
    predicted_label = labels[predicted_class_indices[i]]
    predicted_probabilities = {label: prob for label, prob in zip(labels, probs[i])}
    
    print(f"Input text {i+1}: {input_text}")
    print(f"Predicted label: {predicted_label}")
    print("Predicted probabilities:")
    
    for label, prob in predicted_probabilities.items():
        print(f"{label}: {prob:.4f}")
        
    print()
Output:

Input text 1: this meeting is scheduled for next week
Predicted label: Neutral
Predicted probabilities:
Negative: 0.0002
Neutral: 0.9985
Positive: 0.0012

Input text 2: drop dead
Predicted label: Negative
Predicted probabilities:
Negative: 0.5771
Neutral: 0.4225
Positive: 0.0005

Input text 3: you're the best friend i could ever have in this whole wide world!!
Predicted label: Positive
Predicted probabilities:
Negative: 0.0003
Neutral: 0.0001
Positive: 0.9996
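For a larger collection of tweets, such as a CSV export, processing in batches keeps padding and memory bounded. A minimal sketch reusing the model, tokenizer, and labels from the examples above; the tweets.csv file and its "text" column are hypothetical placeholders:

import csv

def classify_batch(texts, batch_size=32):
    # Return one predicted label per text, processing batch_size texts at a time
    preds = []
    for start in range(0, len(texts), batch_size):
        batch = texts[start:start + batch_size]
        enc = tokenizer(batch, return_tensors="pt", padding=True, truncation=True)
        with torch.no_grad():
            logits = model(**enc).logits
        preds.extend(labels[i] for i in logits.argmax(dim=1).tolist())
    return preds

# Hypothetical input file: one tweet per row in a "text" column
with open("tweets.csv", newline="", encoding="utf-8") as f:
    tweets = [row["text"] for row in csv.DictReader(f)]
print(classify_batch(tweets[:100]))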