---
license: mit
base_model: cardiffnlp/twitter-roberta-base-sentiment-latest
language:
  - en
library_name: transformers
tags:
  - Roberta
  - Sentiment Analysis
widget:
  - text: This product is great!
---

# 🌟 Fine-tuned RoBERTa for Sentiment Analysis on Reviews 🌟

This model is a fine-tuned version of `cardiffnlp/twitter-roberta-base-sentiment-latest`, trained on the Amazon Reviews dataset for sentiment analysis.

## 📜 Model Details

- 🆕 **Model Name:** `AnkitAI/reviews-roberta-base-sentiment-analysis`
- 🔗 **Base Model:** `cardiffnlp/twitter-roberta-base-sentiment-latest`
- 📊 **Dataset:** Amazon Reviews
- ⚙️ **Fine-tuning:** Fine-tuned with a sequence classification head for binary sentiment classification (positive vs. negative).

πŸ‹οΈ Training

The model was trained with the following hyperparameters (a sketch of an equivalent `TrainingArguments` configuration follows the list):

- 🔧 **Learning Rate:** 2e-5
- 📦 **Batch Size:** 16
- ⏳ **Epochs:** 3
- ⚖️ **Weight Decay:** 0.01
- 📅 **Evaluation Strategy:** epoch
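
The original training script is not published here, so the following is only a minimal sketch of a `TrainingArguments` configuration matching the listed hyperparameters; `output_dir` and the per-device batch-size split are assumptions:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction from the hyperparameters above,
# not the exact configuration used to train this model.
training_args = TrainingArguments(
    output_dir="./results",           # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,   # assumes batch size 16 per device
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
    evaluation_strategy="epoch",      # evaluate at the end of each epoch
)
```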

πŸ‹οΈ Training Details

- 📉 **Eval Loss:** 0.1049
- ⏱️ **Eval Runtime:** 3177.538 seconds
- 📈 **Eval Samples/Second:** 226.591
- 🌀 **Eval Steps/Second:** 7.081
- 🔄 **Epoch:** 3.0
- 🏃 **Train Runtime:** 110070.6349 seconds
- 📊 **Train Samples/Second:** 78.495
- 🌀 **Train Steps/Second:** 2.453
- 📉 **Train Loss:** 0.0858

## 🚀 Usage

You can use this model directly with the Hugging Face `transformers` library:

```python
from transformers import RobertaForSequenceClassification, RobertaTokenizer
import torch

model_name = "AnkitAI/reviews-roberta-base-sentiment-analysis"
model = RobertaForSequenceClassification.from_pretrained(model_name)
tokenizer = RobertaTokenizer.from_pretrained(model_name)

# Example usage
inputs = tokenizer("This product is great!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Get sentiment: the highest logit is the predicted class
# (LABEL_1 = positive, LABEL_0 = negative)
logits = outputs.logits
predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```

**Note:** `LABEL_1` indicates positive sentiment and `LABEL_0` indicates negative sentiment.
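
For quick experiments, the model should also work through the `pipeline` API (a minimal sketch; `"sentiment-analysis"` is the standard alias for the text-classification pipeline):

```python
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="AnkitAI/reviews-roberta-base-sentiment-analysis",
)

# Returns a list of dicts of the form [{'label': ..., 'score': ...}],
# where the label is LABEL_1 (positive) or LABEL_0 (negative)
print(classifier("This product is great!"))
```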

## 📜 License

This model is licensed under the MIT License.