DT12the committed "Create README.md" (commit 8b4a863, parent 452ed47)

---
language: "en"
license: "mit"
tags:
- distilbert
- sentiment-analysis
- multilingual
widget:
- text: "I love this movie!"
---

# DistilBERT for Sentiment Analysis

## Model Description

### Overview

This model is a fine-tuned version of `distilbert-base-uncased` on a social media dataset for sentiment analysis. It classifies text as positive, negative, or neutral.
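
A minimal usage sketch with the `transformers` `pipeline` API is shown below. Note that the repo id is a placeholder, since this card does not state the checkpoint's published name:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub id of this checkpoint.
MODEL_ID = "DT12the/distilbert-sentiment"

def classify(texts):
    """Return a list of {'label': ..., 'score': ...} dicts, one per input text."""
    clf = pipeline("text-classification", model=MODEL_ID)
    return clf(texts)

# Example call (downloads the model on first use):
# classify(["I love this movie!"])
```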

### Intended Use

This model is intended for sentiment analysis tasks, particularly analyzing social media text. It supports multiple languages, making it suitable for international applications.

### Model Architecture

This model is based on the `DistilBertForSequenceClassification` architecture, a distilled version of BERT that maintains comparable performance on downstream tasks while being more computationally efficient.
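
The three-class classification head can be sketched by instantiating the architecture from a config, as below. The label names and their order are assumptions, since the card does not fix them (a randomly initialized model is used here so the sketch runs without downloading weights):

```python
from transformers import DistilBertConfig, DistilBertForSequenceClassification

# Three sentiment classes on top of the DistilBERT encoder.
# The id-to-label mapping is an assumption, not confirmed by the card.
config = DistilBertConfig(
    num_labels=3,
    id2label={0: "negative", 1: "neutral", 2: "positive"},
    label2id={"negative": 0, "neutral": 1, "positive": 2},
)
model = DistilBertForSequenceClassification(config)  # randomly initialized
```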

## Training

### Training Data

The model was trained on a dataset of social media posts labeled for sentiment (positive, negative, neutral). The dataset spans multiple languages, which underpins the model's multilingual capabilities.

### Training Procedure

The model was trained using the following parameters:

- Optimizer: AdamW
- Learning Rate: 5e-5
- Batch Size: 32
- Epochs: 30

Training was conducted on Kaggle, utilizing two GPUs for accelerated training.
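
The optimization setup above can be sketched with a minimal PyTorch loop. A tiny linear classifier stands in for the real DistilBERT model so the sketch runs without downloads; the hyperparameters (AdamW, lr 5e-5, batch size 32) come from the card, while the dummy data and the two-epoch cap are illustrative only:

```python
import torch
from torch.optim import AdamW

# Stand-in for DistilBertForSequenceClassification: 768-dim features -> 3 classes.
model = torch.nn.Linear(768, 3)
optimizer = AdamW(model.parameters(), lr=5e-5)  # optimizer and lr from the card
loss_fn = torch.nn.CrossEntropyLoss()

batch = torch.randn(32, 768)           # batch size 32, as in the card
labels = torch.randint(0, 3, (32,))    # dummy sentiment labels

for epoch in range(2):                 # the card trains for 30 epochs
    optimizer.zero_grad()
    loss = loss_fn(model(batch), labels)
    loss.backward()
    optimizer.step()
```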