shalinialisha committed · Commit e9ea193 · Parent(s): 96c9987 · Update README.md

README.md
language:
- en
pipeline_tag: text-classification
---

# Sentiment Analysis with BERT
I created this BERT-based sentiment analysis model as a student in the Vanderbilt Data Science Institute's AI Summer course in 2023. It serves as an introductory example of fine-tuning a pretrained model for a downstream task.
## What I Learned
- Leveraging transfer learning instead of training a model from scratch
- Fine-tuning a pretrained model on a downstream dataset
- Implementing optimizations like learning rate scheduling
- Evaluating models using relevant metrics like accuracy
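One of the optimizations above, the linear warmup-then-decay learning rate schedule commonly paired with BERT fine-tuning (e.g. via `get_linear_schedule_with_warmup` in Hugging Face `transformers`), can be sketched in plain Python. The function and parameter names here are illustrative, not taken from this repo's training code:

```python
def linear_schedule_with_warmup(step: int, warmup_steps: int, total_steps: int) -> float:
    """Multiplier applied to the base learning rate at a given optimizer step:
    ramps linearly from 0 to 1 over the warmup phase, then decays linearly
    back to 0 over the remaining steps."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))


# Example: 100 warmup steps out of 1000 total training steps.
factors = [linear_schedule_with_warmup(s, 100, 1000) for s in (0, 50, 100, 550, 1000)]
print(factors)  # [0.0, 0.5, 1.0, 0.5, 0.0]
```

The warmup phase keeps early updates small while the optimizer's statistics stabilize, which matters when a pretrained model's weights should not be disturbed too aggressively.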
## About the Project
During the program, I explored various techniques for adapting powerful large-scale models like BERT to specialized applications. As a hands-on exercise, I fine-tuned BERT using the tweet_eval dataset to classify text snippets as either positive or negative in sentiment.
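As a sketch of the preprocessing such an exercise involves: tweet_eval's `sentiment` configuration is three-way (negative/neutral/positive), so a binary positive-vs-negative classifier needs neutral examples dropped and labels remapped. The helper below is illustrative, assuming the usual tweet_eval label ids (0 = negative, 1 = neutral, 2 = positive); it is not this repo's actual training code:

```python
def to_binary_sentiment(texts, labels, neutral_id=1, positive_id=2):
    """Drop neutral examples and remap labels to 0 (negative) / 1 (positive).

    Assumes tweet_eval-style sentiment ids: 0=negative, 1=neutral, 2=positive.
    """
    pairs = [(text, 1 if label == positive_id else 0)
             for text, label in zip(texts, labels)
             if label != neutral_id]
    return [t for t, _ in pairs], [l for _, l in pairs]


texts, labels = to_binary_sentiment(
    ["great movie", "it is a film", "awful plot"],
    [2, 1, 0],
)
print(texts, labels)  # ['great movie', 'awful plot'] [1, 0]
```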
This model is the result of that exercise, providing my basic implementation of sentiment classification using BERT fine-tuning. While not as performant as state-of-the-art sentiment models, it demonstrates the workflow and techniques I learned around tailoring BERT and similar models.
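Evaluation in this workflow used accuracy: take the argmax over each example's class scores and measure the match rate against the labels. A minimal sketch (the names here are illustrative, not the repo's evaluation code):

```python
def accuracy(logits, labels):
    """Fraction of examples whose argmax over class scores matches the label."""
    preds = [max(range(len(row)), key=row.__getitem__) for row in logits]
    correct = sum(p == y for p, y in zip(preds, labels))
    return correct / len(labels)


# Four examples with two class scores each (negative, positive).
scores = [[0.2, 0.8], [0.9, 0.1], [0.4, 0.6], [0.7, 0.3]]
print(accuracy(scores, [1, 0, 0, 0]))  # 0.75
```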
The training code is provided to allow replication and customization for other datasets. I hope this model provides a useful case study for anyone beginning their journey into fine-tuning and transfer learning with transformer models, as I was!