arnabdhar committed
Commit d616e23 (1 parent: a08502b)

Update README.md

Files changed (1): README.md (+14 -13)
README.md CHANGED
@@ -12,6 +12,17 @@ metrics:
model-index:
- name: distilbert-base-amazon-multi
  results: []
+ datasets:
+ - mteb/amazon_reviews_multi
+ language:
+ - en
+ - de
+ - es
+ - fr
+ - ja
+ - zh
+ library_name: transformers
+ pipeline_tag: text-classification
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -25,20 +36,10 @@ It achieves the following results on the evaluation set:
- Accuracy: 0.6055
- Matthews Correlation: 0.5072

- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
## Training procedure

+ This model was fine-tuned on Google Colab using a single **NVIDIA V100** GPU with 16 GB of VRAM; fine-tuning for 10,000 steps took around 13 hours.
+
### Training hyperparameters

The following hyperparameters were used during training:
@@ -72,4 +73,4 @@ The following hyperparameters were used during training:
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- - Tokenizers 0.15.0
+ - Tokenizers 0.15.0
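
As a quick illustration of what the newly added `library_name: transformers` and `pipeline_tag: text-classification` metadata imply, here is a minimal usage sketch. The Hub repo id `arnabdhar/distilbert-base-amazon-multi` is an assumption based on the committer and model name shown on this page; the diff itself does not state the full id.

```python
# Minimal sketch: the new `pipeline_tag: text-classification` plus
# `library_name: transformers` metadata means the model is meant to be
# loaded through the text-classification pipeline.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="arnabdhar/distilbert-base-amazon-multi",  # assumed Hub repo id
)

# The added `language` tags cover en, de, es, fr, ja and zh, so review text
# in any of those languages can be passed in directly.
print(classifier("This product exceeded my expectations!"))
print(classifier("Das Produkt kam beschädigt an."))
```

The labels returned depend on the `id2label` mapping saved with the fine-tuned classification head (for mteb/amazon_reviews_multi this is typically a star-rating class), which this diff does not show.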