---
license: cc-by-nc-4.0
---
# ITT-AF/ITT-AF-PLM-1.4B_v0.2

This model was pretrained on a custom dataset (110G).

## Model description

More information needed

## Intended uses & limitations

More information needed

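No usage details are provided yet. As a hedged sketch (assuming the checkpoint is published on the Hugging Face Hub under the repository id in the title and is compatible with the causal-LM auto classes), inference with `transformers` would look roughly like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the checkpoint is hosted on the Hub under this repo id
# and loads through the causal-LM auto classes.
MODEL_ID = "ITT-AF/ITT-AF-PLM-1.4B_v0.2"

def generate(prompt: str, max_new_tokens: int = 32) -> str:
    """Download the checkpoint and continue `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

This is a sketch, not a documented interface for this model; check the repository files for the actual tokenizer and architecture before relying on it.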
## Training and evaluation data

The data used to train the model was collected from various sources, mostly from the Web. As such, it contains offensive, harmful, and biased content, and we expect the model to exhibit such biases from the training data.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
- mixed_precision_training: Native AMP

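Note that `total_train_batch_size` is not an independent setting: it is the per-device `train_batch_size` multiplied by `gradient_accumulation_steps`. A minimal check of the values above:

```python
# Effective batch size implied by the hyperparameters listed above:
# per-device batch size 24 with 4 gradient accumulation steps.
train_batch_size = 24
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 96
```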
### Training results

More information needed

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.0.0
- Tokenizers 0.15.0