Lagadro committed
Commit: 95afead
Parent(s): 93b28ca

Update README.md

Files changed (1):
  1. README.md +24 -8
README.md CHANGED
@@ -18,20 +18,15 @@ It achieves the following results on the evaluation set:
  - Train Loss: 0.0812
  - Validation Loss: 0.0961
  - Epoch: 4
+ - Overall Accuracy: 0.968

  ## Model description

- More information needed
-
- ## Intended uses & limitations
-
- More information needed
+ This model is trained to extract entity names from reports of mammographic images.

  ## Training and evaluation data

- More information needed
-
- ## Training procedure
+ The data has been provided by Teknofest.

  ### Training hyperparameters

@@ -39,6 +34,27 @@ The following hyperparameters were used during training:
  - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 260, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
  - training_precision: float32

+ - Name: AdamWeightDecay
+ - Learning Rate:
+   - Module: keras.optimizers.schedules
+   - Class: PolynomialDecay
+   - Config:
+     - Initial Learning Rate: 2e-05
+     - Decay Steps: 260
+     - End Learning Rate: 0.0
+     - Power: 1.0
+     - Cycle: False
+     - Name: None
+   - Registered Name: None
+ - Decay: 0.0
+ - Beta 1: 0.9
+ - Beta 2: 0.999
+ - Epsilon: 1e-08
+ - Amsgrad: False
+ - Weight Decay Rate: 0.01
+ - Training Precision: float32
+
+
  ### Training results

  | Train Loss | Validation Loss | Epoch |
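
The optimizer block listed in the hyperparameters above (AdamWeightDecay wrapping a linear PolynomialDecay from 2e-05 down to 0.0 over 260 steps, weight decay rate 0.01) matches what the Transformers TF helper `create_optimizer` produces; the remaining values (beta_1 0.9, beta_2 0.999, epsilon 1e-08, power 1.0) are that helper's defaults. The sketch below rebuilds an equivalent optimizer under that assumption; it is not necessarily how the original training script was written.

```python
# Sketch only: rebuild an optimizer equivalent to the configuration in the card.
# Assumes the Transformers TF helper; the original script may instead have
# constructed AdamWeightDecay and keras PolynomialDecay directly.
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=2e-05,           # initial_learning_rate of the PolynomialDecay schedule
    num_train_steps=260,     # decay_steps; the learning rate reaches 0.0 here
    num_warmup_steps=0,      # no warmup wrapper appears in the card's config
    weight_decay_rate=0.01,  # weight_decay_rate of AdamWeightDecay
)

# Typical Keras usage with a TF Transformers model (illustrative only):
# model.compile(optimizer=optimizer)
```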
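
The model description added in this commit says the model extracts entity names from mammography reports, i.e. a token-classification (NER) task. Below is a minimal, hypothetical usage sketch with the Transformers pipeline; the repository id, the `framework="tf"` choice (inferred from the Keras optimizer config in the card), and the example input are assumptions, not taken from the card.

```python
# Hypothetical usage sketch for the entity-extraction model described above.
# The model id is a placeholder and the label set depends on the Teknofest data.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Lagadro/model-id-placeholder",  # placeholder; use this repository's actual id
    framework="tf",                        # assumed: the card lists a Keras/TF optimizer config
    aggregation_strategy="simple",         # merge sub-word tokens into whole entity spans
)

report_text = "Example mammography report text goes here."  # placeholder input
for entity in ner(report_text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```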