jjpetrisko/authentiface_v1.0f

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: AdamWeightDecay with a keras.optimizers.schedules.PolynomialDecay learning-rate schedule (initial_learning_rate: 3e-05, decay_steps: 13356700, end_learning_rate: 0.0, power: 1.0, cycle: False), decay: 0.0, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False, weight_decay_rate: 0.01 (see the sketch after this list)
  • training_precision: float32
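
The sketch below reconstructs this optimizer from the recorded configuration. It assumes the AdamWeightDecay class shipped with transformers and the standard Keras PolynomialDecay schedule, which the recorded class names suggest but the card does not explicitly confirm.

```python
import tensorflow as tf
from transformers import AdamWeightDecay

# Linear decay (power=1.0) from 3e-05 to 0.0 over 13,356,700 steps,
# matching the PolynomialDecay config recorded above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=3e-05,
    decay_steps=13_356_700,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# AdamWeightDecay with the recorded Adam settings and a decoupled
# weight decay rate of 0.01.
optimizer = AdamWeightDecay(
    learning_rate=lr_schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```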

Training results

| Epoch | Train Loss | Validation Loss | Validation Accuracy |
|:-----:|:----------:|:---------------:|:-------------------:|
| 1  | 0.3307 | 0.2049 | 0.9166 |
| 2  | 0.1693 | 0.1199 | 0.9548 |
| 3  | 0.1128 | 0.0903 | 0.9658 |
| 4  | 0.0825 | 0.0644 | 0.9768 |
| 5  | 0.0641 | 0.0539 | 0.9805 |
| 6  | 0.0520 | 0.0560 | 0.9797 |
| 7  | 0.0406 | 0.0401 | 0.9855 |
| 8  | 0.0345 | 0.0343 | 0.9876 |
| 9  | 0.0304 | 0.0302 | 0.9894 |
| 10 | 0.0248 | 0.0300 | 0.9897 |
| 11 | 0.0222 | 0.0290 | 0.9894 |
| 12 | 0.0202 | 0.0329 | 0.9882 |
| 13 | 0.0175 | 0.0276 | 0.9909 |
| 14 | 0.0162 | 0.0263 | 0.9921 |
| 15 | 0.0154 | 0.0273 | 0.9913 |
| 16 | 0.0139 | 0.0236 | 0.9917 |
| 17 | 0.0125 | 0.0500 | 0.9842 |
| 18 | 0.0109 | 0.0236 | 0.9919 |
| 19 | 0.0096 | 0.0332 | 0.9894 |
| 20 | 0.0097 | 0.0272 | 0.9923 |
| 21 | 0.0092 | 0.0240 | 0.9931 |
| 22 | 0.0086 | 0.0254 | 0.9923 |
| 23 | 0.0079 | 0.0287 | 0.9916 |
| 24 | 0.0076 | 0.0338 | 0.9900 |
| 25 | 0.0070 | 0.0254 | 0.9926 |
| 26 | 0.0066 | 0.0266 | 0.9918 |
| 27 | 0.0065 | 0.0284 | 0.9925 |
| 28 | 0.0063 | 0.0268 | 0.9927 |

Framework versions

  • Transformers 4.38.2
  • TensorFlow 2.15.0
  • Datasets 2.18.0
  • Tokenizers 0.15.2
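
A minimal sketch for checking that a local environment matches the versions listed above; the version attributes used are the standard ones exposed by each package.

```python
# Hypothetical environment check: compare installed packages against the
# versions recorded in this card.
import transformers, tensorflow, datasets, tokenizers

expected = {
    "transformers": ("4.38.2", transformers.__version__),
    "tensorflow": ("2.15.0", tensorflow.__version__),
    "datasets": ("2.18.0", datasets.__version__),
    "tokenizers": ("0.15.2", tokenizers.__version__),
}

for name, (wanted, installed) in expected.items():
    status = "OK" if installed == wanted else f"mismatch (installed {installed})"
    print(f"{name} {wanted}: {status}")
```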