---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: layoutlm-SROIE-tf
  results: []
---

# layoutlm-SROIE-tf

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the SROIE dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0151
- Validation Loss: 0.0289
- Train Overall Precision: 0.9110
- Train Overall Recall: 0.9438
- Train Overall F1: 0.9271
- Train Overall Accuracy: 0.9942
- Epoch: 7

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16

### Training results

| Train Loss | Validation Loss | Train Overall Precision | Train Overall Recall | Train Overall F1 | Train Overall Accuracy | Epoch |
|:----------:|:---------------:|:-----------------------:|:--------------------:|:----------------:|:----------------------:|:-----:|
| 0.4238     | 0.1257          | 0.7483                  | 0.6535               | 0.6977           | 0.9684                 | 0     |
| 0.0943     | 0.0549          | 0.9101                  | 0.7075               | 0.7961           | 0.9863                 | 1     |
| 0.0500     | 0.0403          | 0.8310                  | 0.8573               | 0.8440           | 0.9891                 | 2     |
| 0.0344     | 0.0344          | 0.8468                  | 0.9042               | 0.8746           | 0.9916                 | 3     |
| 0.0282     | 0.0346          | 0.8417                  | 0.9344               | 0.8856           | 0.9918                 | 4     |
| 0.0222     | 0.0296          | 0.9155                  | 0.9207               | 0.9181           | 0.9938                 | 5     |
| 0.0168     | 0.0328          | 0.8756                  | 0.9431               | 0.9081           | 0.9931                 | 6     |
| 0.0151     | 0.0289          | 0.9110                  | 0.9438               | 0.9271           | 0.9942                 | 7     |

### Framework versions

- Transformers 4.48.3
- TensorFlow 2.18.0
- Datasets 3.3.2
- Tokenizers 0.21.0
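
The optimizer dictionary listed under the training hyperparameters matches the `AdamWeightDecay` optimizer that Transformers ships for TensorFlow. The snippet below is a minimal sketch of recreating that setup; the learning rate, Adam parameters, weight decay rate, and mixed-precision policy are taken from the values above, while the model being compiled is assumed to exist elsewhere.

```python
import tensorflow as tf
from transformers import AdamWeightDecay

# Match training_precision: mixed_float16
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Match the optimizer dictionary from "Training hyperparameters"
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)

# model.compile(optimizer=optimizer)  # assumes a TF LayoutLM model built elsewhere
```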
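
For inference, the checkpoint can be loaded with the TensorFlow token-classification head for LayoutLM. This is only a sketch: the checkpoint identifier, the example words, and their 0-1000-normalized bounding boxes are placeholders, each word's box is simply repeated for all of its word pieces, and the label names assume the checkpoint stores its `id2label` mapping.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFLayoutLMForTokenClassification

model_id = "layoutlm-SROIE-tf"  # placeholder: local path or Hub id of this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFLayoutLMForTokenClassification.from_pretrained(model_id)

# Example OCR output: words plus boxes normalized to a 0-1000 scale (made-up values)
words = ["Total", "amount:", "$12.50"]
word_boxes = [[110, 820, 190, 845], [200, 820, 300, 845], [310, 820, 400, 845]]

# Tokenize word by word and repeat each word's box for every word piece it produces
tokens, boxes = [], []
for word, box in zip(words, word_boxes):
    pieces = tokenizer.tokenize(word)
    tokens.extend(pieces)
    boxes.extend([box] * len(pieces))

# Add special tokens with the conventional boxes for [CLS] and [SEP]
all_tokens = [tokenizer.cls_token] + tokens + [tokenizer.sep_token]
input_ids = tokenizer.convert_tokens_to_ids(all_tokens)
boxes = [[0, 0, 0, 0]] + boxes + [[1000, 1000, 1000, 1000]]

outputs = model(
    input_ids=tf.constant([input_ids]),
    bbox=tf.constant([boxes]),
    attention_mask=tf.constant([[1] * len(input_ids)]),
)
predictions = tf.argmax(outputs.logits, axis=-1)
labels = [model.config.id2label[int(p)] for p in predictions[0]]
print(list(zip(all_tokens, labels)))
```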