---
license: mit
base_model: facebook/bart-large-cnn
tags:
- generated_from_trainer
datasets:
- fedora-copr/pep-sum
metrics:
- rouge
model-index:
- name: pep_summarization
  results:
  - task:
      name: Summarization
      type: summarization
    dataset:
      name: fedora-copr/pep-sum
      type: fedora-copr/pep-sum
    metrics:
    - name: Rouge1
      type: rouge
      value: 75.3806
---

# pep_summarization

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on the fedora-copr/pep-sum dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1242
- Rouge1: 75.3806
- Rouge2: 74.6735
- Rougel: 75.5866
- Rougelsum: 75.5446
- Gen Len: 85.3188 (average length of the generated summaries, in tokens)

## Model description

This is [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn), a BART sequence-to-sequence model pre-trained for abstractive news summarization, fine-tuned for five epochs on fedora-copr/pep-sum. Judging by the dataset name, the target domain is summarization of Python Enhancement Proposal (PEP) documents.

## Intended uses & limitations

The model is intended for abstractive summarization of long technical documents of the kind found in fedora-copr/pep-sum; a usage sketch appears under "How to use" at the end of this card. Like the base model, it can attend to at most 1,024 input tokens, so longer documents must be truncated or chunked and content beyond that limit cannot influence the summary. Performance on text outside the training domain has not been evaluated.

## Training and evaluation data

Both training and evaluation used the fedora-copr/pep-sum dataset. At 69 optimizer steps per epoch with a train batch size of 4, the training split contains roughly 276 examples; the composition of the evaluation split is not recorded in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 69   | 0.0957          | 72.6601 | 71.6824 | 72.6858 | 72.4668   | 95.4493 |
| No log        | 2.0   | 138  | 0.1345          | 75.0063 | 74.0782 | 75.0597 | 74.8943   | 92.0145 |
| No log        | 3.0   | 207  | 0.1412          | 75.3012 | 74.5492 | 75.4246 | 75.3240   | 85.4638 |
| No log        | 4.0   | 276  | 0.1089          | 74.8426 | 74.0317 | 74.8939 | 74.8128   | 85.0435 |
| No log        | 5.0   | 345  | 0.1242          | 75.3806 | 74.6735 | 75.5866 | 75.5446   | 85.3188 |

The training loss column reads "No log" because the run ends after 345 optimizer steps, before the `Trainer`'s default logging interval of 500 steps is first reached.

### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
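
## How to use

A minimal inference sketch using the `transformers` pipeline API. The model id below is a placeholder for the published Hub repo id or a local checkpoint path, and the generation settings are illustrative rather than the ones used to produce the Gen Len reported above.

```python
from transformers import pipeline

# Placeholder model id; substitute the published Hub repo id or the local
# output directory produced by the training run.
summarizer = pipeline("summarization", model="pep_summarization")

pep_text = (
    "PEP 8 gives coding conventions for the Python code comprising the "
    "standard library in the main Python distribution. This document and "
    "PEP 257 were adapted from Guido's original Python Style Guide essay."
)

# bart-large-cnn reads at most 1024 input tokens; truncation=True keeps
# longer PEPs from overflowing the encoder.
result = summarizer(pep_text, max_length=100, min_length=20, truncation=True)
print(result[0]["summary_text"])
```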
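
## Reproducing the training configuration

A sketch of `Seq2SeqTrainingArguments` mirroring the "Training hyperparameters" section above. The `generated_from_trainer` tag suggests the run used the Hugging Face `Trainer`, likely via an examples script; arguments not listed in the card (output directory, evaluation cadence, `predict_with_generate`) are assumptions added to make the object constructible, not values recovered from the original run.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="pep_summarization",   # assumption
    evaluation_strategy="epoch",      # assumption, consistent with per-epoch eval rows
    predict_with_generate=True,       # assumption, needed to compute ROUGE during eval
    # Values below mirror the reported hyperparameters.
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
)
```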
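
## Computing ROUGE

The scores in this card appear to follow the usual `Trainer` convention of ROUGE F1 scaled by 100. A minimal sketch with the `evaluate` library, using dummy prediction/reference strings:

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["This PEP describes coding conventions for Python code."]
references = ["This document gives coding conventions for the Python code."]

scores = rouge.compute(
    predictions=predictions, references=references, use_stemmer=True
)
# evaluate returns fractions in [0, 1]; scale by 100 to match the table above.
print({k: round(v * 100, 4) for k, v in scores.items()})
```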