---
license: apache-2.0
base_model: google/flan-t5-large
tags:
- generated_from_trainer
model-index:
- name: flan-t5-large-finetuned-prompt_generation
  results: []
---

# flan-t5-large-finetuned-prompt_generation

This model is a fine-tuned version of [google/flan-t5-large](https://huggingface.co/google/flan-t5-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: nan
- Map: 0.1581
- Ndcg@10: 0.4932

Note that the validation loss is `nan` and both metrics are identical at every logged epoch (see the training results below), which suggests the loss diverged early and the model did not measurably improve during fine-tuning.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent training arguments follows the list):
- learning_rate: 3e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
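The snippet below is a minimal sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments` in `transformers` 4.34. The `output_dir` and `evaluation_strategy` values are assumptions (the card does not record them, though the per-epoch rows below imply epoch-level evaluation), and the Trainer's default AdamW optimizer is assumed to be what the card logs as "Adam".

```python
# A hedged sketch, not the card's recorded training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-large-finetuned-prompt_generation",  # assumed name
    learning_rate=3e-07,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,  # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumed from the per-epoch eval log below
)
```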
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Ndcg@10 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 4    | nan             | 0.1581 | 0.4932  |
| No log        | 2.0   | 8    | nan             | 0.1581 | 0.4932  |
| No log        | 3.0   | 12   | nan             | 0.1581 | 0.4932  |
| No log        | 4.0   | 16   | nan             | 0.1581 | 0.4932  |
| No log        | 5.0   | 20   | nan             | 0.1581 | 0.4932  |
| No log        | 6.0   | 24   | nan             | 0.1581 | 0.4932  |
| No log        | 7.0   | 28   | nan             | 0.1581 | 0.4932  |
| No log        | 8.0   | 32   | nan             | 0.1581 | 0.4932  |
| No log        | 9.0   | 36   | nan             | 0.1581 | 0.4932  |
| No log        | 10.0  | 40   | nan             | 0.1581 | 0.4932  |
| No log        | 11.0  | 44   | nan             | 0.1581 | 0.4932  |
| No log        | 12.0  | 48   | nan             | 0.1581 | 0.4932  |
| No log        | 13.0  | 52   | nan             | 0.1581 | 0.4932  |
| No log        | 14.0  | 56   | nan             | 0.1581 | 0.4932  |
| No log        | 15.0  | 60   | nan             | 0.1581 | 0.4932  |
| No log        | 16.0  | 64   | nan             | 0.1581 | 0.4932  |
| No log        | 17.0  | 68   | nan             | 0.1581 | 0.4932  |
| No log        | 18.0  | 72   | nan             | 0.1581 | 0.4932  |
| No log        | 19.0  | 76   | nan             | 0.1581 | 0.4932  |
| No log        | 20.0  | 80   | nan             | 0.1581 | 0.4932  |
| No log        | 21.0  | 84   | nan             | 0.1581 | 0.4932  |
| No log        | 22.0  | 88   | nan             | 0.1581 | 0.4932  |
| No log        | 23.0  | 92   | nan             | 0.1581 | 0.4932  |
| No log        | 24.0  | 96   | nan             | 0.1581 | 0.4932  |
| No log        | 25.0  | 100  | nan             | 0.1581 | 0.4932  |
| No log        | 26.0  | 104  | nan             | 0.1581 | 0.4932  |
| No log        | 27.0  | 108  | nan             | 0.1581 | 0.4932  |
| No log        | 28.0  | 112  | nan             | 0.1581 | 0.4932  |
| No log        | 29.0  | 116  | nan             | 0.1581 | 0.4932  |
| No log        | 30.0  | 120  | nan             | 0.1581 | 0.4932  |
| No log        | 31.0  | 124  | nan             | 0.1581 | 0.4932  |
| No log        | 32.0  | 128  | nan             | 0.1581 | 0.4932  |
| No log        | 33.0  | 132  | nan             | 0.1581 | 0.4932  |
| No log        | 34.0  | 136  | nan             | 0.1581 | 0.4932  |
| No log        | 35.0  | 140  | nan             | 0.1581 | 0.4932  |
| No log        | 36.0  | 144  | nan             | 0.1581 | 0.4932  |
| No log        | 37.0  | 148  | nan             | 0.1581 | 0.4932  |
| No log        | 38.0  | 152  | nan             | 0.1581 | 0.4932  |
| No log        | 39.0  | 156  | nan             | 0.1581 | 0.4932  |
| No log        | 40.0  | 160  | nan             | 0.1581 | 0.4932  |
| No log        | 41.0  | 164  | nan             | 0.1581 | 0.4932  |
| No log        | 42.0  | 168  | nan             | 0.1581 | 0.4932  |
| No log        | 43.0  | 172  | nan             | 0.1581 | 0.4932  |
| No log        | 44.0  | 176  | nan             | 0.1581 | 0.4932  |
| No log        | 45.0  | 180  | nan             | 0.1581 | 0.4932  |
| No log        | 46.0  | 184  | nan             | 0.1581 | 0.4932  |
| No log        | 47.0  | 188  | nan             | 0.1581 | 0.4932  |
| No log        | 48.0  | 192  | nan             | 0.1581 | 0.4932  |
| No log        | 49.0  | 196  | nan             | 0.1581 | 0.4932  |
| No log        | 50.0  | 200  | nan             | 0.1581 | 0.4932  |
| No log        | 51.0  | 204  | nan             | 0.1581 | 0.4932  |
| No log        | 52.0  | 208  | nan             | 0.1581 | 0.4932  |
| No log        | 53.0  | 212  | nan             | 0.1581 | 0.4932  |
| No log        | 54.0  | 216  | nan             | 0.1581 | 0.4932  |
| No log        | 55.0  | 220  | nan             | 0.1581 | 0.4932  |
| No log        | 56.0  | 224  | nan             | 0.1581 | 0.4932  |
| No log        | 57.0  | 228  | nan             | 0.1581 | 0.4932  |
| No log        | 58.0  | 232  | nan             | 0.1581 | 0.4932  |
| No log        | 59.0  | 236  | nan             | 0.1581 | 0.4932  |
| No log        | 60.0  | 240  | nan             | 0.1581 | 0.4932  |
| No log        | 61.0  | 244  | nan             | 0.1581 | 0.4932  |
| No log        | 62.0  | 248  | nan             | 0.1581 | 0.4932  |
| No log        | 63.0  | 252  | nan             | 0.1581 | 0.4932  |
| No log        | 64.0  | 256  | nan             | 0.1581 | 0.4932  |
| No log        | 65.0  | 260  | nan             | 0.1581 | 0.4932  |
| No log        | 66.0  | 264  | nan             | 0.1581 | 0.4932  |
| No log        | 67.0  | 268  | nan             | 0.1581 | 0.4932  |
| No log        | 68.0  | 272  | nan             | 0.1581 | 0.4932  |
| No log        | 69.0  | 276  | nan             | 0.1581 | 0.4932  |
| No log        | 70.0  | 280  | nan             | 0.1581 | 0.4932  |
| No log        | 71.0  | 284  | nan             | 0.1581 | 0.4932  |
| No log        | 72.0  | 288  | nan             | 0.1581 | 0.4932  |
| No log        | 73.0  | 292  | nan             | 0.1581 | 0.4932  |
| No log        | 74.0  | 296  | nan             | 0.1581 | 0.4932  |
| No log        | 75.0  | 300  | nan             | 0.1581 | 0.4932  |
| No log        | 76.0  | 304  | nan             | 0.1581 | 0.4932  |
| No log        | 77.0  | 308  | nan             | 0.1581 | 0.4932  |
| No log        | 78.0  | 312  | nan             | 0.1581 | 0.4932  |
| No log        | 79.0  | 316  | nan             | 0.1581 | 0.4932  |
| No log        | 80.0  | 320  | nan             | 0.1581 | 0.4932  |
| No log        | 81.0  | 324  | nan             | 0.1581 | 0.4932  |
| No log        | 82.0  | 328  | nan             | 0.1581 | 0.4932  |
| No log        | 83.0  | 332  | nan             | 0.1581 | 0.4932  |
| No log        | 84.0  | 336  | nan             | 0.1581 | 0.4932  |
| No log        | 85.0  | 340  | nan             | 0.1581 | 0.4932  |
| No log        | 86.0  | 344  | nan             | 0.1581 | 0.4932  |
| No log        | 87.0  | 348  | nan             | 0.1581 | 0.4932  |
| No log        | 88.0  | 352  | nan             | 0.1581 | 0.4932  |
| No log        | 89.0  | 356  | nan             | 0.1581 | 0.4932  |
| No log        | 90.0  | 360  | nan             | 0.1581 | 0.4932  |
| No log        | 91.0  | 364  | nan             | 0.1581 | 0.4932  |
| No log        | 92.0  | 368  | nan             | 0.1581 | 0.4932  |
| No log        | 93.0  | 372  | nan             | 0.1581 | 0.4932  |
| No log        | 94.0  | 376  | nan             | 0.1581 | 0.4932  |
| No log        | 95.0  | 380  | nan             | 0.1581 | 0.4932  |
| No log        | 96.0  | 384  | nan             | 0.1581 | 0.4932  |
| No log        | 97.0  | 388  | nan             | 0.1581 | 0.4932  |
| No log        | 98.0  | 392  | nan             | 0.1581 | 0.4932  |
| No log        | 99.0  | 396  | nan             | 0.1581 | 0.4932  |
| No log        | 100.0 | 400  | nan             | 0.1581 | 0.4932  |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
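## How to use

The card itself does not document usage; the snippet below is a minimal, hedged sketch that loads the checkpoint as a standard seq2seq model with the pinned `transformers` version above. The repository id and the example input are assumptions, since neither the hosting namespace nor the expected prompt format is recorded here.

```python
# A minimal usage sketch; the model id and prompt are illustrative only.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "flan-t5-large-finetuned-prompt_generation"  # or a local path / full hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Generate a prompt for: summarizing a news article", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```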