---
license: apache-2.0
base_model: google/mt5-small
tags:
  - summarization
  - generated_from_trainer
language:
  - en
  - es
metrics:
  - rouge
datasets:
  - csebuetnlp/xlsum
model-index:
  - name: mt5-small-finetuned-xlsum-en-es
    results: []
widget:
  - text: >-
      The tower is 324 metres (1,063 ft) tall, about the same height as an
      81-storey building, and the tallest structure in Paris. Its base is
      square, measuring 125 metres (410 ft) on each side. During its
      construction, the Eiffel Tower surpassed the Washington Monument to become
      the tallest man-made structure in the world, a title it held for 41 years
      until the Chrysler Building in New York City was finished in 1930. It was
      the first structure to reach a height of 300 metres. Due to the addition
      of a broadcasting aerial at the top of the tower in 1957, it is now taller
      than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters,
      the Eiffel Tower is the second tallest free-standing structure in France
      after the Millau Viaduct.
    example_title: English Summary
  - text: >-
      La torre tiene 324 metros (1.063 pies) de altura, aproximadamente la misma
      altura que un edificio de 81 pisos, y la estructura más alta de París. Su
      base es cuadrada, mide 125 metros (410 pies) de lado. Durante su
      construcción, la Torre Eiffel superó al Monumento a Washington para
      convertirse en la estructura artificial más alta del mundo, un título que
      mantuvo durante 41 años hasta que se terminó el Edificio Chrysler en la
      ciudad de Nueva York en 1930. Fue la primera estructura en alcanzar una
      altura de 300 metros. Debido a la adición de una antena de transmisión en
      la parte superior de la torre en 1957, ahora es más alta que el Edificio
      Chrysler en 5,2 metros (17 pies). Excluyendo los transmisores, la Torre
      Eiffel es la segunda estructura independiente más alta de Francia después
      del viaducto de Millau.
    example_title: Spanish Summary
---

mt5-small-finetuned-xlsum-en-es

This model is a fine-tuned version of google/mt5-small on the csebuetnlp/xlsum dataset.

Reduced versions of the English and Spanish subsets were used for training, focusing on examples with shorter target summaries.

It achieves the following results on the evaluation set:

  • Loss: 2.9483
  • Rouge1: 19.42
  • Rouge2: 4.44
  • RougeL: 16.7
  • RougeLsum: 16.7
  • Mean Len: 16.3231

Model description

More information needed

Intended uses & limitations

The model may produce false information when summarizing.

This is an initial draft and is not intended for production use; use at your own risk.
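
For quick experimentation, the model can be loaded through the Transformers summarization pipeline. The snippet below is a minimal sketch: the repository id alex-atelo/mt5-small-finetuned-xlsum-en-es and the generation settings (max_length, min_length) are assumptions, not values documented in this card.

```python
# Minimal inference sketch (repository id and generation settings are assumed).
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="alex-atelo/mt5-small-finetuned-xlsum-en-es",  # assumed repo id
)

text = (
    "The tower is 324 metres (1,063 ft) tall, about the same height as an "
    "81-storey building, and the tallest structure in Paris."
)

# max_length / min_length are illustrative choices, not tuned values.
print(summarizer(text, max_length=32, min_length=8)[0]["summary_text"])
```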

Training and evaluation data

More information needed
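
As noted above, reduced English and Spanish subsets of XL-Sum with shorter target summaries were used. Pending fuller details, the sketch below shows one way such a subset could be built with the Datasets library; the 48-word cutoff and the shuffling seed are assumptions, not documented choices.

```python
# Sketch of building a reduced English/Spanish XL-Sum subset with short targets.
from datasets import load_dataset, concatenate_datasets

en = load_dataset("csebuetnlp/xlsum", "english", split="train")
es = load_dataset("csebuetnlp/xlsum", "spanish", split="train")

def has_short_target(example):
    # Keep only examples whose reference summary is short (48 words is an assumed cutoff).
    return len(example["summary"].split()) <= 48

train = concatenate_datasets([en, es]).filter(has_short_target).shuffle(seed=42)
```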

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
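
These settings map onto the Transformers Seq2SeqTrainingArguments roughly as sketched below. This is an illustration only: output_dir, evaluation_strategy, and predict_with_generate are assumptions, and dataset loading, tokenization, and the Seq2SeqTrainer call are omitted.

```python
# Sketch of the training arguments implied by the hyperparameters above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-xlsum-en-es",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="epoch",  # assumed, to match the per-epoch results below
    predict_with_generate=True,   # assumed; needed to compute ROUGE on generated summaries
)
```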

Training results

Lead-3 Baseline:

  • Rouge1: 12.22
  • Rouge2: 2.01
  • RougeL: 9.02
  • RougeLsum: 10.33

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Mean Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------:|
| 6.7763        | 1.0   | 1237 | 3.1120          | 13.57  | 2.76   | 11.59  | 11.59     | 12.6116  |
| 4.1022        | 2.0   | 2474 | 2.9718          | 19.35  | 4.32   | 16.63  | 16.64     | 16.3084  |
| 3.9219        | 3.0   | 3711 | 2.9483          | 19.42  | 4.44   | 16.7   | 16.7      | 16.3231  |
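
The Lead-3 baseline takes the first three sentences of each article as its summary. The snippet below is a rough sketch of how such a baseline can be scored with the evaluate library; the NLTK sentence splitter and the placeholder data are assumptions, not the exact evaluation script used for this card.

```python
# Rough sketch of a Lead-3 baseline scored with ROUGE (placeholder data).
import evaluate
import nltk

nltk.download("punkt", quiet=True)
rouge = evaluate.load("rouge")

def lead3(article: str) -> str:
    # Use the first three sentences of the article as the "summary".
    return " ".join(nltk.sent_tokenize(article)[:3])

articles = ["..."]    # hypothetical evaluation articles
references = ["..."]  # hypothetical reference summaries

scores = rouge.compute(
    predictions=[lead3(a) for a in articles],
    references=references,
)
print(scores)  # rouge1, rouge2, rougeL, rougeLsum
```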

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2

Citation

BibTeX:

@inproceedings{hasan-etal-2021-xl,
    title = "{XL}-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages",
    author = "Hasan, Tahmid  and
      Bhattacharjee, Abhik  and
      Islam, Md. Saiful  and
      Mubasshir, Kazi  and
      Li, Yuan-Fang  and
      Kang, Yong-Bin  and
      Rahman, M. Sohel  and
      Shahriyar, Rifat",
    booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.findings-acl.413",
    pages = "4693--4703",
}