---
language: es
tags:
  - Spanish
  - BART
  - Legal
datasets:
  - Spanish-legal-corpora
---

# BART Legal Spanish ⚖️

**BART Legal (base)** is a BART-like model trained on a collection of Spanish legal-domain corpora.

BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function and (2) learning a model to reconstruct the original text.

This model is particularly effective when fine-tuned for text generation tasks (e.g., summarization, translation) but also works well for comprehension tasks (e.g., text classification, question answering).
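To make the denoising objective concrete, here is a toy illustration of BART-style text infilling (this is not the actual training code, and the Spanish sentence is an invented example): a contiguous span of tokens is replaced by a single `<mask>` token, and the seq2seq model is trained to reconstruct the original text.

```python
import random

def corrupt(tokens, span_len=2, seed=0):
    """Replace one random contiguous span of `span_len` tokens
    with a single <mask> token (BART-style text infilling)."""
    rng = random.Random(seed)
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + ["<mask>"] + tokens[start + span_len:]

original = "el contrato quedará resuelto de pleno derecho".split()
noisy = corrupt(original)
# The encoder sees `noisy`; the decoder is trained to emit `original`.
```

The model never sees which or how many tokens were hidden, which forces it to reason about the whole sentence when reconstructing.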

## Training details

TBA

## Model details ⚙

TBA

## Evaluation metrics (for discriminator) 🧾

| Metric    | Score |
|-----------|-------|
| Accuracy  | 0.955 |
| Precision | 0.790 |
| AUC       | 0.971 |

## Benchmarks 🔨

WIP 🚧

## How to use with transformers

TBA
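Until official instructions are added, here is a minimal, unofficial sketch. It assumes the checkpoint loads with the standard `transformers` auto classes for seq2seq BART models (not confirmed by this card), and the Spanish input sentence is an invented example:

```python
# Hedged sketch: assumes compatibility with the standard BART
# seq2seq classes in Hugging Face transformers.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "mrm8488/bart-legal-base-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Ask the model to reconstruct a masked legal sentence.
text = "El arrendatario deberá <mask> la renta en el plazo pactado."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Since this is a pre-trained denoising model rather than a fine-tuned one, outputs are reconstructions, not task predictions; fine-tune it (e.g., for summarization) before downstream use.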

## Acknowledgments

TBA

## Citation

If you want to cite this model, you can use this:

```bibtex
@misc{manuel_romero_2023,
    author       = { {Manuel Romero} },
    title        = { bart-legal-base-es (Revision 27bc9e1) },
    year         = 2023,
    url          = { https://huggingface.co/mrm8488/bart-legal-base-es },
    doi          = { 10.57967/hf/0467 },
    publisher    = { Hugging Face }
}
```

Created by Manuel Romero/@mrm8488

Made with ♥ in Spain