mrm8488 committed on
Commit
a1a7347
1 Parent(s): 56c78ae

Update README.md

Files changed (1)
  1. README.md +4 -5
README.md CHANGED
@@ -14,7 +14,7 @@ datasets:
 
 **BART Legal Spanish** (base) is a BART-like model trained on [A collection of corpora of Spanish legal domain](https://zenodo.org/record/5495529#.YZItp3vMLJw).
 
- BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function and (2) learning a model to reconstruct the original text.
+ BART is a transformer *encoder-decoder* (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function and (2) learning a model to reconstruct the original text.
 
 This model is particularly effective when fine-tuned for text generation tasks (e.g., summarization, translation) but also works well for comprehension tasks (e.g., text classification, question answering).
 
@@ -24,9 +24,6 @@ This model is particularly effective when fine-tuned for text generation tasks (
 
 TBA
 
- ## Model details ⚙
-
- TBA
 
 ## [Evaluation metrics](https://huggingface.co/mrm8488/bart-legal-base-es/tensorboard?params=scalars#frame) 🧾
 
@@ -46,7 +43,9 @@ TBA
 
 ## Acknowledgments
 
- TBA
+ - [Narrativa](https://www.narrativa.com/)
+ - [QBlocks](https://www.qblocks.cloud/)
+ - [jarvislabs](https://jarvislabs.ai/)
 
 ## Citation
 If you want to cite this model, you can use this:
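
The updated card describes the checkpoint as a seq2seq model pre-trained to reconstruct corrupted text. The snippet below is a minimal sketch of that usage, not taken from the card: it assumes the checkpoint id `mrm8488/bart-legal-base-es` (from the evaluation-metrics link above) loads with the standard `transformers` seq2seq classes and that the tokenizer uses BART's usual `<mask>` token; the Spanish legal sentence is a hypothetical example.

```python
# Minimal sketch: load the checkpoint and reconstruct a masked span,
# mirroring BART's "corrupt then reconstruct" pre-training objective.
# Model id taken from the evaluation-metrics link; the <mask> token and
# generation settings are assumptions, not part of the card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "mrm8488/bart-legal-base-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical legal-domain input with one corrupted (masked) span.
text = "El contrato será <mask> por ambas partes."
inputs = tokenizer(text, return_tensors="pt")

# The decoder regenerates the full sequence, filling in the masked span.
output_ids = model.generate(**inputs, max_length=40, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```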