datien228 committed
Commit fe2d5db · Parent(s): 2cd4b7a

Update README.md

Files changed (1):
  1. README.md (+14, −1)
README.md CHANGED
@@ -1,7 +1,20 @@
+---
+language:
+- en
+tags:
+- summarization
+license: mit
+datasets:
+- wikilingua
+metrics:
+- rouge
+---
+
 #### Pre-trained BART model fine-tuned on the WikiLingua dataset
-The repository for storing the fine-tuned BART model (by sshleifer) using the WikiLingua dataset (English)
+The repository for the fine-tuned BART model (by sshleifer) using the **wikilingua** dataset (English)
 
 **Purpose:** Examine the performance of a fine-tuned model for research purposes
+
 **Observation:**
 - The pre-trained model was trained on the XSum dataset, which summarizes not-too-long documents into a one-line summary
 - Fine-tuning this model on WikiLingua is appropriate since the summaries in that dataset are also short
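Since the updated card tags the model for `summarization` on English WikiLingua data, a minimal usage sketch may help readers try a checkpoint of this kind. This example is not part of the commit: the model id shown is sshleifer's XSum DistilBART base checkpoint used as a stand-in, and should be replaced with this repository's own model id to evaluate the fine-tuned weights.

```python
from transformers import pipeline

# Stand-in model id: the base XSum checkpoint by sshleifer mentioned in the card.
# Substitute this repository's fine-tuned model id to reproduce its behavior.
summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-12-6")

article = (
    "WikiLingua pairs how-to articles with short, instruction-style summaries. "
    "A BART model fine-tuned on the English split should therefore produce brief, "
    "one- or two-sentence summaries of similar documents."
)

# Short max_length mirrors the short, XSum/WikiLingua-style target summaries.
result = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```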