wenxxx committed
Commit da8dd1a
1 Parent(s): 07d4786

Update README.md

Files changed (1)
  1. README.md +25 -0
README.md CHANGED
@@ -1,3 +1,28 @@
  ---
  license: apache-2.0
  ---
+
+ HF-version model for PRIMERA: Pyramid-based Masked Sentence Pre-training for Multi-document Summarization (ACL 2022).
+
+ The original code can be found [here](https://github.com/allenai/PRIMER). The scripts and notebooks used to train/evaluate the model can also be found in the original GitHub repo.
+
+ * Note: due to differences between the implementations of the original Longformer and the Hugging Face LED model, the results of the converted models are slightly different. We ran a sanity check on both fine-tuned and non-fine-tuned models, with the results shown below:
+
+ | Model                   | Rouge-1 | Rouge-2 | Rouge-L |
+ | ----------------------- | ------- | ------- | ------- |
+ | PRIMERA                 | 42.0    | 13.6    | 20.8    |
+ | PRIMERA-hf              | 41.7    | 13.6    | 20.5    |
+ | PRIMERA (fine-tuned)    | 49.9    | 21.1    | 25.9    |
+ | PRIMERA-hf (fine-tuned) | 49.9    | 20.9    | 25.8    |
+
+ You can use it as follows:
+ ```python
+ from transformers import (
+     AutoTokenizer,
+     LEDConfig,
+     LEDForConditionalGeneration,
+ )
+
+ # Load the tokenizer, config, and model from the Hugging Face Hub
+ tokenizer = AutoTokenizer.from_pretrained('allenai/PRIMERA')
+ config = LEDConfig.from_pretrained('allenai/PRIMERA')
+ model = LEDForConditionalGeneration.from_pretrained('allenai/PRIMERA')
+ ```
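
For readers who want to go one step further than loading the model, here is a minimal generation sketch built on the snippet above. It is an illustration under stated assumptions, not part of this commit: the `<doc-sep>` separator and the global attention placed on separator tokens follow the original PRIMER repo's conventions, and the input strings and generation parameters (`num_beams`, `max_length`) are placeholder values.

```python
import torch
from transformers import AutoTokenizer, LEDForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained('allenai/PRIMERA')
model = LEDForConditionalGeneration.from_pretrained('allenai/PRIMERA')

# Assumption: PRIMERA joins the source documents with a special separator
# token, "<doc-sep>", following the original PRIMER repo.
docs = [
    "Text of the first source document ...",   # placeholder
    "Text of the second source document ...",  # placeholder
]
input_text = " <doc-sep> ".join(docs)

inputs = tokenizer(input_text, return_tensors="pt",
                   truncation=True, max_length=4096)

# Assumption: global attention goes on the first token and on the
# <doc-sep> tokens, as in the original implementation.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1
doc_sep_id = tokenizer.convert_tokens_to_ids("<doc-sep>")
global_attention_mask[inputs["input_ids"] == doc_sep_id] = 1

# Placeholder generation settings
summary_ids = model.generate(
    **inputs,
    global_attention_mask=global_attention_mask,
    num_beams=5,
    max_length=256,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```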