model update
README.md CHANGED
@@ -53,14 +53,14 @@ This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base)
 [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
 This model is fine-tuned without answer information, i.e. it generates a question given only a paragraph (note that the standard model is fine-tuned to generate a question given a paragraph and an associated answer within that paragraph).

-Please cite our paper if you use the model ([TBA](TBA)).
+Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).

 ```

 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -77,17 +77,27 @@ Please cite our paper if you use the model ([TBA](TBA)).
 - **Training data:** [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (default)
 - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
 - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
-- **Paper:** [TBA](TBA)
+- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)

 ### Usage
+- With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
 ```python
+from lmqg import TransformersQG
+# initialize model
+model = TransformersQG(language='en', model='lmqg/bart-base-squad-no-answer')
+# model prediction
+question = model.generate_q(list_context=["William Turner was an English painter who specialised in watercolour landscapes"], list_answer=["William Turner"])
+```
+
+- With `transformers`
+```python
+from transformers import pipeline
+# initialize model
+pipe = pipeline("text2text-generation", 'lmqg/bart-base-squad-no-answer')
+# question generation
 question = pipe('<hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>')

 ```
@@ -126,11 +136,12 @@ The following hyperparameters were used during fine-tuning:
 The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/bart-base-squad-no-answer/raw/main/trainer_config.json).

 ## Citation
+```

 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,

@@ -139,3 +150,4 @@ The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/bart-base-squad-no-answer/raw/main/trainer_config.json).
     publisher = "Association for Computational Linguistics",
 }

+```
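The usage snippets above call the checkpoint with default generation settings. Below is a minimal sketch of the same `transformers` pipeline call with explicit, illustrative generation arguments; the `num_beams` and `max_length` values are assumptions, not settings taken from the model card.

```python
from transformers import pipeline

# load the checkpoint named in the model card
pipe = pipeline("text2text-generation", "lmqg/bart-base-squad-no-answer")

# the no-answer model expects the whole paragraph wrapped in <hl> tokens
paragraphs = [
    "<hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>",
    "<hl> William Turner was an English painter who specialised in watercolour landscapes. <hl>",
]

# num_beams / max_length are illustrative choices, not values from the card
questions = pipe(paragraphs, num_beams=4, max_length=64)
for q in questions:
    print(q["generated_text"])
```

Passing a list of highlighted paragraphs generates one question per paragraph in a single call.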
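The fine-tuning configuration linked above (`trainer_config.json`) can also be inspected programmatically. A minimal sketch, assuming the `huggingface_hub` client is installed and the file is plain JSON:

```python
import json

from huggingface_hub import hf_hub_download

# download the trainer config referenced in the model card
path = hf_hub_download(repo_id="lmqg/bart-base-squad-no-answer", filename="trainer_config.json")

# print the recorded fine-tuning hyperparameters
with open(path) as f:
    config = json.load(f)
for key, value in sorted(config.items()):
    print(f"{key}: {value}")
```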