model update
README.md CHANGED
@@ -53,14 +53,14 @@ This model is fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base)
 [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).


-Please cite our paper if you use the model ([TBA](TBA)).
+Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).

 ```

 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -77,17 +77,27 @@ Please cite our paper if you use the model ([TBA](TBA)).
 - **Training data:** [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) (default)
 - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
 - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
-- **Paper:** [TBA](TBA)
+- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)

 ### Usage
+- With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
 ```python

-from transformers import pipeline
+from lmqg import TransformersQG
+# initialize model
+model = TransformersQG(language='it', model='lmqg/mt5-base-itquad')
+# model prediction
+question = model.generate_q(list_context=["Dopo il 1971 , l' OPEC ha tardato ad adeguare i prezzi per riflettere tale deprezzamento."], list_answer=["Dopo il 1971"])
+
+```

-model_path = 'lmqg/mt5-base-itquad'
-pipe = pipeline("text2text-generation", model_path)
+- With `transformers`
+```python

-# Question Generation
+from transformers import pipeline
+# initialize model
+pipe = pipeline("text2text-generation", 'lmqg/mt5-base-itquad')
+# question generation
 question = pipe("<hl> Dopo il 1971 <hl> , l' OPEC ha tardato ad adeguare i prezzi per riflettere tale deprezzamento.")

 ```
@@ -126,11 +136,12 @@ The following hyperparameters were used during fine-tuning:
 The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-base-itquad/raw/main/trainer_config.json).

 ## Citation
+```

 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -139,3 +150,4 @@ The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-base-itquad/raw/main/trainer_config.json).
     publisher = "Association for Computational Linguistics",
 }

+```
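Both usage paths in the updated card center on the `<hl>` highlight format: the answer span inside the context is wrapped with `<hl>` tokens before the text is passed to the model, and the `lmqg` `TransformersQG` class builds that kind of input internally from `list_context` and `list_answer`. A minimal sketch of preparing the input by hand for the plain `transformers` route; the `highlight_answer` helper is hypothetical and only illustrates the format:

```python
from transformers import pipeline

def highlight_answer(context: str, answer: str) -> str:
    # wrap the first occurrence of the answer span with the <hl> tokens
    # the fine-tuned question generation model expects
    return context.replace(answer, f"<hl> {answer} <hl>", 1)

# load the question generation pipeline
pipe = pipeline("text2text-generation", "lmqg/mt5-base-itquad")

context = "Dopo il 1971 , l' OPEC ha tardato ad adeguare i prezzi per riflettere tale deprezzamento."
answer = "Dopo il 1971"

# generate a question about the highlighted span
print(pipe(highlight_answer(context, answer)))
```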
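The card also links the complete fine-tuning configuration as `trainer_config.json` in the model repository. A small sketch, assuming `huggingface_hub` is installed and the file stays at that path, of downloading and printing it:

```python
import json

from huggingface_hub import hf_hub_download

# fetch trainer_config.json from the lmqg/mt5-base-itquad repository
config_path = hf_hub_download(repo_id="lmqg/mt5-base-itquad", filename="trainer_config.json")

# print the recorded fine-tuning hyperparameters
with open(config_path) as f:
    for key, value in json.load(f).items():
        print(f"{key}: {value}")
```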