---
license: cc-by-4.0
metrics:
- bleu4
- meteor
- rouge-l
- bertscore
- moverscore
language: en
datasets:
- lmqg/qg_subjqa
pipeline_tag: text2text-generation
tags:
- question generation
widget:
- text: "generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
  example_title: "Question Generation Example 1" 
- text: "generate question: Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
  example_title: "Question Generation Example 2" 
- text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic,  <hl> Cadillac Records <hl> ."
  example_title: "Question Generation Example 3" 
model-index:
- name: lmqg/bart-large-subjqa-electronics
  results:
  - task:
      name: Text2text Generation
      type: text2text-generation
    dataset:
      name: lmqg/qg_subjqa
      type: electronics
      args: electronics
    metrics:
    - name: BLEU4
      type: bleu4
      value: 0.051782881162838426
    - name: ROUGE-L
      type: rouge-l
      value: 0.2886833117152989
    - name: METEOR
      type: meteor
      value: 0.25170852692044277
    - name: BERTScore
      type: bertscore
      value: 0.9351121607948752
    - name: MoverScore
      type: moverscore
      value: 0.6568060756261695
---

# Language Model Fine-tuning on Question Generation: `lmqg/bart-large-subjqa-electronics`
This model is a fine-tuned version of [lmqg/bart-large-squad](https://huggingface.co/lmqg/bart-large-squad) for the question generation task on
[lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) (dataset_name: electronics), i.e. fine-tuning is continued from the [lmqg/bart-large-squad](https://huggingface.co/lmqg/bart-large-squad) checkpoint.

### Overview
- **Language model:** [lmqg/bart-large-squad](https://huggingface.co/lmqg/bart-large-squad)   
- **Language:** en  
- **Training data:** [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) (electronics)
- **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [TBA](TBA)

### Usage
```python
from transformers import pipeline

model_path = 'lmqg/bart-large-subjqa-electronics'
pipe = pipeline("text2text-generation", model_path)

# Question generation: mark the answer span in the paragraph with <hl> ... <hl>
input_text = 'generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.'
question = pipe(input_text)
print(question)  # a list with one dict holding the generated question under 'generated_text'
```
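The pipeline also accepts a list of inputs, so questions for several answer spans in the same paragraph can be generated in one call. A minimal sketch (it simply reuses the widget examples above as the paragraph and answer spans):

```python
from transformers import pipeline

pipe = pipeline("text2text-generation", "lmqg/bart-large-subjqa-electronics")

paragraph = ("Beyonce further expanded her acting career, starring as blues singer "
             "Etta James in the 2008 musical biopic, Cadillac Records.")
answers = ["Beyonce", "Etta James", "Cadillac Records"]

# Highlight a different answer span in each copy of the paragraph.
inputs = ["generate question: " + paragraph.replace(a, f"<hl> {a} <hl>", 1) for a in answers]

for answer, result in zip(answers, pipe(inputs)):
    print(answer, "->", result)
```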

## Evaluation Metrics


### Metrics

| Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
|:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | electronics | 0.051782881162838426 | 0.2886833117152989 | 0.25170852692044277 | 0.9351121607948752 | 0.6568060756261695 | [link](https://huggingface.co/lmqg/bart-large-subjqa-electronics/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.electronics.json) | 
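The raw metric file linked in the table can be inspected directly. A small sketch that only downloads and pretty-prints it (the internal layout of the JSON is not documented here, so no specific keys are assumed):

```python
import json
import requests

# URL taken from the "Link" column of the table above.
url = ("https://huggingface.co/lmqg/bart-large-subjqa-electronics/raw/main/"
       "eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.electronics.json")

metrics = requests.get(url, timeout=30).json()
print(json.dumps(metrics, indent=2))
```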




## Training hyperparameters

The following hyperparameters were used during fine-tuning:
 - dataset_path: lmqg/qg_subjqa
 - dataset_name: electronics
 - input_types: ['paragraph_answer']
 - output_types: ['question']
 - prefix_types: None
 - model: lmqg/bart-large-squad
 - max_length: 512
 - max_length_output: 32
 - epoch: 4
 - batch: 8
 - lr: 5e-05
 - fp16: False
 - random_seed: 1
 - gradient_accumulation_steps: 8
 - label_smoothing: 0.15

The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/bart-large-subjqa-electronics/raw/main/trainer_config.json).
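For reference, a rough mapping of these hyperparameters onto the Hugging Face `Seq2SeqTrainingArguments` API is sketched below; the model itself was trained with the lmqg toolkit linked above rather than with this exact code, and `output_dir` is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Approximate translation of the hyperparameters listed above; the actual
# training loop lives in https://github.com/asahi417/lm-question-generation.
training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-subjqa-electronics",  # placeholder output path
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=8,    # effective batch size: 8 * 8 = 64
    num_train_epochs=4,
    label_smoothing_factor=0.15,
    fp16=False,
    seed=1,
    generation_max_length=32,         # corresponds to max_length_output
    predict_with_generate=True,
)
```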

## Citation
TBA