Kirili4ik committed on
Commit
bb548ac
1 Parent(s): d203043

Update README.md

Files changed (1)
  1. README.md +30 -22
README.md CHANGED
@@ -1,54 +1,62 @@
  ---
  language:
  - rus
  tags:
  - mbart
  inference:
     parameters:
        no_repeat_ngram_size: 4,
-       num_beams : 5
  datasets:
  - IlyaGusev/gazeta
  - samsum
  - samsum_(translated_into_Russian)
  widget:
- - text: |
      Джефф: Могу ли я обучить модель 🤗 Transformers на Amazon SageMaker?
-     Филипп: Конечно, вы можете использовать новый контейнер для глубокого обучения HuggingFace.
      Джефф: Хорошо.
      Джефф: и как я могу начать?
      Джефф: где я могу найти документацию?
-     Филипп: ок, ок, здесь можно найти все: https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face

  model-index:
- - name: "mbart_ruDialogSum"
    results:
-   - task:
        name: Abstractive Dialogue Summarization
-       type: abstractive-text-summarization
      dataset:
-       name: "SAMSum Corpus (translated to Russian)"
        type: samsum
      metrics:
-     - name: Validation ROGUE-1
-       type: rogue-1
-       value: 34.5
-     - name: Validation ROGUE-L
-       type: rogue-l
-       value: 33
-     - name: Test ROGUE-1
-       type: rogue-1
-       value: 31
-     - name: Test ROGUE-L
-       type: rogue-l
-       value: 28
  ---
  ### 📝 Description

  MBart for Russian summarization fine-tuned for **dialogues** summarization.


- This model was firstly fine-tuned by [Ilya Gusev](https://hf.co/IlyaGusev) on [Gazeta dataset](https://huggingface.co/datasets/IlyaGusev/gazeta). We have **fine tuned** that model on [SamSum dataset]() **translated to Russian** using GoogleTranslateAPI

  🤗 Moreover! We have implemented a **! telegram bot [@summarization_bot](https://t.me/summarization_bot) !** with the inference of this model. Add it to the chat and get summaries instead of dozens spam messages!  🤗

@@ -83,4 +91,4 @@ output_ids = model.generate(

  summary = tokenizer.decode(output_ids, skip_special_tokens=True)
  print(summary)
- ```
 
  ---
  language:
  - rus
+ - ru
  tags:
  - mbart
  inference:
     parameters:
        no_repeat_ngram_size: 4,
+       num_beams: 5
  datasets:
  - IlyaGusev/gazeta
  - samsum
  - samsum_(translated_into_Russian)
  widget:
+ - text: >
      Джефф: Могу ли я обучить модель 🤗 Transformers на Amazon SageMaker?
+
+     Филипп: Конечно, вы можете использовать новый контейнер для глубокого
+     обучения HuggingFace.
+
      Джефф: Хорошо.
+
      Джефф: и как я могу начать?
+
      Джефф: где я могу найти документацию?

+     Филипп: ок, ок, здесь можно найти все:
+     https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
  model-index:
+ - name: mbart_ruDialogSum
    results:
+   - task:
        name: Abstractive Dialogue Summarization
+       type: abstractive-text-summarization
      dataset:
+       name: SAMSum Corpus (translated to Russian)
        type: samsum
      metrics:
+     - name: Validation ROGUE-1
+       type: rogue-1
+       value: 34.5
+     - name: Validation ROGUE-L
+       type: rogue-l
+       value: 33
+     - name: Test ROGUE-1
+       type: rogue-1
+       value: 31
+     - name: Test ROGUE-L
+       type: rogue-l
+       value: 28
+ license: cc
  ---
  ### 📝 Description

  MBart for Russian summarization fine-tuned for **dialogues** summarization.


+ This model was firstly fine-tuned by [Ilya Gusev](https://hf.co/IlyaGusev) on [Gazeta dataset](https://huggingface.co/datasets/IlyaGusev/gazeta). We have **fine tuned** that model on [SamSum dataset](https://huggingface.co/datasets/samsum) **translated to Russian** using GoogleTranslateAPI

  🤗 Moreover! We have implemented a **! telegram bot [@summarization_bot](https://t.me/summarization_bot) !** with the inference of this model. Add it to the chat and get summaries instead of dozens spam messages!  🤗

@@ -83,4 +91,4 @@ output_ids = model.generate(

  summary = tokenizer.decode(output_ids, skip_special_tokens=True)
  print(summary)
+ ```
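The widget change in this commit swaps the YAML literal block scalar (`text: |`) for a folded scalar (`text: >`) and inserts blank lines between the dialogue turns. A minimal sketch of why the blank lines are needed, assuming PyYAML is available (the parser choice is illustrative, not from the commit):

```python
import yaml  # PyYAML, assumed installed (pip install pyyaml)

# In a folded scalar (>), a single line break between lines is folded
# into a space, so adjacent dialogue turns would merge into one line.
# A blank line between turns survives folding as a real newline.
folded_no_blanks = yaml.safe_load(
    "text: >\n"
    "  Джефф: Хорошо.\n"
    "  Джефф: и как я могу начать?\n"
)
folded_with_blanks = yaml.safe_load(
    "text: >\n"
    "  Джефф: Хорошо.\n"
    "\n"
    "  Джефф: и как я могу начать?\n"
)

print(repr(folded_no_blanks["text"]))    # turns folded onto one line
print(repr(folded_with_blanks["text"]))  # turns kept on separate lines
```

This is why the commit both changes `|` to `>` and adds the empty lines: without them the folded widget text would render the whole dialogue as a single run-on line.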
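The second hunk shows only the tail of the card's usage snippet (`model.generate(...)`, `tokenizer.decode(...)`, `print(summary)`). A hedged reconstruction of a complete snippet is sketched below; the checkpoint id `Kirili4ik/mbart_ruDialogSum` is assumed from the card's model-index name, the `Auto*` classes are a generic choice rather than the card's exact imports, and the generation parameters are taken from the card's `inference.parameters` block:

```python
# Hedged sketch, not the card's verbatim code. Assumes the checkpoint
# id "Kirili4ik/mbart_ruDialogSum" and the transformers library.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "Kirili4ik/mbart_ruDialogSum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
model.eval()

# Example dialogue taken from the card's widget text.
dialogue = (
    "Джефф: Могу ли я обучить модель 🤗 Transformers на Amazon SageMaker?\n"
    "Филипп: Конечно, вы можете использовать новый контейнер для глубокого "
    "обучения HuggingFace.\n"
    "Джефф: Хорошо.\n"
    "Джефф: и как я могу начать?\n"
    "Джефф: где я могу найти документацию?\n"
    "Филипп: ок, ок, здесь можно найти все: "
    "https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face"
)

input_ids = tokenizer(
    dialogue, max_length=600, truncation=True, return_tensors="pt"
)["input_ids"]

output_ids = model.generate(
    input_ids=input_ids,
    num_beams=5,             # from the card's inference parameters
    no_repeat_ngram_size=4,  # from the card's inference parameters
)[0]

summary = tokenizer.decode(output_ids, skip_special_tokens=True)
print(summary)
```

Note that the first run downloads the checkpoint from the Hugging Face Hub, so it requires network access.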