Update app.py
app.py
CHANGED
@@ -76,12 +76,12 @@ with gr.Blocks() as demo:
 
     with gr.TabItem("The Summarization Engine"):
         gr.Markdown("""
-        <h3>Abstractive vs Extractive
+        <h3>Abstractive vs Extractive</h3>
         <p>
         Abstractive
         The underlying engines for the Abstractive part are transformer based model BART, a sequence-to-sequence model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. The BART-model was pre-trained by KBLab/bart-base-swedish-cased (link) to learn general knowledge about language. Afterwards, the model was further fine-tuned on two labelled datasets that have been open-sourced:
-        Gabriel/cnn_daily_swe (link)
-        Gabriel/xsum_swe (link)
+        - Gabriel/cnn_daily_swe (link)
+        - Gabriel/xsum_swe (link)
 
         To see more in depth regarding the training go to link.
 
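The tab text above contrasts abstractive summarization (the BART engine, which generates new sentences) with extractive summarization (which copies sentences from the source). As a minimal illustration of that distinction, here is a toy frequency-based extractive summarizer — a hypothetical sketch for contrast only, not the app's actual engine (the abstractive side requires the fine-tuned BART checkpoint):

```python
# Toy extractive summarizer: scores each sentence by the corpus frequency
# of its words and keeps the top-scoring ones verbatim, in original order.
# Contrast with the abstractive BART engine described in the tab text,
# which generates new wording instead of selecting existing sentences.
import re
from collections import Counter

def extractive_summary(text: str, n_sentences: int = 1) -> str:
    # Split on sentence-ending punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Word frequencies over the whole text, case-insensitive.
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Rank sentences by the summed frequency of their words.
    ranked = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    top = set(ranked[:n_sentences])
    # Emit the selected sentences in their original order.
    return " ".join(s for s in sentences if s in top)

text = (
    "BART is a sequence-to-sequence model. "
    "It has a bidirectional encoder. "
    "It has an autoregressive decoder."
)
print(extractive_summary(text, 1))
```

An extractive summary is always a verbatim subset of the input, which is exactly the property the abstractive BART engine does not have.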