vanessbut committed
Commit 69b4bb2 · 1 parent: 2db81d0

Fixed the output.

Files changed (1): app.py +20 -3
app.py CHANGED
@@ -12,8 +12,25 @@ st.markdown("<p style=\"text-align:center\"><img width=700px src='https://c.teno
 #pipe = pipeline("ner", "Davlan/distilbert-base-multilingual-cased-ner-hrl")
 
 #st.markdown("#### Title:")
-title = st.text_area("Title:")
-abstract = st.text_area("abstract:")
+title = st.text_area("Title:", value="How to cook a neural network", height=16, help="Title of the article")
+abstract = st.text_area("Abstract:",
+                        value="""
+My dad fits hellish models in general.
+Well, this is about an average recipe, because there are a lot of variations.
+The model is taken, it is not finetuned, finetuning is not about my dad.
+He takes this model, dumps it into the tensorboard and starts frying it.
+Adds a huge amount of noize, convolutions, batch and spectral normalization DROPOUT! for regularization, maxpooling on top.
+All this is fitted to smoke.
+Then the computer is removed from the fire and cools on the balcony.
+Then dad brings it in and generously sprinkles it with crossvalidation and starts predicting.
+At the same time, he gets data from the web, scraping it with a fork.
+Predicts and sentences in a half-whisper oh god.
+At the same time, he has sweat on his forehead.
+Kindly offers me sometimes, but I refuse.
+Do I need to talk about what the wildest overfitting then?
+The overfitting is such that the val loss peels off the walls.
+""",
+                        height=512, help="Abstract of the article")
 
 from transformers import AutoModel, AutoTokenizer
 #from tqdm import tqdm as tqdm
@@ -37,7 +54,7 @@ os.system("python3 -m spacy download en")
 # But we'll deal with this later, if there's time.
 main_nlp = spacy.load('en_core_web_sm')
 
-text = title + abstract
+text = title + ". " + abstract
 
 if not text is None and len(text) > 0:
     #keywords = get_candidates(text, main_nlp)
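The substantive fix here is the `". "` separator: without it, the last word of the title fuses with the first word of the abstract before the combined text reaches spaCy. A minimal sketch of the join, using plain strings as hypothetical stand-ins for the `st.text_area` widget values:

```python
# Hypothetical stand-ins for the values returned by st.text_area(...).
title = "How to cook a neural network"
abstract = "My dad fits hellish models in general."

# Before the commit: direct concatenation fuses "network" and "My"
# into one token, which confuses downstream sentence segmentation.
broken = title + abstract

# After the commit: ". " preserves the sentence boundary between
# title and abstract in the text passed to spaCy.
text = title + ". " + abstract

print(broken)  # "How to cook a neural networkMy dad fits hellish models in general."
print(text)    # "How to cook a neural network. My dad fits hellish models in general."
```

A single period-plus-space is enough for spaCy's default sentencizer to treat the title as its own sentence.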