Amitontheweb committed

Commit a2d21d9
1 Parent(s): a77933f

Update app.py

Files changed (1):
  app.py +6 -6
app.py CHANGED
@@ -431,13 +431,13 @@ with gr.Blocks() as demo:
     Option 3: ! [probability score: 0.73]
 
 
-    ### **Greedy Search**:
+    ### 1. Greedy Search:
     Goes along the most well-trodden path. Always picks the next word/token carrying the highest probability score. Default for GPT-2.
 
     In this illustrative example, since "!" has the highest probability, a greedy strategy will output: Today is a rainy day!
 
 
-    ### **Random Sampling**:
+    ### 2. Random Sampling:
     Picks a random path or trail to walk on. Use ```do_sample=True```
 
     *Temperature* - Increasing the temperature allows words with lower probabilities to show up in the output. At Temp = 0, the search becomes 'greedy' for words with high probabilities.
@@ -449,7 +449,7 @@ with gr.Blocks() as demo:
     When used with temperature: reducing the temperature makes the search greedy.
 
 
-    ### **Simple Beam search**:
+    ### 3. Simple Beam search:
     Selects the branches (beams) leading toward other heavily laden branches of fruit, to find the heaviest set among all the branches. Akin to greedy search, but finds the heaviest or largest route overall.
 
     If num_beams = 2, every branch will divide into the top two scoring tokens at each step, and so on until the search ends.
@@ -457,13 +457,13 @@ with gr.Blocks() as demo:
     *Early Stopping*: Makes the search stop when a pre-determined criterion for ending the search is satisfied.
 
 
-    ### **Diversity Beam search**:
+    ### 4. Diversity Beam search:
     Divides beams into groups of beams, and applies a diversity penalty. This makes the output more diverse and interesting.
 
     *Group Diversity Penalty*: Used to instruct the next beam group to ignore the words/tokens already selected by previous groups.
 
 
-    ### **Contrastive search**:
+    ### 5. Contrastive search:
     Uses the entire input context to create more interesting outputs.
 
     *Penalty Alpha*: When α = 0, the search becomes greedy.
@@ -471,7 +471,7 @@ with gr.Blocks() as demo:
     Refer: https://huggingface.co/blog/introducing-csearch
 
 
-    ### **Other parameters**
+    ### Other parameters:
 
     - Length penalty: Used to force the model to meet the expected output length.
 
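The greedy-vs-temperature behavior the documentation describes can be sketched in a few lines of plain Python. This is an illustrative toy, not code from `app.py`: the token probabilities are the made-up scores from the example above, and in real use these choices map to `model.generate(do_sample=True, temperature=...)` in the `transformers` library.

```python
import math
import random

# Toy next-token distribution from the "Today is a rainy day" example above
# (the scores are illustrative, not real model outputs).
next_token_probs = {"!": 0.73, ".": 0.15, "today": 0.08, "outside": 0.04}

def greedy_pick(probs):
    """Greedy search: always take the single highest-probability token."""
    return max(probs, key=probs.get)

def sample_with_temperature(probs, temperature=1.0, rng=random):
    """Random sampling: rescale log-probabilities by temperature, then draw.

    Higher temperature flattens the distribution, so rarer tokens show up;
    as temperature approaches 0, mass concentrates on the top token and the
    sampler behaves like greedy search.
    """
    logits = {tok: math.log(p) / temperature for tok, p in probs.items()}
    z = sum(math.exp(v) for v in logits.values())
    rescaled = {tok: math.exp(v) / z for tok, v in logits.items()}
    r, cum = rng.random(), 0.0
    for tok, p in rescaled.items():
        cum += p
        if r < cum:
            return tok
    return tok  # guard against floating-point rounding at the tail

print(greedy_pick(next_token_probs))                               # always "!"
print(sample_with_temperature(next_token_probs, temperature=1.5))  # any token, weighted
```

With `temperature=0.01` the rescaled distribution puts essentially all mass on "!", matching the note above that low temperature makes the search greedy.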
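The beam-search intuition (keep the `num_beams` best prefixes, then pick the sequence with the highest *total* score) can likewise be shown on a hand-built two-step tree. The probabilities below are invented for illustration; in `transformers` this corresponds to `model.generate(num_beams=2)`.

```python
# Toy two-step search tree: a first-token distribution, then a next-token
# distribution conditioned on each first token. All numbers are made up.
step1 = {"!": 0.40, "outside": 0.35, "today": 0.25}
step2 = {
    "!": {"<end>": 0.6, "?": 0.4},
    "outside": {"again": 0.9, "<end>": 0.1},
    "today": {"too": 0.5, "<end>": 0.5},
}

def beam_search(step1, step2, num_beams=2):
    """Keep the num_beams highest-scoring prefixes after step 1, extend each
    with every continuation, and return the sequence with the highest total
    (product) probability."""
    beams = sorted(step1.items(), key=lambda kv: kv[1], reverse=True)[:num_beams]
    candidates = [
        ((tok, nxt), p * q)
        for tok, p in beams
        for nxt, q in step2[tok].items()
    ]
    return max(candidates, key=lambda kv: kv[1])

# num_beams=1 degenerates to greedy: "!" then "<end>" (score 0.24).
# num_beams=2 also keeps "outside", whose continuation "again" scores
# 0.35 * 0.9 = 0.315 — a heavier total route that greedy search misses.
best_seq, best_score = beam_search(step1, step2, num_beams=2)
```

This is the "heaviest set of branches" picture from the text: greedy commits to the single best first step, while beam search can back a slightly worse first token whose continuations score higher overall.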