pszemraj committed on
Commit b435862
Parent: aa93588

Update README.md

Files changed (1): README.md (+6 -6)
> _this is the "latest" version of the model, the one trained the longest, currently at 70k steps_

- **GOAL:** a summarization model that 1) summarizes the source content accurately and 2) _more importantly, IMO_ produces summaries that are easy to read and understand (\*cough\* unlike arXiv \*cough\*)
- This model works toward that goal by training on the [booksum](https://arxiv.org/abs/2105.08209) dataset to provide **explanatory summarization**.
- Explanatory summary: a summary that both consolidates information and explains why the consolidated information is important.
- This model was trained for seven epochs total (approx. 70,000 steps) and is closer to finished.
- It will continue to improve (slowly, now that it has already been trained for a long time) based on any findings/feedback.
- The starting checkpoint was `google/bigbird-pegasus-large-bigpatent`.

---

# example usage

> An extended example, including a demo of batch summarization, is [here](https://colab.research.google.com/gist/pszemraj/2c8c0aecbcd4af6e9cbb51e195be10e2/bigbird-pegasus-large-booksum-20k-example.ipynb).

- create the summarizer object:
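A minimal sketch of creating the summarizer with the `transformers` `pipeline` API; the model ID below is an assumption inferred from the linked notebook's name, so check it against the actual repo ID on the Hub:

```python
from transformers import pipeline

# assumed model ID -- verify against the repo name on the Hugging Face Hub
model_name = "pszemraj/bigbird-pegasus-large-K-booksum"

summarizer = pipeline(
    "summarization",
    model=model_name,
)

# summarize a long document; BigBird's sparse attention handles long inputs
text = "..."  # your long input text here
result = summarizer(text, max_length=128, min_length=32, no_repeat_ngram_size=3)
print(result[0]["summary_text"])
```

The `no_repeat_ngram_size` setting helps suppress the repeated phrases that long-input summarizers sometimes produce.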