Update README.md

---
license: apache-2.0
tags:
- generated_from_trainer
- email generation
- email
datasets:
- aeslc
- postbot/multi_emails

widget:
- text: "Hey <NAME>,\n\nThank you for signing up for my weekly newsletter. Before we get started, you'll have to confirm your email address."
  example_title: "newsletter"
- text: "Hi <NAME>,\n\nI hope this email finds you well. Let me start by saying that I am a big fan of your work."
  example_title: "fan"
- text: "Greetings <NAME>,\n\nI hope you had a splendid evening at the Company sausage eating festival. I am reaching out because"
  example_title: "festival"
- text: "Good Morning <NAME>,\n\nI was just thinking to myself about how much I love creating value"
  example_title: "value"
- text: "URGENT - I need the TPS reports"
  example_title: "URGENT"
- text: "Hi <NAME>,\n\nI hope this email finds you extremely well."
  example_title: "emails that find you"

parameters:
  min_length: 4
  max_length: 96
  length_penalty: 0.7
  no_repeat_ngram_size: 2
  do_sample: False
  num_beams: 4
  early_stopping: True
  repetition_penalty: 2.5
---

# distilgpt2-emailgen

Why write the rest of your email when you can generate it?

```python
from transformers import pipeline

model_tag = "postbot/distilgpt2-emailgen"  # the checkpoint this card describes
generator = pipeline(
    'text-generation',
    model=model_tag,
    use_fast=False,
    do_sample=False,
    early_stopping=True,
)

prompt = """
Hello,

Following up on the bubblegum shipment."""

generator(
    prompt,
    max_length=64,
)  # generate
```

A script for running this on CPU / from the command line can be found [here](https://gist.github.com/pszemraj/c1b0a76445418b6bbddd5f9633d1bb7f) :)

> For this model, formatting matters. The results may be (significantly) different between the structure outlined above and `prompt = "Hey, just wanted to ..."` etc.
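
The `parameters` block in the front matter is meant to set the hosted inference widget's generation settings. To reproduce those settings locally, the same values can be passed straight through the pipeline call. A minimal sketch mirroring the front-matter values (the `widget_params` name is just illustrative):

```python
from transformers import pipeline

generator = pipeline("text-generation", model="postbot/distilgpt2-emailgen")

# generation settings mirrored from the `parameters:` block in the YAML front matter
widget_params = dict(
    min_length=4,
    max_length=96,
    length_penalty=0.7,
    no_repeat_ngram_size=2,
    do_sample=False,
    num_beams=4,
    early_stopping=True,
    repetition_penalty=2.5,
)

prompt = "Hello,\n\nFollowing up on the bubblegum shipment."
result = generator(prompt, **widget_params)
print(result[0]["generated_text"])
```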

## Model description

This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on a dataset of 50k emails, including the classic `aeslc` dataset.
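
To get a feel for the training data, the `aeslc` portion can be loaded straight from the Hub. A quick look, assuming the public `aeslc` schema (`email_body` / `subject_line` columns):

```python
from datasets import load_dataset

# the classic Enron-derived email corpus that forms part of the 50k-email training mix
aeslc = load_dataset("aeslc", split="train")
print(aeslc[0]["email_body"][:300])
```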

It achieves the following results on the evaluation set:
- Loss: 2.6247
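
For intuition: assuming this is the usual mean token-level cross-entropy in nats, it corresponds to a perplexity of exp(2.6247) ≈ 13.8.

```python
import math

eval_loss = 2.6247  # eval loss reported above (assumed: mean cross-entropy, in nats)
print(round(math.exp(eval_loss), 1))  # -> 13.8
```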

## Intended uses & limitations