---
license:
- apache-2.0
tags:
- text generation
- emailgen
- email generation
- email
datasets:
- aeslc
- postbot/multi-emails-100k
widget:
- text: "Good Morning Professor Beans,\n\nHope you are doing well. I just wanted to reach out and ask if differential calculus will be on the exam"
  example_title: "email to prof"
- text: "Hey <NAME>,\n\nThank you for signing up for my weekly newsletter. Before we get started, you'll have to confirm your email address."
  example_title: "newsletter"
- text: "Hi <NAME>,\n\nI hope this email finds you well. I wanted to reach out and ask about office hours"
  example_title: "office hours"
- text: "Greetings <NAME>,\n\nI hope you had a splendid evening at the Company sausage eating festival. I am reaching out because"
  example_title: "festival"
- text: "Good Morning Harold,\n\nI was wondering when the next"
  example_title: "event"
- text: "URGENT - I need the TPS reports"
  example_title: "URGENT"
- text: "Hi Archibald,\n\nI hope this email finds you extremely well."
  example_title: "emails that find you"
- text: "Hello there.\n\nI just wanted to reach out and check in to"
  example_title: "checking in"
- text: "Hello <NAME>,\n\nI hope this email finds you well. I wanted to reach out and see if you've enjoyed your time with us"
  example_title: "work well"
- text: "Hi <NAME>,\n\nI hope this email finds you well. I wanted to reach out and see if we could catch up"
  example_title: "catch up"
- text: "I'm <NAME> and I just moved into the area and wanted to reach out and get some details on where I could get groceries and"
  example_title: "grocery"
parameters:
  min_length: 32
  max_length: 128
  no_repeat_ngram_size: 2
  do_sample: True
  temperature: 0.3
  top_k: 20
  top_p: 0.95
  repetition_penalty: 3.5
  length_penalty: 0.9
---
# gpt2-medium-emailgen
[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/gist/pszemraj/70058788c6d4b430398c12ee8ba10602/minimal-demo-for-postbot-gpt2-medium-emailgen.ipynb)
Why write the entire email when you can generate (most of) it?
```python
from transformers import pipeline

model_tag = "postbot/gpt2-medium-emailgen"
generator = pipeline(
    'text-generation',
    model=model_tag,
)

prompt = """
Hello,

Following up on the bubblegum shipment."""

# generate a completion for the prompt (greedy decoding, up to 64 tokens)
result = generator(
    prompt,
    max_length=64,
    do_sample=False,
    early_stopping=True,
)
print(result[0]['generated_text'])
```
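The hosted inference widget uses the sampling parameters listed in the card metadata above (low temperature, `top_k`/`top_p` filtering, and a strong repetition penalty). For comparable output locally, these can be passed straight through the pipeline call; a sketch reusing the `generator` and `prompt` from the snippet above:

```python
# sampled generation with the widget's recommended parameters
# (values taken from the `parameters:` block in the card metadata)
result = generator(
    prompt,
    min_length=32,
    max_length=128,
    no_repeat_ngram_size=2,
    do_sample=True,
    temperature=0.3,
    top_k=20,
    top_p=0.95,
    repetition_penalty=3.5,
    length_penalty=0.9,
)
print(result[0]['generated_text'])
```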
## About
This model is a fine-tuned version of [gpt2-medium](https://huggingface.co/gpt2-medium) on the [postbot/multi-emails-100k](https://huggingface.co/datasets/postbot/multi-emails-100k) dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5840
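For reference, a cross-entropy loss of 1.5840 corresponds to a perplexity of roughly exp(1.5840) ≈ 4.87.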
## Model description
More information needed
## Intended uses & limitations
- This model is intended as a tool to save time when writing predictable emails, not to write emails without a human in the loop. Validate that your email is factually correct before sending it to others.
## Training and evaluation data
- The dataset is essentially a hand-curated/augmented expansion of the classic `aeslc` dataset; it can be inspected directly, as sketched below.
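A minimal loading sketch using the `datasets` library, with the Hub id taken from the card metadata (the `train` split name is an assumption based on the usual convention):

```python
from datasets import load_dataset

# Hub id from the card metadata; the "train" split name is assumed
ds = load_dataset("postbot/multi-emails-100k", split="train")
print(ds[0])
```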
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 3
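The training script itself is not included in this card, but the hyperparameters above map onto the standard `transformers` `TrainingArguments`. A minimal sketch of that mapping (the output path is illustrative, not the exact setup used):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="gpt2-medium-emailgen",  # hypothetical output path
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=8,  # 16 per device x 8 steps -> effective batch of 128
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=3,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```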
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.8701 | 1.0 | 789 | 1.8378 |
| 1.5065 | 2.0 | 1578 | 1.6176 |
| 1.1873 | 3.0 | 2367 | 1.5840 |
### Framework versions
- Transformers 4.22.2
- Pytorch 1.10.0+cu113
- Datasets 2.5.1
- Tokenizers 0.12.1