---
license: mit
datasets:
- daily_dialog
- multi_woz_v22
language:
- en
---
### Useless ChitChat Language Model
A basic dialog model based on DialoGPT-small, fine-tuned on dialog datasets (DailyDialog, MultiWOZ).
### How to use
Use it like any other PyTorch language model from the `transformers` library:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("jinymusim/dialogmodel")
model = AutoModelForCausalLM.from_pretrained("jinymusim/dialogmodel")

# Take user input
user_utterance = input('USER> ').strip()
tokenized_context = tokenizer.encode(user_utterance + tokenizer.eos_token, return_tensors='pt')

# Generate a response, limiting max_length to a reasonable size
out_response = model.generate(tokenized_context,
                              max_length=100,
                              num_beams=2,
                              no_repeat_ngram_size=2,
                              early_stopping=True,
                              pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens, skipping the user input
decoded_response = tokenizer.decode(out_response[0][tokenized_context.shape[-1]:],
                                    skip_special_tokens=True)
print(f'SYSTEM> {decoded_response}')
```
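For multi-turn chat, DialoGPT-style models condition on the previous turns concatenated into one context, with each turn terminated by the tokenizer's EOS token. A minimal sketch of that context assembly (assuming the standard DialoGPT convention; `build_context` is a hypothetical helper, and `<|endoftext|>` is GPT-2's EOS string):

```python
# Sketch of multi-turn context assembly for DialoGPT-style models.
# Assumption: turns are joined with the tokenizer's EOS token, as in
# the DialoGPT convention; "<|endoftext|>" is GPT-2's EOS string.
EOS = "<|endoftext|>"

def build_context(history, user_utterance):
    """Concatenate past turns and the new user turn, each EOS-terminated."""
    turns = history + [user_utterance]
    return "".join(turn + EOS for turn in turns)

history = ["Hello!", "Hi, how can I help?"]
context = build_context(history, "What's the weather?")
# context == "Hello!<|endoftext|>Hi, how can I help?<|endoftext|>What's the weather?<|endoftext|>"
```

The resulting string can be tokenized and passed to `model.generate` exactly as in the single-turn example above; the model's reply is then appended to `history` for the next turn.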