GPT2-Jokes / README.md
---
tags:
  - gpt2
  - jokes
metrics:
  - accuracy
pipeline_tag: text-generation
---

# gpt2-jokes

This model is a fine-tuned version of gpt2 on the Fraser/short-jokes dataset. It achieves the following results on the evaluation set:

- Loss: 0.6748
- Accuracy: 0.8796

## Model description

More information needed

## Intended uses & limitations

More information needed
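The model is intended for casual joke generation via the standard `transformers` text-generation pipeline. A minimal usage sketch, assuming the model is published under the id `ameerazam08/GPT2-Jokes` (inferred from the repository name):

```python
# Minimal sketch: sample a joke from the fine-tuned model.
# The model id "ameerazam08/GPT2-Jokes" is an assumption based on the repo name.
from transformers import pipeline

generator = pipeline("text-generation", model="ameerazam08/GPT2-Jokes")

out = generator(
    "Why did the chicken cross the road?",
    max_new_tokens=40,   # cap the continuation length
    do_sample=True,      # sample rather than greedy-decode for variety
    top_k=50,
)
print(out[0]["generated_text"])
```

As with any GPT-2 derivative trained on scraped joke data, outputs may be offensive or nonsensical and should be filtered before display.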

## Training and evaluation data

More information needed

## Training procedure